27 Feb
Hadoop Administrator
Atlanta, Georgia 30309, USA

Vacancy expired!

Responsibilities include providing operational support on Hadoop for data science, business intelligence and ETL/streaming workloads running on Cloudera Data Platform (CDP), Hortonworks Data Platform (HDP) and Hortonworks Data Flow (HDF). Operational support covers platform installation and upgrades, troubleshooting, documentation, performance tuning, root-cause analysis and issue resolution, as well as management of security and change control for HDP and HDF. The position works closely with Hadoop Team Leads, other Hadoop Administrators, Data Engineers, Application Developers and Data Scientists, and includes participation in an on-call rotation for 24/7 support.

Responsibilities
  • Work directly with Norfolk Southern technical and business resources to devise and recommend solutions that meet the gathered requirements
  • Analyze complex distributed production deployments and make recommendations to optimize performance
  • Install and configure new Cloudera Data Platform (CDP) clusters
  • Maintain and patch existing CDP/HDP/HDF clusters
  • Analyze and apply SmartSense recommendations
  • Work with development and business stakeholders to set up new CDP/HDP/HDF users, including creating Kerberos principals and testing HDFS, Hive, HBase and YARN access for the new users
  • Perform CDP/HDP security configurations with Ranger, Kerberos, Knox and Atlas
  • Work closely with vendor support to address support tickets
  • Monitor and optimize cluster utilization, performance and operations
  • Write technical documentation, administration runbooks and knowledge base articles
  • Keep current with Hadoop Big Data ecosystem technologies
  • Participate in on-call rotation
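
The user-onboarding step above (Kerberos principals plus HDFS, Hive, HBase and YARN access checks) might look like the runbook sketch below. The user name, realm, keytab path and host names are placeholders, and the exact commands depend on how the cluster is secured; this is an illustrative fragment, not a definitive procedure:

```shell
#!/usr/bin/env bash
# Hypothetical onboarding check for a new user -- adjust the realm,
# keytab path and service hostnames to the actual cluster.
set -euo pipefail

USER_NAME="jdoe"                                   # placeholder user
REALM="EXAMPLE.COM"                                # placeholder Kerberos realm
KEYTAB="/etc/security/keytabs/${USER_NAME}.keytab"

# 1. Create the Kerberos principal and export a keytab (run with kadmin rights).
kadmin -q "addprinc -randkey ${USER_NAME}@${REALM}"
kadmin -q "ktadd -k ${KEYTAB} ${USER_NAME}@${REALM}"

# 2. Authenticate as the new user.
kinit -kt "${KEYTAB}" "${USER_NAME}@${REALM}"

# 3. Smoke-test each service the user was granted access to.
hdfs dfs -ls "/user/${USER_NAME}"          # HDFS home directory reachable?
beeline -u "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@${REALM}" \
        -e "SHOW DATABASES;"               # Hive via Kerberized HiveServer2
echo "list" | hbase shell -n               # HBase tables visible?
yarn application -list -appStates RUNNING  # YARN access works?
```

In a Ranger-secured cluster, the access checks would be paired with Ranger policy grants for the same user before testing.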

Qualifications
  • Hortonworks HDP Administration / Cloudera Certified Administrator Certifications
  • Demonstrated experience with implementing big data use cases and understanding of standard design patterns commonly used in Hadoop-based deployments
  • At least two (2) years of HDP/CDP installation and administration experience in multi-tenant production environments, with experience on HDP 2.6.x – 3.1.x, HDF 3.x and CDP 7.1.x versions
  • Experience designing and deploying production large-scale Hadoop architectures
  • Strong experience implementing software and/or solutions in enterprise Linux or Unix environments, including strong shell scripting skills
  • Strong experience with various enterprise security solutions such as LDAP and Active Directory
  • Strong experience with Ambari, Ranger, Kerberos, Knox and Atlas
  • Strong experience with Hive and/or Impala
  • Sound experience with RDBMSes such as Oracle or MS SQL Server
  • Sound experience with source code control methodologies and systems such as GitHub
  • Strong understanding of network configuration, devices, protocols, speeds and optimizations
  • Strong understanding of Java development, debugging & profiling
  • Good troubleshooting skills, understanding of CDP/HDP capacity, bottlenecks, memory utilization, CPU usage, OS, storage, and networks
  • Understanding of on-premises and cloud network architectures
  • Excellent verbal and written communication skills
