Job Description:
LOCATION: 1 for Charlotte, NC; 1 for Phoenix, AZ
"Description
Looking for a Hadoop Administrator with 2 to 6 years of experience to join our growing Hadoop Production Support team. The hire will be responsible for supporting the MapR Hadoop platform and applications. The Hadoop Admin will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure the Hadoop platform runs 24/7.

Responsibilities:
Install and configure MapR/Hortonworks clusters
Apply proper architecture guidelines to ensure highly available services
Plan and execute major platform software and operating system upgrades and maintenance across physical environments
Develop and automate processes for maintenance of the environment (a scripting sketch follows this list)
Implement security measures for all aspects of the cluster (SSL, disk encryption, role-based access via Apache Ranger policies)
Ensure proper resource utilization between the different development teams and processes
Design and implement a toolset that simplifies provisioning and support of a large cluster environment
Review performance stats and query execution/explain plans; recommend changes for tuning Apache Hive queries
Create and maintain detailed, up-to-date technical documentation
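
The maintenance-automation work listed above is typically scripted. As one illustration, here is a minimal Python sketch of a scheduled HDFS capacity check; it assumes an HDFS-based distribution (e.g. Hortonworks) with the hdfs CLI on PATH, and the 90% threshold is an example value, not a requirement from this posting. On MapR, the equivalent figures would come from maprcli instead.

#!/usr/bin/env python3
"""Minimal sketch: scheduled HDFS capacity check for routine cluster maintenance.

Assumptions (illustrative, not from this posting): the `hdfs` CLI is on PATH
and a 90% used threshold is the point at which we want an alert.
"""
import re
import subprocess
import sys

USAGE_THRESHOLD_PCT = 90.0  # hypothetical alert threshold


def dfs_used_pct() -> float:
    """Parse the cluster-wide 'DFS Used%' figure from `hdfs dfsadmin -report`."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if match is None:
        raise RuntimeError("Could not find 'DFS Used%' in the dfsadmin report")
    return float(match.group(1))


if __name__ == "__main__":
    used = dfs_used_pct()
    print(f"DFS used: {used:.1f}%")
    # Exit non-zero so cron/monitoring treats a nearly full cluster as a failure.
    sys.exit(1 if used >= USAGE_THRESHOLD_PCT else 0)

A script like this would typically run from cron or a monitoring agent, with the alerting handled by whatever tooling the team already uses.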

Requirements:
3 years of experience working with Apache Hadoop as an admin
In-depth knowledge of Apache Hadoop and MapReduce
Experience with Apache HBase and Hive
Experience with Linux
Ability to shell script with Linux
Ability to troubleshoot problems and quickly resolve issues
Cluster maintenance as well as creation and removal of nodes
Performance tuning of Hadoop clusters and Hadoop MapReduce routines
Screen Hadoop cluster job performances and capacity planning
Monitor Hadoop cluster connectivity and security
Manage and review Hadoop log files (a log-scanning sketch follows this list)
File system management and monitoring
HDFS support and maintenance
Experience with big data tools: Hadoop, Hive, HBase, Pig, Spark, Kafka, NiFi, etc.
Teaming diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability
Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
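
As an illustration of the log review and monitoring duties above, below is a minimal Python sketch that surfaces ERROR/FATAL lines from Hadoop daemon logs. The /var/log/hadoop path and the default log4j line layout are assumptions for the example; a real deployment would point this at its own log directories or a central log aggregator.

#!/usr/bin/env python3
"""Minimal sketch: surface ERROR/FATAL lines from Hadoop daemon logs.

Assumptions (illustrative): logs live under /var/log/hadoop and use the
default log4j layout "YYYY-MM-DD HH:MM:SS,mmm LEVEL logger: message".
"""
from pathlib import Path

LOG_DIR = Path("/var/log/hadoop")  # hypothetical log location
LEVELS = {"ERROR", "FATAL"}


def scan_logs(log_dir: Path) -> list:
    """Return formatted log lines whose severity field is in LEVELS."""
    hits = []
    for log_file in sorted(log_dir.glob("*.log")):
        with log_file.open(errors="replace") as fh:
            for line in fh:
                # In the default layout the level is the third whitespace-separated field.
                parts = line.split(maxsplit=3)
                if len(parts) >= 3 and parts[2] in LEVELS:
                    hits.append(f"{log_file.name}: {line.rstrip()}")
    return hits


if __name__ == "__main__":
    for hit in scan_logs(LOG_DIR):
        print(hit)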