Job Description:
o Responsible for the implementation and support of the Enterprise Hadoop environment.
o Involves designing, capacity planning, cluster setup, monitoring, structure planning, scaling and administration of Hadoop components (YARN, MapReduce, HDFS, HBase, ZooKeeper, Storm, Kafka, Spark, Pig and Hive).
o Work closely with infrastructure, network, database, business intelligence and application teams to ensure business applications are highly available and performing within agreed-upon service levels.
o Accountable for performance tuning and resource management of Hadoop clusters and MapReduce routines.
o Strong experience with Linux-based systems and scripting (Shell, Perl or Python).
o Experience with configuration management tools like Puppet, Chef or Salt.
o Strong experience configuring security in Hadoop using Kerberos or PAM.
o Good knowledge of directory services like LDAP and ADS, and of monitoring tools like Nagios or Icinga2.
o Strong troubleshooting skills for Hive, Pig, HBase and Java MapReduce code/jobs.
o Evaluate technical aspects of any change requests pertaining to the cluster.
o Research, identify and recommend technical and operational improvements that result in improved reliability and efficiency of the cluster.
