Job Description:

Job Details:

Must Have Skills (Top 3 technical skills only):
1. Hadoop MapReduce, Pig, Hive, Sqoop, HBase, Spark, Scala, Impala.
2. Nagios, Ganglia, Chef, Puppet, and others.
3. Large-scale data Hadoop environments.

Nice to Have Skills (Top 2 only):
1. Hadoop cluster security implementation, such as Kerberos, Knox, and Sentry.
2. Cluster design, configuration, installation, patching, upgrading, and High Availability support.

Detailed Job Description:
Bachelor’s degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education.
At least 4 years of experience in DWBI, of which a minimum of 2 years in Hadoop administration on Cloudera and 1 year with Hadoop MapReduce, Pig, Hive, Sqoop, HBase, Spark, Scala, and Impala. At least 1 year with monitoring tools such as Nagios, Ganglia, Chef, and Puppet.
Should have experience with Cloudera Manager, Ambari, or Pivotal Command Center.

Desired years of experience*: Above 5 years

Education/Certifications (Required): Bachelor’s degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education.

Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
1. Monitoring Hadoop cluster connectivity and security
2. HDFS support and maintenance
3. Performance tuning