Job Description:
Hi Folks,

I'm Santosh from Humacinc. I have a requirement for a Hadoop Admin for Seattle, WA.
Please go through the JD mentioned below.
Share your resume with: Santosh @
Reach me at: 6232422594

Position: Hadoop Admin with Cloud Experience
Location: Seattle, WA

· Excellent understanding and knowledge of Hadoop architecture (1.x and 2.x) and its components, such as HDFS, YARN, ResourceManager, NodeManager, NameNode, DataNode, and the MapReduce paradigm.

· Working knowledge of and experience with a wide range of big data components, such as HDFS, MapReduce, Sqoop, Flume, Pig, Hive, HBase, Spark, Impala, Kafka (Confluent), ZooKeeper, Oozie, Hue, Solr, etc.

· Solution provider for cluster capacity planning and creating roadmaps for Hadoop cluster deployment.

· Experience in installing, configuring, supporting, and managing Hadoop clusters using Apache Hadoop, Cloudera Distribution of Hadoop (CDH 5.x), Hortonworks Data Platform (HDP 2.x), and MapR distributions, both on-premises and in the cloud (AWS).

· Good working knowledge of Hadoop security components such as Kerberos, LDAP, Sentry, ACLs, Key Trustee Server, Key Management Server, key-value stores, etc.

· Experienced in Cloudera installation, configuration, and deployment on Linux distributions.

· Commissioning and decommissioning of nodes as required.

· Managing and monitoring Hadoop services such as NameNode, DataNode, and YARN.

· Experienced in loading data into the cluster from dynamically generated files using Flume, from RDBMSs using Sqoop, and from the local file system.

· Performance tuning, cluster monitoring, and resolving Hadoop issues via the CLI or web UI.

· Troubleshooting Hadoop cluster runtime errors and ensuring that they do not recur.

· Accountable for storage and volume management of Hadoop clusters.

· Ensuring that the Hadoop cluster is up and running at all times (high availability, cluster resource utilization, etc.).

· Evaluating Hadoop infrastructure requirements and designing/deploying solutions.

· Backup and recovery tasks, including creating snapshot policies, setting backup schedules, and recovering from node failures.

· Working experience in installing the various components and daemons of the Hadoop ecosystem.

· Responsible for configuring alerts for the different types of services running in the Hadoop ecosystem.

· Communicating and escalating issues appropriately; managing and reviewing Hadoop log files and logging cases with Cloudera.

· Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.

Client: Infosys