Job Description

Responsible for implementation and support of the Cloudera Hadoop environment.
Involves design, capacity planning, cluster setup, performance tuning, monitoring, infrastructure planning, scaling, and administration.
The administrator consultant will work closely with infrastructure, network, database, business intelligence, and application teams to ensure business applications are highly available and performing within agreed-upon service levels.
Strong experience implementing Hadoop ecosystem components such as YARN, MapReduce, HDFS, ZooKeeper, Kudu, Impala, Oozie, and Hive.
Experience with real-time execution engines such as Spark, Storm, and Kafka.
Experience setting up AD/LDAP/Kerberos authentication models.
Experience with Apache Sentry and database management.
Experience in shell scripting and exposure to Python.
Operational expertise in troubleshooting; understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
Experience with Docker and containerized environments.
Experience working with encryption of data at rest using Java KeyStore or HSMs.
In charge of installing, administering, and supporting Windows and Linux operating systems in an enterprise environment.
Accountable for storage, performance tuning, and volume management of Hadoop clusters and MapReduce routines.
Experience with monitoring and automation tools such as Datadog and Ansible.
Must have patching and upgrade experience.
Set up, monitor, and maintain disaster recovery (DR) backups for the Hadoop clusters.
Manage and analyze Hadoop log files.
File system management and monitoring.
Develop and document best practices.
Provide HDFS support and maintenance, and help application developers resolve issues with the Hadoop cluster.
Setting up new Hadoop users.
Responsible for administration of new and existing Hadoop infrastructure.
Includes DBA responsibilities such as data modeling, design and implementation, software installation and configuration, database backup and recovery, database connectivity, and security.

· Developer - Big Data (Cloudera Hadoop: Impala/HDFS/Mahout/Flume/Sqoop/YARN) – Expert (5+ years)