Job Description:
The ideal candidate must have 3+ years of experience as a Hadoop Administrator and 2+ years as a Linux
Administrator, with proven hands-on experience installing, configuring, supporting, and managing
clusters on Hortonworks and Cloudera.
Role & Responsibilities:
Install Hadoop clusters using different Cloudera and Hortonworks distributions.
Perform cluster maintenance and monitoring, commission and decommission DataNodes, troubleshoot
issues, and manage and review data backups and log files.
Design and implement deployment, configuration-management, backup, and disaster-recovery systems
and procedures.
Implement statistical data-quality procedures on new data sources and, through rigorous iterative
data analytics, support Data Scientists in data sourcing and preparation to visualize data and
synthesize insights of commercial value.
Define data requirements; gather and mine large volumes of structured and unstructured data; and
validate data by running various data tools in the Big Data environment.
Support standardization, customization, and ad-hoc data analysis, and develop mechanisms to ingest,
analyze, validate, normalize, and clean data.
Handle day-to-day operations, including resolving developer issues, deploying code from one
environment to another, granting access to new users, delivering prompt fixes that reduce impact,
and documenting issues to prevent recurrence.
Create roles and users, grant privileges, and handle user management.
Manage the Hadoop infrastructure with Cloudera Manager and Ambari.
Commission and decommission nodes in the Hadoop cluster (see the sketch below).
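For illustration only, not a prescribed procedure: on a plain HDFS cluster, decommissioning typically
means listing the host in the exclude file named by dfs.hosts.exclude in hdfs-site.xml and telling the
daemons to re-read their host lists. A minimal sketch, with a hypothetical hostname and exclude-file path:

    # hdfs-site.xml must already point dfs.hosts.exclude at this file
    echo "datanode05.example.com" >> /etc/hadoop/conf/dfs.exclude
    hdfs dfsadmin -refreshNodes   # NameNode re-replicates the node's blocks, then marks it Decommissioned
    yarn rmadmin -refreshNodes    # retire the NodeManager on the same host

In Cloudera Manager and Ambari the same flow is a guided host action, so the exclude files are managed
for you.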
Set up NameNode high availability for major production clusters and design automatic failover
control (see the configuration sketch below).
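As a hedged sketch of what quorum-journal NameNode HA involves (the nameservice and hostnames below are
placeholders, not the employer's): the two NameNodes share an edit log through a JournalNode quorum, and
ZooKeeper Failover Controllers (ZKFC) drive the automatic failover:

    <!-- hdfs-site.xml (sketch) -->
    <property><name>dfs.nameservices</name><value>mycluster</value></property>
    <property><name>dfs.ha.namenodes.mycluster</name><value>nn1,nn2</value></property>
    <property><name>dfs.namenode.rpc-address.mycluster.nn1</name><value>master1.example.com:8020</value></property>
    <property><name>dfs.namenode.rpc-address.mycluster.nn2</name><value>master2.example.com:8020</value></property>
    <!-- shared edit log on a JournalNode quorum -->
    <property><name>dfs.namenode.shared.edits.dir</name>
      <value>qjournal://jn1.example.com:8485;jn2.example.com:8485;jn3.example.com:8485/mycluster</value></property>
    <!-- ZKFC-driven automatic failover; ha.zookeeper.quorum is set in core-site.xml -->
    <property><name>dfs.ha.automatic-failover.enabled</name><value>true</value></property>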
Import and export data between relational databases such as MySQL and HDFS using Sqoop (example
below).
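As a hedged example (connection details, table, and paths below are hypothetical), a typical Sqoop
import from MySQL into HDFS looks like this; sqoop export reverses the direction:

    sqoop import \
      --connect jdbc:mysql://db01.example.com:3306/sales \
      --username etl_user \
      --password-file /user/etl/.mysql.pw \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4
    # --password-file reads the password from a file in HDFS instead of the command line;
    # --num-mappers controls parallelism, splitting on the table's primary key by default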
Perform cluster upgrades and patch upgrades without data loss, with proper backup plans in place.
Provide security and authorization with Apache Ranger, where Ranger Admin handles policy
administration and Ranger Usersync adds new users to the cluster.
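As a rough, hedged illustration of the Usersync side (all values below are placeholders), Ranger
Usersync is configured through its install.properties, which names the Ranger Admin endpoint and the
source of users to pull in:

    # ranger-usersync install.properties (sketch)
    POLICY_MGR_URL    = http://ranger-admin.example.com:6080   # Ranger Admin that receives synced users
    SYNC_SOURCE       = ldap                                   # or 'unix' to read local /etc/passwd accounts
    SYNC_LDAP_URL     = ldap://ldap01.example.com:389
    SYNC_LDAP_BIND_DN = cn=admin,dc=example,dc=com
    SYNC_INTERVAL     = 5                                      # minutes between sync runs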
Set up Flume agents for different sources to bring log messages from external systems into HDFS (see
the agent sketch below).
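A minimal, hedged sketch of such an agent (directory, cluster name, and agent name are hypothetical): a
spooling-directory source feeding an HDFS sink through a memory channel, started with
flume-ng agent -n a1 -f logs-to-hdfs.conf:

    a1.sources  = r1
    a1.channels = c1
    a1.sinks    = k1

    a1.sources.r1.type     = spooldir
    a1.sources.r1.spoolDir = /var/log/app            # completed log files dropped here are ingested
    a1.sources.r1.channels = c1

    a1.channels.c1.type     = memory
    a1.channels.c1.capacity = 10000

    a1.sinks.k1.type                   = hdfs
    a1.sinks.k1.channel                = c1
    a1.sinks.k1.hdfs.path              = hdfs://mycluster/data/logs/%Y-%m-%d
    a1.sinks.k1.hdfs.fileType          = DataStream  # write plain text instead of SequenceFiles
    a1.sinks.k1.hdfs.useLocalTimeStamp = true        # needed for the %Y-%m-%d escapes without an interceptor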

Manage and review log files as part of administration and troubleshooting. Communicate and escalate
issues appropriately.
Analyze system failures, identify root causes, and recommend corrective actions.
Work with Big Data Policy and Security teams and Legal to create data policy and to develop
interfaces and retention models that require synthesizing or anonymizing data.
Develop and maintain data-engineering best practices, and contribute insights on data analytics and
visualization concepts, methods, and techniques.
Technical Expertise required for this Position:
Data warehouse skills such as SQL, data modeling, and data profiling.
Analytics tools such as SAS, R, Python, and Scala, along with data science and machine learning libraries.
Importantly, the position requires:
Self-motivated, with a willingness to learn, a strong sense of accountability, and a proven track record
of delivering results.
A great communicator with strong relationship-building and interpersonal skills.
Ability to multi-task, work under pressure, meet deadlines and thrive in a fast-paced setting.
Strong interpersonal skills including mentoring, coaching, collaborating, and team building. Strong analytical,
planning, and organizational skills with an ability to efficiently manage competing demands.
             
