Job Description:
Job Responsibilities:
Hadoop development and implementation (Environment - HDFS, HBase, Spark, Kafka, Oozie, Sqoop, Flume, Kerberos, Oracle ASO, MySQL, geospatial data)
Loading from disparate data sets using Hadoop stack of ingestion and workflow tools
Pre-processing using Hive and Pig.
Designing, building, installing, configuring and supporting Hadoop.
Translate complex functional and technical requirements into detailed design.
Perform analysis of vast data stores and uncover insights.
Maintain security and data privacy.
Managing and deploying HBase.
Being a part of a POC effort to help build new Hadoop clusters.
Test prototypes and oversee handover to operational teams.
Propose best practices/standards.
Configuring and implementing Data Marts on the Hadoop platform.
Required Qualifications:
Expert-level Hadoop Developer with 5+ years of experience in Big Data / Apache Hadoop (HDFS) - HBase/Hive/Pig/Mahout/Flume/Sqoop/MapReduce/YARN
Hands on experience with a NoSQL database like HBase
Hands on experience in job scheduling tools like Oozie
Hands on experience with Pig and Hive queries as well as performance tuning
Hands on experience in UNIX and Shell Scripting
Experience working with Spark for data manipulation, preparation, cleansing
             
