Job Description:
Hadoop Lead Developer - Overland Park, KS

Experience leading a development team and direct client-facing experience.
6+ years' experience working with the Big Data ecosystem, including tools such as Hadoop, MapReduce, YARN, Hive/Impala/Presto, Pig, Spark, Kafka, Sqoop, Hue, and Storm, is preferred.
Knowledgeable in techniques for designing Hadoop-based file layouts optimized to meet business needs
Understands the tradeoffs between different approaches to Hadoop file design
Experience with performance optimization techniques for both data loading and data retrieval
Experience with NoSQL databases: HBase, Apache Phoenix, Apache Cassandra, Vertica, or MongoDB
Able to translate business requirements into logical and physical file structure designs
Ability to build and test solutions in a rapid and iterative manner
Ability to articulate the reasons behind design choices
Designing, building, installing, configuring, and supporting Hadoop.
Translate complex functional and technical requirements into detailed design.
Perform analysis of vast data stores and uncover insights.
Maintain security and data privacy.
Good knowledge of data warehousing (DW) and business intelligence (BI)
Help build new Hadoop clusters or manage existing ones
Ability to write MapReduce jobs
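As a conceptual illustration of the MapReduce model named in the requirement above, the sketch below implements word count in plain Python. This is not an actual Hadoop job (a real one would use the Hadoop Mapper/Reducer APIs and run on a cluster); the mapper/shuffle/reducer names and sample input are illustrative assumptions only.

```python
from collections import defaultdict

# Conceptual word count in the MapReduce style: a mapper emits
# (key, value) pairs, a shuffle step groups values by key, and a
# reducer aggregates each group. Illustration only, not Hadoop code.

def mapper(line):
    # Emit (word, 1) for every word in the input line.
    for word in line.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Group all emitted values by key, as the framework would.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Aggregate the grouped values for one key.
    return key, sum(values)

def word_count(lines):
    pairs = (pair for line in lines for pair in mapper(line))
    return dict(reducer(k, v) for k, v in shuffle(pairs).items())

print(word_count(["big data big jobs", "Big clusters"]))
# {'big': 3, 'data': 1, 'jobs': 1, 'clusters': 1}
```

In a real Hadoop job the shuffle is performed by the framework between the map and reduce phases; only the mapper and reducer logic is written by the developer.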

Strong technical expertise in most of the following:

Hadoop (Hortonworks distribution)
Talend for Big Data
Apache Hive
Apache Phoenix
Apache Spark
Apache HBase
Apache Sqoop
Apache Hue
Kafka
SQL and NoSQL data stores
Linux
Strong communication skills, both written and oral
Excellent teamwork and interpersonal skills
Potential and ability to lead small engagements or work streams within large engagements
Aptitude for troubleshooting and problem-solving
Strong technical skills, including an understanding of software development principles
Hands-on programming experience

Hadoop development and implementation.
Loading from disparate data sets.
Pre-processing using Hive and Pig.

Client: KAnand