Job Description:
Responsibilities:



- Serve as Senior Developer/Lead for an analytics solution using Hadoop (preferably CDH), Spark, and Kafka

- Perform performance tuning of CDH or similar platforms

- Able to work independently in a fast-paced environment

- Experience with Agile implementation methodology

- Able to translate high-level designs into detailed logical and physical designs

- Build and incorporate automated unit tests and participate in integration testing efforts

- Work across teams to resolve operational and performance issues

- Work with architecture/engineering leads and other teams to ensure that quality solutions are implemented and that engineering best practices are defined and adhered to


Minimum Requirements:

- 10+ years of overall IT experience in software development, with 5+ years of experience in the Big Data analytics area. Must have delivered at least 4 big data projects to production.

- Experience with Hadoop distributions (Apache, Cloudera, Hortonworks)

- Experience working with the Big Data ecosystem, including tools such as Hadoop, MapReduce, YARN, Hive, Impala, Spark, Kafka, Sqoop, and Storm

- 1+ years of experience with performance-optimization techniques for both data loading and data retrieval

- 2 years of experience writing Spark SQL and/or SQL queries