Job Description :
Title: Application Developer with ETL, Hadoop and Big Data
Location: Chevy Chase, MD
Type/Duration: 6 month temp-to-hire

Need Candidates on W2

JD:
The selected candidate will be part of a group of ETL and report developers responsible for creating data marts used for Enterprise reporting and data visualizations. The selected candidate will maintain and improve existing ETL streams as well as create and optimize new ETL streams on our Big Data platform.

Experience with continuous build and test processes using tools such as Maven and Jenkins
Experience with monitoring tools such as Splunk and DynaTrace
Exposure to Hadoop or NoSQL performance optimization and benchmarking tools
Understanding of data security and role-based access controls
Experience in ETL tools such as Talend, Informatica or Ab Initio
Certification in Hortonworks or Cloudera
Familiarity with Agile development
Ability to work independently with limited supervision as well as contribute to team efforts
Strong critical thinking, decision-making, troubleshooting, and problem-solving skills
Outstanding time management skills and attention to detail
Excellent verbal/written communication skills, including communicating technical issues to non-technical audiences

3+ years of hands-on experience in the Hadoop ecosystem (HDFS, YARN, MapReduce, Oozie and Hive)
1+ year of hands-on experience in Spark Core and Spark SQL
1 year of hands-on experience in HBase, Cassandra or any other NoSQL database
Understanding of distributed computing design patterns, algorithms, data structures and security protocols
Understanding of Kafka and Spark Streaming
Strong skills in SQL, analytical SQL functions, Graphs, and Styl
Candidate must possess a Bachelor's degree in a computer-related field

