Job Description:
Role: Hadoop Developer
Location: DC Area
Long-Term Contract
Only candidates with financial, mortgage, or banking experience will be considered.

A face-to-face (F2F) interview is required, so only local candidates will be considered.

Responsibilities

Manage data extraction jobs and build new data pipelines from various structured and unstructured sources into Hadoop.
Execute or support test certification scripts, and analyze and validate results against established success criteria.
Support patching as well as incident and problem management activities.
Enable environment sustainability through technology training and the development of user guides and knowledge artifacts.
Provide work guidance and technical assistance to less senior engineers.
Qualifications

Bachelor’s degree in Computer Science or a related field.
3+ years of strong hands-on experience developing code in Hadoop using Spark (Scala), Hive, Pig, Sqoop, etc.
5+ years of strong hands-on experience coding in Java, Python, Unix shell scripting, relational databases (RDBMS), and PL/SQL.
Strong foundational knowledge of Unix, Hadoop architecture, and the AWS landscape.
Experience with Test-Driven Development and with building and delivering unit/integration test frameworks on AWS.
Nice-to-have skills: R.
Must be a self-starter, detail-oriented, and possess strong problem-solving skills.
Excellent communication, time management, and technical presentation skills.
