Job Description :
Position: Sr Big Data Developer
Location: Sterling, VA or Washington DC.
Phone then F2F
Duration: 3-6 months+

Scope of Work:
Provide technical expertise in designing and developing big data solutions based on complex file formats consisting of structured, unstructured, and semi-structured data, including updates to programs and scripts for, but not limited to, maintenance and enhancement of specified datasets. Troubleshoot and review failed jobs and performance issues, and monitor and ensure data integrity.

Provide routine financial industry data maintenance, including but not limited to:
Provide technical recommendations to improve efficiency of the Data Management /ETL process.
Provide support to initiate and deploy changes required by the ETL process, including access requests, change requests, Technical Review Board (TRB) requests, Standard Operating Procedure (SOP) documentation, etc.
Prepare/draft correspondence for the government to dataset vendors/OIT regarding data pull/feed issues using the approved communication process.
Create/update/maintain data models, data dictionaries, technical diagrams, and other documentation to depict the environment and processes.
Support the handover of new datasets and enhancements into production. The handover process includes conducting testing of the new/enhanced ETL process and verifying the technical documentation, including the design document, deployment document, and data dictionary.

Experience/ Skills Required:
Minimum of ten (10) years of IT experience.
At least 3 years working with Hadoop tools and technology, including Web Services (RESTful/SOAP) and Web Scraping (HTTP, CSS, and HTML). Proficient in designing and developing big data solutions. Experience with widely used Hadoop tools such as Apache Spark, Hive, Sqoop, Flume, and Oozie.
Extensive experience working with complex file formats, including structured, unstructured, and semi-structured data such as, but not limited to, JSON, XML, CSV, etc.
Experience working with cloud-based technology such as Amazon EMR and Redshift.
Proficient using Python and Java/JavaScript, including libraries such as Beautiful Soup, urllib, and Selenium. Experience working with databases and appliances such as Netezza, Oracle, SQL Server, PostgreSQL, and Sybase. Experience with data modeling.

Education Required:
BA/BS degree in Computer Science or a related field.
Certified Developer from a major Hadoop distributor such as Cloudera, Hortonworks, or MapR is a major plus.
Hadoop developer certifications from other distributors may also be considered.