Job Title: Lead Hadoop ETL Developer
Location: Seattle, WA
Duration: Long term


Job Description:

A great opportunity to help build a team of big data/ETL developers. You will initially spend about 50% of your time on hands-on development and 50% on lead activities. As the team expands, your role will become 100% lead.

Job Duties and Responsibilities:

- Collaborate with developers and business users to gather required data and to execute ETL programs and scripts on target systems
- Perform root-cause analysis on ETL processes, resolve production issues, validate data, run routine database tests, and support all ETL applications
- Develop and test ETL code, analyze system data, and design data mappings for all data models

Basic Qualifications:

- At least five (5) years of Information Technology experience, including at least one (1) year of hands-on experience programming on a high-scale or distributed system
- Hands-on experience with the Hadoop stack
- Bachelor’s degree in Computer Science, Management Information Systems, or a related field

Other Position Requirements:

- Demonstrated experience with related/complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef)
- Demonstrated experience with Hadoop ETL development tools (e.g. Sqoop, MapReduce, Java, Hive, Spark, and other Hadoop components)
- Demonstrated experience with analytical tools, languages, or libraries (e.g. SAS, SPSS, R, Mahout)
- Experience with "productionalizing" Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning)
- Experience with high-scale or distributed RDBMS platforms (e.g. Teradata, Netezza, Greenplum, Aster Data, Vertica)
- Demonstrated presentation and communication skills