Job Description:
Role: ETL Hadoop Developer
Location: Portland, OR
Duration: Long Term

2+ years' experience in Big Data technologies and utilities (Hadoop, Spark SQL, Hive, Impala, Pig, Kafka) is required.
Experience with MPP databases such as Greenplum and Netezza is preferred. Ability to troubleshoot issues and develop functions in an MPP environment is highly desired.
Very good knowledge of the software development life cycle, agile methodologies, and test-driven development
Experience utilizing and extending ETL solutions (e.g., Informatica, Talend, Pentaho, Ab Initio) in a complex, high-volume data environment is highly desired
3+ years of SQL and ETL development experience is required
3+ years of programming experience in Java, Python, and/or other functional programming languages
Hands-on experience with scheduling and data integration tools such as Control-M and NiFi is highly desired.
Sound understanding of continuous integration & continuous deployment environments
Solid understanding of application program interfaces (APIs), messaging software and interoperability techniques and standards
Strong analytical skills with a passion for testing
Excellent problem-solving and debugging skills
Collaborative team player who maintains exceptional code, architecture, and documentation
Strong exposure to Data Management, Governance, and Controls functions
Bachelor's or Advanced Degree in Information Management, Computer Science, Mathematics, Statistics, or a related field is desired