Job Description:
Hadoop Consultant
Work Location Address*: Hillsboro, OR 97124

Contract duration (in months): 12

Job Details:

Must Have Skills (Top 3 technical skills only) *
1. Hadoop
2. Spark
3. MapReduce and Hive
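Skill 3 names MapReduce. As an illustration of the programming model the role assumes, here is a minimal word-count sketch in plain Python (no Hadoop cluster required; the phase function names are my own, chosen to mirror the map/shuffle/reduce stages a framework like Hadoop runs):

```python
from itertools import groupby
from operator import itemgetter

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def shuffle_phase(pairs):
    """Shuffle: group pairs by key, as the framework does between map and reduce."""
    return groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0))

def reduce_phase(grouped):
    """Reduce: sum the counts for each word."""
    for word, group in grouped:
        yield (word, sum(count for _, count in group))

def word_count(documents):
    return dict(reduce_phase(shuffle_phase(map_phase(documents))))
```

On a real cluster the shuffle is distributed across nodes, but the contract between the phases is the same as shown here.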

Nice to have skills (Top 2 only)

1. Experience with AWS components and services, particularly EMR, S3, and Lambda
2. Experience with NoSQL technologies such as HBase, DynamoDB, and Cassandra

Detailed Job Description:

Design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the Hadoop ecosystem. Candidates must be able to design and implement end-to-end solutions.
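The pipelines described above follow an ingest → transform → aggregate shape. A minimal sketch of that shape in plain Python (the stage names, column layout, and sample schema are assumptions for illustration; in this role the same structure would be expressed with Spark DataFrames or Hive queries over data landed by Sqoop):

```python
import csv
import io

def extract(raw_csv):
    """Ingest: parse raw CSV text into dict records."""
    return csv.DictReader(io.StringIO(raw_csv))

def transform(records):
    """Transform: cast types and drop malformed rows."""
    for row in records:
        try:
            yield {"region": row["region"], "sales": float(row["sales"])}
        except (KeyError, ValueError):
            continue  # skip bad records instead of failing the pipeline

def aggregate(records):
    """Aggregate: total sales per region."""
    totals = {}
    for row in records:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["sales"]
    return totals

def run_pipeline(raw_csv):
    return aggregate(transform(extract(raw_csv)))
```

Keeping each stage a small, independently testable function is the same design discipline that makes a Spark job maintainable.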

Minimum years of experience*:
More than 5 years

Certifications Needed: Yes

Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
1. Build utilities, user defined functions, and frameworks to better enable data flow patterns
2. Research, evaluate, and utilize new technologies, tools, and frameworks centered around Hadoop and other elements in the Big Data space
3. Define and build data acquisition and consumption strategies
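Responsibility 1 mentions user-defined functions. One common pattern in the Hadoop ecosystem is a streaming UDF invoked from Hive's TRANSFORM clause, which pipes rows to a script as tab-separated lines on stdin and reads transformed rows back from stdout. A minimal sketch (the column layout and function names are assumptions):

```python
import sys

def normalize_line(line):
    """Hypothetical UDF logic: trim fields and lowercase an assumed email column."""
    fields = [f.strip() for f in line.rstrip("\n").split("\t")]
    if len(fields) >= 2:
        fields[1] = fields[1].lower()  # assumes column 2 holds an email address
    return "\t".join(fields)

def main(stdin=sys.stdin, stdout=sys.stdout):
    # Hive streams one row per line; emit one transformed row per line back.
    for line in stdin:
        stdout.write(normalize_line(line) + "\n")

if __name__ == "__main__":
    main()
```

It would be wired into a query along the lines of `SELECT TRANSFORM(id, email) USING 'python normalize.py' AS (id, email) FROM users;` (table and column names hypothetical).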