
I would like to share an excellent contract opening for a “Spark Developer.” Please go through the details below and kindly send me your updated resume.

Location: Riverwoods, IL

Job Description:

Minimum 3 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more big data and analytics services in combination with third-party tools such as Spark, Hive, and HDFS.

Minimum 3 years of designing and building production data pipelines, from ingestion to consumption, within a big data architecture using Java, Python, Scala, etc.

Minimum 3 years of architecting and implementing next-generation data and analytics platforms on Hadoop.

Minimum 3 years of experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to Hadoop.