Job Description:

Our client is looking for a Spark/Scala Developer with AWS for a long-term project in Chicago, IL; the detailed requirements are below.

Job Title: Spark/Scala Developer with AWS
Location: Chicago, IL
Duration: Long Term

Job Description:
Bachelor's degree in Computer Science or equivalent, with a minimum of 7 years of relevant experience.
A minimum of 5 years of experience in Big Data and Spark.
Work experience in Java and J2EE technologies.
A minimum of 3 years of experience with Hive, Impala, and Sqoop, and 2 years with Spark.
Work experience in the ingestion, storage, querying, processing, and analysis of Big Data, with hands-on experience in Big Data ecosystem technologies such as MapReduce, Spark, Hive, Impala, Sqoop, HBase, and Kafka.
Good understanding of machine learning algorithms in Spark, such as classification, clustering, and regression.
Experience working with Spark features such as RDD transformations, Spark SQL, and Spark MLlib.
Demonstrated excellent communication skills, including the ability to communicate effectively with internal and external customers.
Ability to use strong industry knowledge to relate to customer needs and resolve customer concerns; a high level of focus and attention to detail.
Strong work ethic and good time management, with the ability to work with diverse teams and lead meetings.