Job Description
J.O. #: 46520 WAF

Job Title: Hadoop Developers

Contract Length: 6+ months

Location: Bentonville, Arkansas

Hourly Rate: Open (DOE)

Big Data Developers (Hadoop, Hortonworks Distribution)

· Background in architectural design and implementing large-scale solutions

· 5-10 years' experience in big data development

· Experience with big data technologies: Hadoop, Hive, Spark

· Experience with Google Cloud and other cloud technologies

· Knowledge of LLAP, Presto, Cassandra/MongoDB, AWS, Druid

· Expert in at least one of these languages: Java, Scala, Python (PySpark)

Required skills:

Building a Hadoop data lake (Hortonworks/Cloudera)

Transformations to bring in data from various sources: mainframe, Teradata, MySQL

Hadoop/Spark programming, mainly with Scala or Python (PySpark); Java is also acceptable

Hive queries, HANA; the team is trying out Druid

GCP, which everything is being migrated to; Azure experience is preferred, but AWS is acceptable

Presto, Druid, Cassandra, or Mongo

Unix and Shell Scripting

For the analytics platform, bringing in data from multiple sources for reporting

The candidate could join one of two teams

Additional Information:
· 2-3 years' experience

· Any cloud experience (AWS or Azure)

· Hadoop Data Lake

· Knowledge of Java, Spark, or Scala

· Druid, Presto, and other big data experience