Job Description:
Dear,
We have an urgent requirement for a Hadoop with Spark and Scala contract position; if you are interested or have a consultant available, please share a profile ASAP.

Please find the job description below.
Role: Hadoop with Spark and Scala
Location: Charlotte, NC / Pennington, NJ
Duration: 12 months

Good knowledge of database structures, theories, principles, and practices.
Experience in Java and knowledge of Hadoop concepts (HDFS/HBase/Kudu/Spark SQL and Spark/Scala, with or without prior MapReduce experience), and the ability to write Spark/Scala RDD jobs.
Proven understanding of Hadoop, HBase, and Hive.
Ability to write Pig Latin scripts.
Familiarity with data loading tools such as Flume, Kafka, and Sqoop.
Knowledge of workflow schedulers such as Oozie.
Good aptitude for multi-threading and concurrency concepts.
Experience loading data from disparate data sources.
Hands-on experience with at least two NoSQL databases.
Ability to analyze and identify issues with an existing cluster and suggest architectural design changes.
Ability/knowledge to implement Data Governance in Hadoop clusters.
Certifications such as Cloudera Developer/Administrator are an added advantage.
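For candidates gauging what "ability to write Spark/Scala RDD jobs" means in practice, the transformation chain of a classic word-count job is a reasonable benchmark. The sketch below is illustrative only: it uses plain Scala collections (Seq) so it runs without a cluster; in a real Spark job the same chain would start from sc.textFile(...) and use reduceByKey instead of groupBy/mapValues. All names and inputs here are hypothetical.

```scala
// Sketch of the RDD-style transformation chain for a word count,
// modeled on plain Scala collections so it runs without Spark.
object RddStyleWordCount {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.toLowerCase.split("\\s+")) // RDD: flatMap lines into words
      .filter(_.nonEmpty)                   // drop empty tokens
      .map(w => (w, 1))                     // RDD: map to (word, 1) pairs
      .groupBy(_._1)                        // RDD equivalent: reduceByKey(_ + _)
      .map { case (w, pairs) => (w, pairs.map(_._2).sum) }

  def main(args: Array[String]): Unit = {
    val counts = wordCount(Seq("spark and scala", "spark on hadoop"))
    println(counts("spark")) // 2
  }
}
```

On a real cluster, groupBy/mapValues would shuffle every pair across the network, which is why idiomatic Spark uses reduceByKey to combine counts on each partition before the shuffle.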

Required Skills:
Hands on experience in coding for the following Hadoop Eco System:
Spark/Scala
Spark/RDD
Spark SQL
Hive
Oozie
Autosys
RESTful Services

Desired Skills:
Kudu
HBase
Machine Learning/Predictive Analytics
Spark Streaming
