Job Description:
Skills & Experience:
Strong development skills in Hadoop, Sqoop, Hive, Pig Latin, Spark, Scalding, and MapReduce
Knowledge of Spark, Storm, Kafka, and NoSQL databases
Excellent problem-solving and analytical skills
Proven background in distributed computing, data warehousing, ETL development, and large-scale data processing
Strong development skills in Python and SQL preferred
Build data ingestion programs
Migrate existing Pig ingestion jobs to Spark with Scala
Big Data experience in a cloud environment (AWS/GCP) required
Strong Scala skills