Job Description:
Role name: Big Data Hadoop Engineer
Location: Sunnyvale, CA
Duration (Months): 6

Experience: 4-6 Years
Competencies: Big Data Testing (Hadoop Module), Python, Apache Spark, Kafka, Scala, HBase, Microservices, Elastic Path, Core Java

Essential Skills:
* Experience with Big Data technologies (Hadoop and other frameworks in the Hadoop ecosystem, Elasticsearch, etc.)
* Working experience building ML models, neural networks, or other deep learning techniques
* Extensive experience solving analytical problems using quantitative approaches, operations research, and optimization algorithms
* Comfort manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources
* Fluency in one or more of the following programming languages: Java, Scala, Python
* Knowledge of tools such as Spark, TensorFlow, Microservices, Kafka, HDFS, HBase
* Excellent written and oral communication skills; able to communicate with all levels of internal technology and business teams
Role Description:
3+ years of experience working in an ops role supporting infrastructure using cloud technology
Hands-on experience with one of the following cloud computing platforms: Amazon Web Services (AWS), Google Cloud Platform (GCP), or Microsoft Azure

Client: Direct Client