Job Description :

Role: Big Data With Python  

Location: San Ramon, CA

Duration: Long Term

Required Skills:

 

Python, Hadoop, Spark, Kafka, Scala

 

The successful candidate will build data solutions using state-of-the-art technologies to acquire, ingest, and transform large datasets. You will design and build data pipelines to support applications and data science projects, following software engineering best practices.

Design and develop data applications using big data technologies (Hadoop, MapR, AWS) to ingest, process, and analyze large, disparate datasets. Build robust data pipelines in the cloud using Airflow, Spark/EMR, Kinesis, Kafka, Lambda, or other technologies.

Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using SQL and AWS 'big data' technologies.

Work with data and analytics experts to improve the functionality of our data systems. Implement architectures to handle large-scale data and its organization.
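For illustration only, below is a minimal sketch of the kind of ingest-transform-load pipeline work described above, assuming a PySpark environment; the bucket names, paths, and column names are hypothetical placeholders, not details taken from this posting.

# Illustrative sketch: a minimal PySpark batch pipeline (ingest, transform, load).
# All paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("example-ingest-transform-load")  # hypothetical application name
    .getOrCreate()
)

# Ingest: read raw JSON events from a hypothetical S3 landing zone.
raw = spark.read.json("s3://example-bucket/landing/events/")

# Transform: basic cleansing and a daily aggregate.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_timestamp"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned Parquet to a hypothetical curated zone.
(
    daily.write
         .mode("overwrite")
         .partitionBy("event_date")
         .parquet("s3://example-bucket/curated/daily_event_counts/")
)

spark.stop()

In practice, a job of this kind would typically run on a cluster such as EMR and be scheduled by an orchestrator such as Airflow, as noted above.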

Anish Antrive

Senior Recruiter

Nityo Infotech Corp.
Suite 1285, 666 Plainsboro Road
Plainsboro, NJ 08536


Desk EXT 4005




Client: Nityo Infotech

             
