Job Description:
Title: AWS Data Engineer
Location: Chicago, IL
Contract: Long term
Interview: Video interview

1st Position:

Hands-on working experience with the AWS Glue ETL tool
ETL process implementation using AWS Glue
Expertise with Apache Spark / PySpark
Hands-on exposure to AWS analytics services, particularly Lambda, Step Functions, EMR, Athena, Glue, and Data Pipeline
Good to have: knowledge of Snowflake ETL integration
Hands-on experience with a programming language: Scala, Python, R, or Java
Strong database knowledge
Proven success in communicating with users, other technical teams, and senior management to collect requirements and describe data modeling decisions and data engineering strategy
Understanding of Agile development.
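The first position centers on building ETL jobs with Glue and PySpark. A minimal sketch of the extract-transform-load pattern such jobs follow, in plain Python with hypothetical field names (a real Glue job would use awsglue/PySpark DynamicFrames rather than stdlib CSV handling):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts (stand-in for a read from S3)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Keep completed orders and cast amounts to float (illustrative rule)."""
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"])}
        for r in rows
        if r["status"] == "completed"
    ]

def load(rows: list[dict]) -> str:
    """Serialize back to CSV text (stand-in for a write to S3 or Snowflake)."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["order_id", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

raw = "order_id,status,amount\n1,completed,19.99\n2,cancelled,5.00\n"
result = load(transform(extract(raw)))
```

In a Glue job the same three stages map onto reading a source DynamicFrame, applying mappings/filters, and writing to a sink.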


2nd Position:

Experience with AWS Glue clusters, ECS Fargate, Amazon S3, SNS, SQS, Snowflake, AWS RDS Aurora, and Apache Spark with Python.
Knowledge of writing AWS Lambda functions in Python and Java.
Design, develop, and maintain ETL workflows and jobs to and from Snowflake and Amazon S3.
Develop, deploy, and maintain data pipelines using AWS CloudFormation.
Able to read and write files/data from Amazon S3.
Exposure to designing and developing Amazon APIs over Lambda, and CI/CD for the same.
Good knowledge of cloud security implementation practices.
Experience working in Agile teams and working independently with business stakeholders, providing solutions and regular updates.
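The second position's core skill is writing Lambda functions in Python, often triggered by SQS. A minimal handler sketch (the record `body` layout beyond the standard SQS envelope, and the `order_id` field, are illustrative assumptions):

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler for an SQS-triggered function.

    'Records' and each record's 'body' follow the standard SQS-to-Lambda
    event format; the JSON payload inside the body is a made-up example.
    """
    processed = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        processed.append(body["order_id"])
    return {"statusCode": 200, "processed": processed}

# Local invocation with a sample SQS event (context is unused here):
sample_event = {"Records": [{"body": json.dumps({"order_id": "A-100"})}]}
result = handler(sample_event, None)
```

The same `handler(event, context)` signature is what Lambda invokes in production; only the event source wiring (SQS trigger, IAM role) differs.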


Thanks & Regards,

Kartik Singh

Technical Recruiter| Xchange Software Inc.

10 Austin Avenue, Iselin, NJ - 08830.

E-mail :
             
