Job Description:



Job Title: Big Data Engineer / PySpark Developer (10+ years)

Job ID: 2021-10194

Job Location: Remote

Duration: 6 months, extendable

Must Have: PySpark, AWS Glue, ETL

No H1B


Job Responsibilities:

  • Design and implement ETL routines on AWS.
  • Mentor junior developers on agile engineering best practices through pair programming.
  • Advocate for modular, testable code implementations.
  • Drive the testing automation pyramid and integrate with CI/CD tools for continuous validation.
  • Understand when to automate and when not to.
  • Drive a culture in which code and quality are owned by the entire team.
  • Identify code defects and work with other developers to address quality issues in functional code.
  • Find bottlenecks and thresholds in existing code using automation tools.
  • Demonstrate a passion for continued learning and improving code quality.
  • Triage bugs, failed test cases, and system failures.
  • Create and optimize infrastructure performance metrics.
  • Complete detailed peer code reviews and drive pair programming.
  • Develop and deploy cloud applications end to end.
  • Architect pilot or proof-of-concept (PoC) efforts to bring innovation to delivery.
  • Work across all stages of the environment lifecycle.
  • Automate manual data object creation and test cases.
  • Ask smart questions, take risks, and champion new ideas.


Skills and Experience Required:

  • Hands-on PySpark development experience
  • Experience with the AWS Glue framework (must have)
  • ETL development experience
  • Exposure to Terraform cloud development (non-DevOps)



Animesh Saurrabh

Voto Consulting LLC.

1549 Finnegan Lane, 2nd floor North Brunswick, NJ 08902 


Hangout: animeshsaurah07 


