Job Description:

Job Title: ETL Python Developer

Location: McLean, VA (5 Days Onsite)

Job Summary:

We are seeking an experienced ETL Python Developer to design, develop, and implement robust data pipelines and integration solutions. The ideal candidate will have strong experience in Python-based ETL frameworks, data modeling, and cloud-based data engineering (preferably AWS). You will collaborate closely with data engineers, analysts, and business teams to ensure data integrity, performance, and scalability across the organization.

Key Responsibilities:

  • Design, develop, and maintain ETL pipelines using Python and modern data frameworks.
  • Extract, transform, and load data from multiple structured and unstructured sources.
  • Implement data validation, error handling, and logging mechanisms for ETL workflows.
  • Optimize ETL processes for performance and scalability.
  • Collaborate with data architects to design data models and schemas that meet business requirements.
  • Work with cloud platforms (AWS / Azure / GCP), focusing on storage, compute, and orchestration services (e.g., AWS Glue, Lambda, S3, Redshift).
  • Automate workflows using Airflow, Luigi, or similar orchestration tools.
  • Develop reusable components and scripts for data ingestion and transformation.
  • Conduct unit testing, troubleshoot issues, and ensure data quality and accuracy.
  • Participate in code reviews, agile ceremonies, and continuous integration/deployment (CI/CD) processes.
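The core workflow these responsibilities describe — extract, validate, transform, and load with logging and error handling — could be sketched roughly as follows. This is only an illustrative outline, not a reference implementation: the table, column names, and in-memory source are hypothetical, and sqlite3 stands in for a production target such as PostgreSQL or Redshift.

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def extract(rows):
    """Extract step: rows come from an in-memory source here;
    in practice this would read from S3, an API, or a database."""
    log.info("extracted %d rows", len(rows))
    return rows

def validate(rows):
    """Data validation: drop rows that fail basic checks, logging each rejection."""
    clean = []
    for row in rows:
        if row.get("id") is None or row.get("amount", 0) < 0:
            log.warning("rejected row: %r", row)
            continue
        clean.append(row)
    return clean

def transform(rows):
    """An illustrative transformation: normalize dollar amounts to integer cents."""
    return [{"id": r["id"], "cents": int(round(r["amount"] * 100))} for r in rows]

def load(rows, conn):
    """Load step into a relational target (sqlite3 used for the sketch)."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER, cents INTEGER)")
    conn.executemany("INSERT INTO payments VALUES (:id, :cents)", rows)
    conn.commit()

def run_pipeline(source_rows, conn):
    """Orchestrate the steps with error handling; in production this
    would typically be an Airflow task or AWS Glue job."""
    try:
        loaded = transform(validate(extract(source_rows)))
        load(loaded, conn)
        return len(loaded)
    except Exception:
        log.exception("pipeline failed")
        raise

conn = sqlite3.connect(":memory:")
n = run_pipeline([{"id": 1, "amount": 9.99}, {"id": None, "amount": 5.0}], conn)
```

In a real deployment, each stage would be a separately testable unit invoked by an orchestrator (e.g., Airflow), which is what makes the validation, logging, and error-handling boundaries above useful.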

Required Skills and Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 5+ years of professional experience in ETL development and data engineering.
  • Strong programming skills in Python (Pandas, PySpark, SQLAlchemy, etc.).
  • Hands-on experience with SQL and relational databases (PostgreSQL, MySQL, Oracle, SQL Server).
  • Expertise in data extraction, cleansing, transformation, and loading processes.
  • Experience with AWS services like S3, Redshift, Glue, Lambda, and IAM.
  • Familiarity with data orchestration tools (Apache Airflow, Step Functions, etc.).
  • Knowledge of version control (Git) and CI/CD pipelines.
  • Experience working in Agile / Scrum environments.