Job Description
Roles & Responsibilities
- Design, develop, and optimize ETL workflows and data pipelines using Snowflake and AWS services.
- Implement data ingestion from various sources including APIs, databases, and flat files.
- Ensure data quality, integrity, and consistency across all ETL processes.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements.
- Monitor and troubleshoot ETL jobs and performance issues.
- Automate data workflows and implement CI/CD practices for data pipeline deployment.
- Maintain documentation for ETL processes, data models, and data flow diagrams.
Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 12+ years of experience in ETL development and data engineering.
- Hands-on experience with Snowflake including data modeling, performance tuning, and SQL scripting.
- Proficiency in AWS services such as S3, Lambda, Glue, Redshift, and CloudWatch.
- Strong programming skills in Python or Scala for data processing.
- Experience with orchestration tools like Apache Airflow or AWS Step Functions.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.