Job Description:

Role: Sr. ETL Architect (Snowflake & AWS)
Bill Rate: $88/hour C2C
Location: Windsor, CT
Duration: 12+ months (long-term)
Interview Criteria: Telephonic + Zoom
Direct Client Requirement

Job Description:

  • We are seeking an experienced ETL Architect with strong expertise in Snowflake and the AWS cloud ecosystem to design, develop, and optimize data integration solutions. The ideal candidate will architect scalable ETL pipelines, enabling efficient data movement, transformation, and integration across enterprise systems to support business intelligence, analytics, and advanced data initiatives.

Key Responsibilities

  • Design and implement ETL architecture, frameworks, and best practices for large-scale data integration projects.
  • Architect and optimize Snowflake-based data warehouse solutions including schema design, performance tuning, and query optimization.
  • Lead the migration of existing data platforms to AWS cloud-native solutions.
  • Develop robust data pipelines using ETL/ELT tools, SQL, and Python/Scala.
  • Integrate data from diverse sources (databases, APIs, streaming platforms, flat files, SaaS applications).
  • Implement data governance, metadata management, and security best practices across data pipelines.
  • Collaborate with data engineers, analysts, and business stakeholders to ensure high-quality, reliable data delivery.
  • Drive automation of data workflows, monitoring, and error handling.
  • Provide technical leadership, mentoring, and architectural guidance to engineering teams.

Required Skills & Qualifications

  • 10+ years of experience in data engineering, ETL development, or data architecture.
  • Proven expertise with Snowflake (Data Modeling, Virtual Warehouses, Query Performance Tuning, Security & Access Control).
  • Strong knowledge of AWS services such as S3, Redshift, Glue, Lambda, EC2, RDS, CloudFormation, and IAM.
  • Hands-on experience with ETL/ELT design and implementation using tools like Informatica, Talend, Matillion, AWS Glue, or equivalent.
  • Strong programming and scripting skills (SQL, Python, Shell, Scala, or Java).
  • Solid understanding of data lake and data warehouse architectures.
  • Experience with streaming technologies (Kafka, Kinesis, Spark Streaming) is a plus.
  • Familiarity with DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) for data deployments.
  • Excellent problem-solving, communication, and leadership skills.

Preferred Qualifications

  • Snowflake certification (SnowPro Advanced: Architect).
  • Experience with modern data orchestration frameworks (Airflow, dbt, Step Functions).
  • Background in big data ecosystems (Hadoop, Spark).

Note: If you are interested, please share your updated resume and suggest the best number and time to connect with you. If your resume is shortlisted, one of the IT Recruiters from my team will contact you as soon as possible.

Srinivasa Reddy Kandi

Client Delivery Manager

Valiant Technologies LLC

Equal Opportunity Employer:

We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate based on race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.
