Job Description:

GCP Data Engineer

Location: Dallas, TX or Hartford, CT (remote may be considered as an exception for exceptionally strong candidates)

Rate: $40/hr on W2 only (no C2C)

Long-Term Contract

 

The goal is to migrate data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP). The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.

 

Key Responsibilities:

•  Lead and execute the migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).

•  Analyze and map existing Teradata workloads to appropriate GCP equivalents.

•  Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).

•  Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.

•  Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python/Java).

•  Optimize data storage, query performance, and costs in the cloud environment.

•  Implement monitoring, logging, and alerting for all migration pipelines and production workloads.

______________

Required Skills:

•  6+ years of experience in Data Engineering, with at least 2 years in GCP.

•  Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.

•  Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.

•  Experience with ETL/ELT pipelines using tools like Informatica, Apache Beam, or custom scripting (Python/Java).

•  Proven ability to refactor and translate legacy logic from Teradata to GCP.

•  Familiarity with CI/CD, Git, and DevOps practices in cloud data environments.

•  Strong analytical, troubleshooting, and communication skills.

______________

Preferred Qualifications:

•  GCP certification (e.g., Professional Data Engineer).

•  Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.

•  Experience working in the healthcare, retail, or finance domains.

•  Knowledge of data governance, security, and compliance in cloud ecosystems.

 

 

abhishek at dynasticx dot com

             
