Job Description:

Job Title: GCP Data Engineer

Location: New York, NY

Duration: 6+ months, with possible extension

Key Responsibilities:

• Design, build, and maintain scalable data pipelines on GCP, primarily using BigQuery.

• Develop and manage DBT models to transform raw data into clean, tested, and documented datasets.

• Write complex and optimized SQL queries for data extraction, transformation, and analysis.

• Implement and maintain data warehousing solutions, ensuring performance, scalability, and reliability.

• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.

• Monitor and troubleshoot data pipeline performance and data quality issues.

• Automate data workflows and tasks using Python scripts where necessary.

• Ensure data governance, security, and compliance standards are met.

Required Qualifications:

• Bachelor’s degree in Computer Science, Information Systems, or a related field.

• 8+ years of experience in data engineering or a similar role.

• Strong expertise in SQL with the ability to write efficient, complex queries.

• Proficiency in DBT for data modeling and transformation.

• Hands-on experience with BigQuery and other GCP data services.

• Solid understanding of data warehousing principles and best practices.

• Basic to intermediate skills in Python for scripting and automation.

• Familiarity with version control systems like Git.

• Excellent problem-solving and communication skills.
