Job Description:
Job Title: Senior Data Engineer
Location: Cary, NC
Experience: 12+ Years
About the Role

We are looking for an experienced Senior Data Engineer to lead the design and development of modern data platforms and scalable data pipelines. The ideal candidate has strong hands-on expertise in cloud data engineering, big data technologies, ETL/ELT architecture, and data modeling, along with the ability to mentor teams and work closely with business stakeholders to deliver high-quality data solutions.

Key Responsibilities
  • Architect, design, and implement large-scale data pipelines and data integration workflows across structured and unstructured datasets.

  • Build, optimize, and maintain robust ETL/ELT processes for ingestion, transformation, and delivery of data across enterprise systems.

  • Develop and manage data lakes, data warehouses, and analytics platforms using cloud technologies.

  • Work closely with data scientists, analysts, and business teams to understand data requirements and deliver reliable data for reporting and analytics.

  • Define best practices for data quality, lineage, governance, security, and performance optimization.

  • Lead and mentor junior engineers, participate in design/code reviews, and ensure engineering excellence.

  • Collaborate with cross-functional teams in an Agile environment for solution design and implementation.

  • Own production deployment, monitoring, debugging, performance tuning, and incident management for data pipelines.

Required Skills & Experience
  • 12+ years of experience in Data Engineering, Data Architecture, or similar roles.

  • Strong programming skills in Python / Java / Scala.

  • Expertise in SQL and performance tuning for large datasets.

  • Hands-on experience with Big Data ecosystems such as Hadoop, Spark, Kafka, Hive, and HBase.

  • Strong experience with cloud platforms (AWS / Azure / GCP) and services such as:

    • AWS: S3, Glue, EMR, Redshift, Lambda, Kinesis

    • Azure: Data Factory, Synapse, Databricks, ADLS

    • GCP: BigQuery, Dataflow, Pub/Sub

  • Experience with Data Warehouse / Data Lake / Lakehouse design and modeling (Kimball, OLAP, OLTP, Star/Snowflake schemas).

  • Proficiency in CI/CD, Git, Docker, Kubernetes, and Airflow or similar orchestration tools.

  • Knowledge of data governance, security, metadata management, and data quality frameworks.

Nice to Have
  • Experience with Databricks / Snowflake / Delta Lake.

  • Exposure to ML pipelines, MLOps, and real-time streaming data processing.

  • Experience in designing scalable solutions for enterprise-level environments.

Education
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
