Job Description:

Job Title: Senior ETL Developer

Location: North Carolina

Mode of Hire: Onsite / W2
Experience Level: 10–15 years

Job Summary

We are seeking a highly experienced Senior ETL Developer to design, develop, and optimize enterprise-level data integration solutions. The ideal candidate will have a deep understanding of ETL processes, data warehousing, and performance tuning, along with experience working in complex data ecosystems. This role requires strong technical expertise, sharp problem-solving skills, and the ability to collaborate with cross-functional teams to ensure the accuracy, integrity, and performance of data pipelines.

Key Responsibilities

- Design, develop, and maintain robust ETL processes to support data movement, transformation, and loading across multiple systems.
- Analyze business and data requirements to translate them into technical specifications and ETL workflows.
- Optimize ETL performance, including data loading, transformation logic, and job scheduling.
- Work closely with data architects, analysts, and application teams to ensure consistent and accurate data delivery.
- Implement best practices for data quality, validation, and governance.
- Troubleshoot and resolve issues in existing ETL jobs and data pipelines.
- Develop and maintain technical documentation for ETL design, mappings, and workflows.
- Participate in data migration and modernization initiatives, including cloud data integration.
- Ensure compliance with data security, privacy, and audit standards.

Required Skills & Qualifications

- Bachelor’s degree in Computer Science, Information Systems, or a related field (Master’s preferred).
- 10–15 years of experience in ETL development and data warehousing.
- Strong proficiency in one or more ETL tools such as Informatica PowerCenter, Talend, SSIS, DataStage, or Azure Data Factory.
- Advanced SQL and PL/SQL skills with strong database experience (Oracle, SQL Server, Snowflake, or similar).
- Hands-on experience with data modeling, data integration, and data quality frameworks.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data architectures (e.g., data lakes, data mesh).
- Strong understanding of data warehousing concepts (star/snowflake schemas, slowly changing dimensions, etc.).
- Experience with scripting languages (Python, Shell, etc.) for automation.
- Excellent analytical, problem-solving, and communication skills.

Preferred Skills

- Experience with real-time data integration and streaming platforms (Kafka, AWS Glue, etc.).
- Exposure to DevOps, CI/CD pipelines, and version control tools (Git, Jenkins).
- Working knowledge of data governance and metadata management tools.
- Background in finance, healthcare, or retail domains is a plus.