Job Description:

Dear Partner,

Good Morning,
Greetings from Nukasani Group Inc! We have an urgent, long-term contract project immediately available for an IT Professional – Data Engineering & Integration in Raleigh, NC (onsite), and we are accepting submissions now. Please review the role below; if you are available, please send me your updated Word resume along with the candidate submission details listed below as soon as possible. If you are not available, any referrals would be greatly appreciated.

Interviews are in progress, so an urgent response is appreciated. I look forward to your prompt response and to working with you.

Candidate Submission Format - Needed from You
Full Legal Name
Personal Cell No. (not a Google phone number)
Email ID
Skype ID
Interview Availability
Availability to start, if selected
Current Location
Open to Relocate
Work Authorization
Total Relevant Experience
Education / Year of Graduation
University Name, Location
Last 4 digits of SSN
Country of Birth
Contractor Type
Date of Birth (mm/dd)
Home Zip Code

Assigned Job Details

Job Title: IT Professional – Data Engineering & Integration
Location: Raleigh, NC (Onsite)
Rate: Best competitive rate

Position Overview

We are seeking an experienced Data Engineering & Integration Professional to design, build, and maintain high-performance data pipelines that power our Enterprise Data Warehouse (EDW), Operational Data Store (ODS), and Data Marts. The role involves maintaining existing SAS ETL processes while also transitioning to modern ETL/ELT tools for future-state solutions in a cloud and hybrid data environment.

Key Responsibilities

  • Data Pipeline Development: Design, develop, and optimize scalable ETL/ELT processes for data integration across EDW, ODS, and Data Marts.

  • SAS ETL Support: Maintain and enhance existing SAS scripts while preparing for migration to other ETL tools.

  • Data Architecture & Modeling: Create logical and physical data models using relational and dimensional modeling best practices.

  • Cloud Data Solutions: Implement and optimize solutions using Snowflake, AWS Redshift, and Google BigQuery.

  • Performance Optimization: Tune data pipelines and database objects for optimal performance.

  • Version Control & Deployment: Utilize GitLab for source control and CI/CD workflows.

  • Data Integrity: Troubleshoot and resolve data quality, transformation, and integration issues.

Required Technical Skills

  • ETL Tools:

    • Current: SAS ETL (maintenance and enhancement)

    • Future State: SAP BusinessObjects Data Services (BODS), Informatica Intelligent Cloud Services (IICS), IBM DataStage

  • ELT Tools: dbt, Fivetran, AWS Glue

  • Databases: Oracle, Netezza, Teradata, SQL Server

  • Cloud Platforms: Snowflake, AWS, Google BigQuery

  • Programming: Advanced SQL, Python (or other scripting languages)

  • Data Modeling: Relational and dimensional modeling, high-volume ETL/ELT processing

  • Performance Tuning: Expertise in optimizing queries, data pipelines, and DB structures

Preferred Experience

  • Large-scale data warehousing environments

  • Public cloud-based data platforms

  • Advanced data architecture and design patterns

  • GitLab-based CI/CD pipelines

With Gratitude,
Bhavani, Recruiting Manager | Nukasani Group Inc
Email:
540 W Galena Blvd, Suite 200, Aurora, IL 60506
People, Process, Technology Integrator
An E-Verified Company
