Role: Network Data Architect
Domain: Telecom with OSS/BSS is a must
Detailed Job Description:
Primary skills: Cloud, Big Data, data warehouse, data modeling, Microservices, K8s/Docker
In the role of Data Architect, you will be a senior-level strategic professional responsible for designing, building, and managing the organization's data architecture, strategies, and solutions. This role ensures the integrity, security, availability, and usability of data across the enterprise to support b…
Role: SQL Server & SSIS ETL Developer
Bill Rate: $80/hour C2C
Location: Warren, NJ
Duration: 12+ months / long-term
Interview Criteria: Telephonic + Zoom
Direct Client Requirement
Job Description:
1. Lead and guide developers responsible for ETL development, data pipelines, and data warehouse loads.
2. Design, develop, and optimize complex SQL queries, stored procedures, functions, and views.
3. Build and maintain workflows for data ingestion, cleansing, transformation, and transfer using SQL Server…
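A minimal sketch of the kind of SQL Server load step this posting describes: a Python script (using pyodbc) that calls a stored procedure as one stage of an ETL workflow. The server, database, and procedure names are illustrative assumptions, not part of the posting.

```python
# Sketch: invoke a SQL Server stored procedure as one step of an ETL load.
# Server, database, and procedure names are hypothetical placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=etl-sql01;DATABASE=DataWarehouse;Trusted_Connection=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # Stage raw rows, then call a cleansing/transform procedure for the batch.
    cursor.execute("EXEC dbo.usp_LoadCustomerDim @BatchDate = ?", "2024-01-31")
    conn.commit()
```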
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES - PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413
Bill Rate: $65-70/hr
MSP Owner: Kelly Gosciminski
Location: New York, NY - hybrid onsite
Duration: 6 months
GBaMS ReqID: 10272565
- Experience: 10 years of experience in data engineering or a related role.
- Python Proficiency: Strong proficiency in Python programming, including experience with data manipulation libraries such as Pandas and NumPy.
- Data…
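A minimal, hypothetical sketch of the kind of Pandas/NumPy data manipulation this posting asks for; the file name and column names are illustrative assumptions.

```python
# Sketch: typical Pandas/NumPy cleansing and aggregation for a data-engineering task.
# File name and column names are hypothetical placeholders.
import numpy as np
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Basic cleansing: drop duplicate keys, fill missing amounts, derive a flag column.
orders = orders.drop_duplicates(subset=["order_id"])
orders["amount"] = orders["amount"].fillna(0.0)
orders["is_large"] = np.where(orders["amount"] > 1_000, 1, 0)

# Aggregate to a monthly summary ready for warehouse loading.
monthly = (
    orders.set_index("order_date")
          .resample("MS")["amount"]
          .agg(["sum", "mean", "count"])
          .rename(columns={"sum": "total", "mean": "avg", "count": "orders"})
)
print(monthly.head())
```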
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES - PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413
Bill Rate: $75 MAX
MSP Owner: Bader Almubarak
Location: Plano, TX
Duration: 6 months
GBaMS ReqID: 10251264
Data Engineer
- Design, build, and monitor ETL/ELT pipelines for scalable data processing.
- Automate data ingestion using APIs and scripting tools.
- Implement automated health checks and pipeline monitoring systems.
- Work with tec…
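A small sketch of what "automate data ingestion using APIs" with a basic health check might look like in Python; the endpoint URL, parameters, and output path are illustrative assumptions, not details from the posting.

```python
# Sketch: automated API ingestion with a simple health check before loading.
# The endpoint URL and output path are hypothetical placeholders.
import json
import sys

import requests

API_URL = "https://api.example.com/v1/events"

def healthy(url: str) -> bool:
    """Lightweight health check: endpoint must answer 200 within 5 seconds."""
    try:
        return requests.get(url, params={"limit": 1}, timeout=5).status_code == 200
    except requests.RequestException:
        return False

def ingest(url: str, out_path: str) -> int:
    resp = requests.get(url, params={"limit": 1000}, timeout=30)
    resp.raise_for_status()
    records = resp.json()
    with open(out_path, "w", encoding="utf-8") as fh:
        json.dump(records, fh)
    return len(records)

if __name__ == "__main__":
    if not healthy(API_URL):
        sys.exit("Health check failed; skipping ingestion run.")
    print(f"Ingested {ingest(API_URL, 'events.json')} records")
```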
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES - PTN_US_GBAMSREQID_CANDIDATEBEELINEID, i.e. PTN_US_9999999_SKIPJOHNSON0413
Bill Rate: $75-95/hr
MSP Owner: Felix Avalos
Location: Irvine, CA
Duration: 6 months
GBaMS ReqID: 10219388
Data Center Engineer
- Design and develop data pipelines using Matillion or ADF and Snowflake.
- Hands-on experience in designing and developing ETL framework tables.
- Need to know SFDC and data modeling.
- Design solutions leveraging Snowflake native ca…
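A minimal sketch of one load step in a Snowflake pipeline of the kind this posting (and the similar one below) describes. In practice Matillion or ADF would orchestrate the step; the Python connector call here only illustrates the load, and the account, stage, and table names are hypothetical assumptions.

```python
# Sketch: load a staged file into a Snowflake table as one pipeline step.
# Account, credentials, stage, and table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",
    user="ETL_SVC",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # COPY INTO pulls the staged file into a raw table; downstream transforms
    # would then populate the modeled (e.g. SFDC-sourced) ETL framework tables.
    cur.execute(
        "COPY INTO STG_ACCOUNTS FROM @RAW_STAGE/accounts/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
finally:
    conn.close()
```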
About HCLTech
HCLTech is a global technology company, spread across 60 countries, delivering industry-leading capabilities centered around digital, engineering, cloud and AI, powered by a broad portfolio of technology services and products. We work with clients across all major verticals, providing industry solutions for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. We're powered by our people, a gl…
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES - PTN_US_GBAMSREQID_CANDIDATEBEELINEID, i.e. PTN_US_9999999_SKIPJOHNSON0413
Bill Rate: $59/hr
MSP Owner: Felix Avalos
Location: Irving, CA
Duration: 6 months
GBaMS ReqID: 10219387
Data Center Engineer
- Design and develop data pipelines using Matillion or ADF and Snowflake.
- Hands-on experience in designing and developing ETL framework tables.
- Need to know SFDC and data modeling.
- Design solutions leveraging Snowflake nati…
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES - PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413
Bill Rate: $65.00 hourly
MSP Owner: Shilpa Bajpai
Location: Cleveland, OH
Duration: 6 months
GBaMS ReqID: 10295618
Role:
Experience: 8-10 years
Skills: Big Data and Hadoop Ecosystems, Banking and Financial Technology, Ab Initio, Teradata, MySQL, Unix/Linux Basics and Commands
- Lead and own all technical aspects of ETL projects from requirement till implementation…
Job Description
We are seeking a skilled Data Engineer with strong experience in Microsoft Fabric, Data Lake, Data Warehouse, SQL, and ETL development. The ideal candidate will have hands-on expertise in designing and building scalable data pipelines to support enterprise analytics and reporting needs.
Responsibilities
Design, develop, and maintain data pipelines and ETL processes to ingest, transform, and deliver data across the enterprise.
Work with Microsoft Fabric, Data Lake, …
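A minimal sketch of the kind of pipeline step a Fabric Spark notebook might run for this role: reading raw files from a lake path and writing a curated table for reporting. The paths, columns, and table names are illustrative assumptions.

```python
# Sketch: simple ETL step reading raw lake files and writing a curated Delta table.
# Paths, columns, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_etl").getOrCreate()

raw = spark.read.option("header", True).csv("Files/raw/sales/*.csv")

curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .groupBy("region")
       .agg(F.sum("amount").alias("total_amount"))
)

# Write to a lakehouse table for downstream analytics and reporting.
curated.write.mode("overwrite").format("delta").saveAsTable("curated_sales_by_region")
```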