Job Description
We are seeking a skilled Data Engineer with strong experience in Microsoft Fabric, Data Lake, Data Warehouse, SQL, and ETL development. The ideal candidate will have hands-on expertise in designing and building scalable data pipelines to support enterprise analytics and reporting needs.
Responsibilities
Design, develop, and maintain data pipelines and ETL processes to ingest, transform, and deliver data across the enterprise.
Work with Microsoft Fabric, Data Lake,
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES
PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413
Bill Rate: $65-70/hr
MSP Owner: Kelly Gosciminski
Location: New York, NY - hybrid onsite
Duration: 6 months
GBaMS ReqID: 10272565

- Experience: 10 years of experience in data engineering or a related role.
- Python Proficiency: Strong proficiency in Python programming, including experience with data manipulation libraries such as Pandas and NumPy.
- Data
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES
Bill Rate: $60.00 - $70.00 Hourly
PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413
MSP Owner: Shilpa Bajpai
Location: Columbus, OH
Duration: 6 months
GBaMS ReqID: 10226237
Exp: 6-8 Years
Skills: Python, PySpark

We are seeking a highly skilled and motivated Data Engineer to join our innovative team. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data
Job Description:
ALL CAPS, NO SPACES BETWEEN UNDERSCORES
Bill Rate:
PTN_US_GBAMSREQID_CandidateBeelineID, Example: PTN_US_9999999_SKIPJOHNSON0413
MSP Owner: Michelle Lee
Location: Bentonville, AR
Duration: 6 months
GBaMS ReqID: 10279411

Ideal candidates should be:
* Well versed with Hadoop, Spark, Cloud, Python/Scala and Java, Streaming, Kafka, Backend, J2EE. You evangelize an extremely high standard of code quality, system reliability, and performance.
* You have a proven tra
Job Description:
ALL CAPS, NO SPACES BETWEEN UNDERSCORES
Bill Rate:
PTN_US_GBAMSREQID_CandidateBeelineID, Example: PTN_US_9999999_SKIPJOHNSON0413
MSP Owner: Michelle Lee
Location: Bentonville, AR
Duration: 6 months
GBaMS ReqID: 10279412

Ideal candidates should be:
* Well versed with Hadoop, Spark, Cloud, Python/Scala and Java, Streaming, Kafka, Backend, J2EE. You evangelize an extremely high standard of code quality, system reliability, and performance.
* You have a proven tra
Pay Range: $60k - $130k/year
Job Title: Senior Data Engineer - AWS
Location: Columbus, OH (Onsite/Hybrid as per client requirement)

Job Summary
We are seeking a highly skilled Senior Data Engineer with strong expertise in AWS cloud services, Python, and PySpark to join our team in Columbus, OH. The ideal candidate will design, build, and optimize scalable data pipelines, ensuring high data quality, performance, and reliability. This role requires hands-on technical expertise, strong problem-solving skills, and the abil