Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES
PTN_US_GBAMSREQID_CandidateBeelineID
i.e. PTN_US_9999999_SKIPJOHNSON0413

Bill Rate: $65-70/hr
MSP Owner: Kelly Gosciminski
Location: New York, NY - hybrid onsite
Duration: 6 months
GBaMS ReqID: 10272565

- Experience: 10 years of experience in data engineering or a related role.
- Python Proficiency: Strong proficiency in Python programming, including experience with data manipulation libraries such as Pandas and NumPy.
- Data
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES
PTN_US_GBAMSREQID_CandidateBeelineID
i.e. PTN_US_9999999_SKIPJOHNSON0413

Bill Rate: $60.00 - $80.00 Hourly NOOT
MSP Owner: Rob Finton
Location: Atlanta, GA
Duration: 6 months
GBaMS ReqID: 10289397

Digital: Application Programming Interface (API)
Digital: Amazon Web Service (AWS) Cloud Computing
Advanced Java Concepts
Angular 13+

Top 5 Must-have skillsets:
1. Java, J2EE, Spring, Spring Batch, JS, Angular, Shell Scripting
Job Description:
ALL CAPS, NO SPACES BETWEEN UNDERSCORES
PTN_US_GBAMSREQID_CandidateBeelineID
Example: PTN_US_9999999_SKIPJOHNSON0413

Bill Rate:
MSP Owner: Michelle Lee
Location: Bentonville, AR
Duration: 6 months
GBaMS ReqID: 10279481

Job Title: Technical Architect - Facility Maintenance Platform
Experience: 10+ years
Start Date: Within 2 weeks

Overview:
We are seeking a Technical Architect to lead the design and technical vision of a new, enterprise-grade facility ma
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES
PTN_US_GBAMSREQID_CandidateBeelineID
i.e. PTN_US_9999999_SKIPJOHNSON0413

Bill Rate: $XX - $XX
MSP Owner: Michelle Lee
Location: Bentonville
Duration: 6 months
GBaMS ReqID: 10279479

Role Description:
Experience in building n-tier, highly scalable, fault-tolerant, reactive microservices using Java/Scala/Go, Camel, Spring, Apache Tomcat, JBoss, and RESTful architecture.
Building cutting-edge, next-generation reactive microsystem
Role: SQL Server & SSIS ETL Developer
Bill Rate: $80/hour C2C
Location: Warren, NJ
Duration: 12+ months / long-term
Interview Criteria: Telephonic + Zoom
Direct Client Requirement

Job Description:
1. Lead and guide developers responsible for ETL development, data pipelines, and data warehouse loads.
2. Design, develop, and optimize complex SQL queries, stored procedures, functions, and views.
3. Build and maintain workflows for data ingestion, cleansing, transformation, and transfer using SQL Server
Job Description:
ALL CAPS, NO SPACES B/T UNDERSCORES
PTN_US_GBAMSREQID_CandidateBeelineID
i.e. PTN_US_9999999_SKIPJOHNSON0413

Bill Rate: $60.00 - $80.00 Hourly NOOT
MSP Owner: Rob Finton
Location: Atlanta, GA or Minneapolis, MN
Duration: 6 months
GBaMS ReqID: 10289415

Digital: Amazon Web Service (AWS) Cloud Computing
Digital: Artificial Intelligence (AI)
Digital: Python for Data Science, AI & Gen AI, Products & Tools
Advanced Java Concepts

TOP FIVE SKILLSETS:
1. Work/educat
Responsibilities:
- Contribute to the design, development, and deployment of the firm’s central data marketplace platform, ensuring scalability, performance, reliability, and security to serve enterprise-wide business needs.
- Architect and build modern, user-friendly, and highly responsive full-stack applications that enable seamless discovery, access, and governance of data assets across the organization.
- Architect and build event-driven processes such as data subscriptions and distribut
Job Description
We are seeking a skilled Data Engineer with strong experience in Microsoft Fabric, Data Lake, Data Warehouse, SQL, and ETL development. The ideal candidate will have hands-on expertise in designing and building scalable data pipelines to support enterprise analytics and reporting needs.
Responsibilities
- Design, develop, and maintain data pipelines and ETL processes to ingest, transform, and deliver data across the enterprise.
- Work with Microsoft Fabric, Data Lake,