Hi, hope you are doing well! I have an urgent position. Kindly go through the job description and let me know if this would be of interest to you.
Job Title: Sr. Data Engineer
Location: Hybrid work in Raleigh, NC (local candidates needed)
Duration: 12+ Months Contract
*While sharing a resume, please mention the consultant's location and visa status.*
Job Description: The client is requesting that candidates provide 2 references prior to final selection. The references must be recent and from a Sr. Manager/Director.
|
Job Description:
Candidate submission format (ALL CAPS, NO SPACES BETWEEN UNDERSCORES): PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413
Bill Rate: 65
MSP Owner: Andres Villegas
Location: Plano, TX
Duration: 6 months
GBaMS ReqID: 10241708
Role Description: Must Have - Snowflake developer with 6+ years of experience and knowledge of SQL, Python, and working with EC2, GitHub, and JIRA. Should have held a Data Engineer role (6 years).
Nice to Have: MFT, Autosys, Jenkins, Postman
Competencies: Digital: Snowflake, E
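As a rough illustration of the Snowflake-plus-Python work this posting describes, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, and table name are placeholders, not details from the posting.

```python
# Minimal sketch: querying Snowflake from Python with snowflake-connector-python.
# Account, credentials, and the orders table below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical account identifier
    user="my_user",
    password="my_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # A simple aggregation over a hypothetical orders table.
    cur.execute(
        """
        SELECT order_date, COUNT(*) AS order_count
        FROM orders
        GROUP BY order_date
        ORDER BY order_date
        """
    )
    for order_date, order_count in cur.fetchall():
        print(order_date, order_count)
finally:
    conn.close()
```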
|
Job Description:
Candidate submission format (ALL CAPS, NO SPACES BETWEEN UNDERSCORES): PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413
Bill Rate: $60.00 - $70.00 Hourly
MSP Owner: Shilpa Bajpai
Location: Columbus, OH
Duration: 6 months
GBaMS ReqID: 10226237
Exp: 6-8 Years
Skills: Python, PySpark
We are seeking a highly skilled and motivated Data Engineer to join our innovative team. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data
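To give a concrete feel for the Python/PySpark pipeline work this role describes, below is a minimal batch-style sketch; the file paths and column names are hypothetical.

```python
# Minimal PySpark batch sketch: read raw CSV, apply simple cleanup,
# and write partitioned Parquet. Paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .csv("s3://example-bucket/raw/orders/")   # hypothetical input path
)

cleaned = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
)

(
    cleaned
    .withColumn("order_date", F.to_date("order_ts"))
    .write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/")  # hypothetical output path
)
```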
|
Job Details
ETL Data Engineer
Tech Stack & Core Skills:Advanced SQL (Big Data focused)Data Visualization (preferably Tableau)AWS (EMR, Lambda, S3, Step Functions, Athena )ETL Data EngineeringData AnalysisHeavyAI
Soft Skills:Effective CommunicationAnalysis PresentationSelf-starterCuriousGrowth MindsetCollaboration
Additional Preferences:PythonpySparkData ModelingStatistical Analysis
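As a rough sketch of the AWS portion of this stack (Athena over S3 data), the snippet below submits a query with boto3 and polls for completion; the database, table, and S3 locations are placeholders, not details from the posting.

```python
# Sketch: run an Athena query with boto3 and wait for it to finish.
# Database, table, and S3 locations are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="SELECT event_type, COUNT(*) AS n FROM events GROUP BY event_type",
    QueryExecutionContext={"Database": "analytics_db"},   # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```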
We are an equal opportunity employer. All aspects of employment including the decision to hire, promo
|
Need a senior profile.
We are seeking a highly skilled Data Engineer with a strong background in SQL optimization and database performance engineering. This role will focus on upgrading, tuning, and enhancing the reliability, scalability, and efficiency of our SQL-based data infrastructure. You will partner with engineering, analytics, and operations teams to ensure that our data pipelines, queries, and reporting systems run efficiently at scale with minimal downtime.
Key Responsibilities
A
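The posting is cut off before its responsibilities list, but as a rough illustration of the SQL-tuning work described above, the sketch below inspects a slow query's plan and adds a supporting index. It assumes a PostgreSQL database reached through psycopg2; the connection string, table, and column names are placeholders.

```python
# Sketch: inspect a slow query's plan and add a supporting index (PostgreSQL).
# Connection string, table, and column names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=analytics user=etl_user password=secret host=localhost")
conn.autocommit = True
cur = conn.cursor()

slow_query = """
    SELECT customer_id, SUM(amount)
    FROM orders
    WHERE order_date >= %s
    GROUP BY customer_id
"""

# Look at the current execution plan to confirm where time is spent.
cur.execute("EXPLAIN ANALYZE " + slow_query, ("2024-01-01",))
for (line,) in cur.fetchall():
    print(line)

# If the plan shows a sequential scan on order_date, a narrow index often helps.
cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_order_date ON orders (order_date)")

# Re-check the plan after the index exists.
cur.execute("EXPLAIN ANALYZE " + slow_query, ("2024-01-01",))
for (line,) in cur.fetchall():
    print(line)

cur.close()
conn.close()
```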
|
Role: Big Data Engineer
Bill Rate: $75/hour C2C
Location: Phoenix, AZ
Duration: 12+ months / long-term
Interview Criteria: Telephonic + Zoom
Direct Client Requirement
Job Title: Big Data Engineer
Job Description: We are looking for an experienced Big Data Engineer skilled in Google Cloud Platform (GCP) and BigQuery to design, develop, and optimize large-scale data pipelines and analytics solutions. The role involves working closely with data scientists, architects, and business
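As a small illustration of the GCP/BigQuery side of this role, the sketch below runs a query through the google-cloud-bigquery client; the project, dataset, and table names are placeholders, not details from the posting.

```python
# Sketch: run a BigQuery query with the google-cloud-bigquery client.
# Project, dataset, and table names are placeholders; credentials come from
# the environment (e.g. GOOGLE_APPLICATION_CREDENTIALS).
from google.cloud import bigquery

client = bigquery.Client(project="example-project")   # hypothetical project id

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.events)
```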
|
Job Description: We are seeking a highly skilled Snowflake Data Engineer with expertise in Matillion ETL to join our client's dynamic data engineering team. The ideal candidate will play a crucial role in designing, building, and maintaining scalable and high-performance data pipelines within the Snowflake ecosystem using Matillion. This role requires expertise in data modeling, data transformation, and cloud technologies while focusing on ensuring quality, performance, and accuracy for data-drive
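Matillion itself is a low-code ETL tool, so its jobs are not written by hand, but the kind of Snowflake transformation such a pipeline typically orchestrates looks roughly like the incremental MERGE below, issued here through snowflake-connector-python; all table and connection names are placeholders.

```python
# Sketch: an incremental upsert (MERGE) in Snowflake, the kind of transformation
# step a Matillion job would typically orchestrate. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="CURATED",
)

merge_sql = """
    MERGE INTO dim_customer AS tgt
    USING stg_customer AS src
        ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN UPDATE SET
        tgt.email      = src.email,
        tgt.updated_at = src.updated_at
    WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
        VALUES (src.customer_id, src.email, src.updated_at)
"""

cur = conn.cursor()
cur.execute(merge_sql)
print(cur.fetchall())   # rows inserted / updated counts
cur.close()
conn.close()
```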
|
Responsibilities:
Lead the design, development, and implementation of data solutions using AWS and Snowflake.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Develop and maintain data pipelines, ensuring data quality, integrity, and security.
Optimize data storage and retrieval processes to support data warehousing and analytics.
Provide technical leadership and mentorship to junior data engineers.
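To make the AWS-to-Snowflake pipeline work in the responsibilities above concrete, here is a rough sketch that loads staged S3 files into a Snowflake table with COPY INTO, again using snowflake-connector-python; the stage, table, and connection details are placeholders.

```python
# Sketch: load staged S3 files into Snowflake with COPY INTO.
# Stage, table, and connection details are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# Assumes an external stage (e.g. created over an S3 prefix) already exists.
cur.execute("""
    COPY INTO raw_orders
    FROM @orders_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT'
""")
print(cur.fetchall())   # per-file load results

cur.close()
conn.close()
```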
|
Role: Senior Data Engineer
Bill Rate: $80/hour C2C
Location: Scottsdale, AZ
Duration: 12+ months / long-term
Interview Criteria: Telephonic + Zoom
Direct Client Requirement
Hold a bachelor's degree in Computer Science, Information Systems, or a related field.
Bring 10+ years of experience in data engineering or closely related roles.
Extensive experience building and maintaining ETL/ELT pipelines for both batch and streaming data.
Advanced SQL skills (complex joins, window functions, CTEs)
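As a quick illustration of the "advanced SQL" expectation above (CTEs and window functions), the sketch below runs one such query against an in-memory SQLite database; the table and sample data are made up for the example.

```python
# Sketch: a CTE plus a window function, run against an in-memory SQLite DB.
# Requires SQLite 3.25+ for window functions; table and data are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('c1', '2024-01-05', 120.0),
        ('c1', '2024-02-10',  80.0),
        ('c2', '2024-01-20', 200.0),
        ('c2', '2024-03-02',  50.0);
""")

query = """
WITH monthly AS (
    SELECT customer_id,
           substr(order_date, 1, 7) AS month,
           SUM(amount)              AS monthly_total
    FROM orders
    GROUP BY customer_id, month
)
SELECT customer_id,
       month,
       monthly_total,
       SUM(monthly_total) OVER (
           PARTITION BY customer_id ORDER BY month
       ) AS running_total
FROM monthly
ORDER BY customer_id, month;
"""

for row in conn.execute(query):
    print(row)
```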
|
JOB DESCRIPTION:
·Bug Fixes in the Databricks environment
·Ability to monitor, transform, and optimize ETL pipelines in Databricks; knowledge of Data Lakehouse architecture and PySpark (at least mid-level)
·Experience in complex data migration is a plus
·Ensure data accessibility and integrity for the migrated objects
·Collaborate effectively with cross-functional teams.
·Communicate progress and challenges clearly to stakeholders.
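As a rough sketch of the Databricks/PySpark pipeline tuning described above, the snippet below reads a Delta table, inspects the physical plan while optimizing, and writes the result back as Delta. It assumes a Databricks (or otherwise Delta-enabled) Spark runtime; the table paths and column names are placeholders.

```python
# Sketch: a small Databricks-style PySpark job that reads a Delta table,
# inspects the physical plan, and writes an aggregated result back as Delta.
# Assumes a Delta-enabled Spark runtime; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lakehouse_etl").getOrCreate()

events = spark.read.format("delta").load("/mnt/lake/raw/events")   # hypothetical path

daily = (
    events
    .filter(F.col("event_type").isNotNull())
    .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Inspect the physical plan while tuning (e.g. to confirm partition pruning).
daily.explain()

(
    daily
    .repartition("event_date")               # keep files aligned with the partition column
    .write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("/mnt/lake/curated/daily_event_counts")   # hypothetical output path
)
```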
|