Remote - Data Engineer, 3+ years needed (SQL, Python, Scala, Hadoop and Hive/HQL)
Remote, Work from Home
Date: Feb-12-21
Work Authorization: US Citizen, GC, H1B, L2 EAD, H4 EAD
Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire

Job Details
Experience: Midlevel
Rate/Salary ($): DOE
Duration: 12 months
Sp. Area: BigData, NoSQL
Sp. Skills: x-Other
Consulting / Contract, CONTRACTOR, Remote Work from Home
Required Skills: Data Engineer, Hadoop, Python, Scala, SQL
Preferred Skills:
Domain: IT/Software
3A SOFT INC
Piscataway, NJ
Job Description:
Title: Sr. Data Engineer
Location: REMOTE
Duration: 1+ Year
15 positions
Main focus is going to be around SQL, Python, Scala, Hadoop and Hive/HQL.
Contact: Venkatesh
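Not part of the posting, but as a rough illustration of the Hive/HQL-over-Spark work the posting names, here is a minimal PySpark sketch that reads a Hive table with an HQL-style aggregation. The database and table names (sales_db.orders) and the session setup are hypothetical and assume a cluster that exposes a Hive metastore.

```python
# Minimal PySpark + Hive sketch (illustrative only; database/table names are hypothetical).
from pyspark.sql import SparkSession

# A Hive-enabled Spark session; assumes the cluster provides a Hive metastore.
spark = (
    SparkSession.builder
    .appName("hive-hql-example")
    .enableHiveSupport()
    .getOrCreate()
)

# Run an HQL-style aggregation against a (hypothetical) Hive table.
daily_totals = spark.sql("""
    SELECT order_date, SUM(amount) AS total_amount
    FROM sales_db.orders
    GROUP BY order_date
""")

daily_totals.show(10)
spark.stop()
```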
Similar Jobs you may be interested in...
Big Data Engineer $75/hr Srinivasa Kandi, Phoenix, AZ
Oct-16-25
VALIANT TECHNOLOGIES LLC
Rate ($): $75
Role: Big Data Engineer. Bill Rate: $75/hour C2C. Location: Phoenix, AZ. Duration: 12+ months / long-term. Interview Criteria: Telephonic + Zoom. Direct Client Requirement.
Job Title: Big Data Engineer
Job Description: We are looking for an experienced Big Data Engineer skilled in Google Cloud Platform (GCP) and BigQuery to design, develop, and optimize large-scale data pipelines and analytics solutions. The role involves working closely with data scientists, architects, and business
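For orientation only: a minimal sketch of the kind of BigQuery query work this listing describes, using the google-cloud-bigquery Python client. The project, dataset, and table names are invented, and credentials are assumed to come from application-default credentials in the environment.

```python
# Minimal BigQuery client sketch (illustrative; project/dataset/table names are hypothetical).
from google.cloud import bigquery

# Assumes application-default credentials are configured in the environment.
client = bigquery.Client()

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date
"""

# Submit the query job and iterate over the result rows.
for row in client.query(query).result():
    print(row.event_date, row.events)
```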
Senior Big Data Engineer $77/hr Srinivas, Atlanta, GA
Oct-27-25
VALIANT TECHNOLOGIES LLC
Rate ($): $77
Role: Senior Big Data Engineer. Bill Rate: $77/hour C2C. Location: Atlanta, GA. Duration: 12+ months / long-term. Interview Criteria: Telephonic + Zoom. Direct Client Requirement.
Summary: The team supports a portal and product landing page, which supports all marketing functions of the website. Currently, they are operating under one monolithic application and want to decouple it into loosely coupled microservices. The individual they are looking for will be focused on the data side of the project and make
Lead Data Engineer, Jersey City, NJ
Oct-10-25
Robotics technology LLC
Rate ($): Market
Responsibilities: Lead the design, development, and implementation of data solutions using AWS and Snowflake. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Develop and maintain data pipelines, ensuring data quality, integrity, and security. Optimize data storage and retrieval processes to support data warehousing and analytics. Provide technical leadership and mentorship to junior data engineer
Remote - BigData Engineer, Remote, Work from Home
Oct-17-25
Robotics technology LLC
Rate ($): Market
Job Description:
- Build a highly functional and efficient Big Data platform that brings together data from disparate sources and allows FinThrive to design and run complex algorithms providing insights to Healthcare business operations.
- Build ETL Data Pipelines in AWS Cloud using AWS ADF and Databricks using PySpark and Scala.
- Migrate ETL Data pipelines from On Prem Hadoop Cluster to AWS Cloud.
- Build Data Ingestion Pipelines in AWS to pull data from SQL Server.
- Perform Automated an
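As a loose sketch of the SQL Server ingestion step described in the listing above (not the employer's actual pipeline): a PySpark JDBC read followed by a Parquet write to cloud object storage. The connection string, credentials, table name, and output path are placeholders, and the SQL Server JDBC driver plus the S3A connector are assumed to be on the cluster.

```python
# PySpark JDBC ingestion sketch (illustrative; connection details and paths are placeholders).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver-ingest").getOrCreate()

# Read a table from SQL Server over JDBC; assumes the SQL Server JDBC driver is on the classpath.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://db-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Land the extracted data in cloud object storage as Parquet for downstream processing.
orders.write.mode("overwrite").parquet("s3a://my-data-lake/raw/orders/")

spark.stop()
```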
Databricks Data Engineer, Dallas, TX
Oct-06-25
Robotics technology LLC
Rate ($): Market
Job Description:
- Bug fixes in the Databricks environment
- Ability to monitor, transform, and optimize ETL pipelines for Databricks; knowledge of Data Lakehouse Architecture and of PySpark (at least mid-level)
- Experience in complex data migration and familiarity with the knowledge is a plus
- Ensure data accessibility and integrity for the migrated objects
- Collaborate effectively with cross-functional teams
- Communicate progress and challenges clearly to stakeholders
Remote - Data Engineer, Remote, Work from Home
Oct-12-25
Robotics technology LLC
Rate ($): Market
Need senior profile. We are seeking a highly skilled Data Engineer with a strong background in SQL optimization and database performance engineering. This role will focus on upgrading, tuning, and enhancing the reliability, scalability, and efficiency of our SQL-based data infrastructure. You will partner with engineering, analytics, and operations teams to ensure that our data pipelines, queries, and reporting systems run efficiently at scale with minimal downtime. Key Responsibilities A
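The SQL-tuning focus in the listing above can be illustrated with a tiny, self-contained example that is not from the listing itself: using Python's built-in sqlite3 module to compare a query plan before and after adding an index. The table, column, and index names are made up.

```python
# Self-contained SQL tuning sketch using the standard-library sqlite3 module.
# Table, column, and index names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Plan before indexing: a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# Add an index on the filter column, then check the plan again: an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

conn.close()
```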
Sr ETL Architect (Snowflake & AWS) $88/hr, Windsor, CT
Oct-05-25
VALIANT TECHNOLOGIES LLC
Rate ($): $88
Role: Sr ETL Architect (Snowflake & AWS). Bill Rate: $88/hour C2C. Location: Windsor, CT. Duration: 12+ months / long-term. Interview Criteria: Telephonic + Zoom. Direct Client Requirement.
Job Description: We are seeking an experienced ETL Architect with strong expertise in Snowflake and the AWS Cloud ecosystem to design, develop, and optimize data integration solutions. The ideal candidate will be responsible for architecting scalable ETL pipelines, enabling efficient data movement, transformation, and
Developer - L4, New York, NY
Oct-30-25
Sage IT INC
Rate ($): $60k - $130k/year
Job Details
Role: Azure Databricks Developer. Location: Louisville, KY (Day 1 Onsite).
Job Description: The Senior Data Engineer will be responsible for the build of the Enterprise Data platform: setting up data pipelines that are scalable, robust, and resilient, and building pipelines to validate, ingest, normalize/enrich, and run business-specific processing of healthcare data. Build an Azure Data Lake leveraging Databricks technology to consolidate all data across the company and serve the data
Google Cloud Data Migration Lead, Dallas, TX
Oct-18-25
Robotics technology LLC
Rate ($): $70
Job Details
Position: Google Cloud Data Migration Team Lead. Duration: Long-term contract.
Job Summary: Seeking a Google Cloud data engineer to design, build, and maintain scalable and efficient data processing systems on the Google Cloud platform. This engineer will be responsible for the entire data lifecycle, from ingestion and storage to processing, transformation, and analysis. Their work will enable client organizations to make data-driven decisions by providing cl
Python Engineer, New York, NY
Oct-30-25
Sage IT INC
Rate ($): USD 75
Job Description: ALL CAPS, NO SPACES B/T UNDERSCORES. Bill Rate: $75 MAX. PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413. MSP Owner: Bader Almubarak. Location: Plano, TX. Duration: 6 months. GBaMS ReqID: 10251268.
Data Engineer
- Design, build, and monitor ETL/ELT pipelines for scalable data processing.
- Automate data ingestion using APIs and scripting tools.
- Implement automated health checks and pipeline monitoring systems.
- Work with tec
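A very small sketch of the ingestion-plus-health-check pattern the listing above mentions, using only the Python standard library. It is purely illustrative; the API URL, the row-count threshold, and the function names are invented, not taken from the listing.

```python
# Minimal API ingestion + health-check sketch (illustrative; URL and threshold are invented).
import json
import logging
import urllib.request

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

API_URL = "https://example.com/api/records"   # hypothetical source endpoint
MIN_EXPECTED_ROWS = 1                         # hypothetical health-check threshold


def ingest(url: str) -> list[dict]:
    """Fetch a JSON array of records from the source API."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)


def health_check(rows: list[dict]) -> None:
    """Fail loudly if the ingested batch looks empty or malformed."""
    if len(rows) < MIN_EXPECTED_ROWS:
        raise RuntimeError(f"health check failed: only {len(rows)} rows ingested")
    log.info("health check passed: %d rows", len(rows))


if __name__ == "__main__":
    batch = ingest(API_URL)
    health_check(batch)
```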