Remote - Data Engineer - SSIS
Remote, Work from Home
Date: Sep-28-20
Work Authorization: US Citizen, GC, H1B, GC EAD
Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire
Job Details
Experience: Senior
Rate/Salary ($): Market
Duration: 12 Months
Sp. Area: Database Developers
Sp. Skills: x-Other
Consulting / Contract | Contractor | Certification Preferred | Direct Client Requirement | Remote Work from Home
Required Skills: Big Data, Data Engineer, Machine Learning, SQL, SSIS, Access, Agile, ETL, Hadoop, HBase, Java, Kafka, Oracle, Python, Salesforce.com, Scala, Security
Preferred Skills:
Domain: IT/Software, HealthCare
youngsoft
Wixom, MI
Job Description:
You can apply directly here:
https://youngsoft.secure.force.com/apex/NrichForm?templateId=a1F3n000003Kaqz&jobid=a0t3n00000CNWdP&sourcetype=TechFetch
Additional key pieces:
a. Experience with Epic Healthcare Data
b. ETL experience using SQL Server / SSIS
This role focuses on capitalizing on the vast amount of patient data collected to make the patient medical journey a personalized experience. We are seeking data engineers to design and scale databases to support a robust analytical pipeline. The job will promote collaboration between data scientists, data architects, business analysts, and clinicians to support user access to data and data infrastructure. The data engineer will play a role in integrating advanced machine learning models into production with continuous integration. The role requires a general understanding of the healthcare system to support data integration across multiple data sources.
Scope of responsibilities
Collaborate with data scientists and business users to build the frameworks required to integrate data pipelines and machine learning models with operations
Maintain database structure and standardize definitions for business users across the company
Clean and verify the quality of data prior to feature engineering and advanced analytical modeling
Build unit tests for continuous integration (see the sketch after this list)
Work with data architects to build the foundational Extract/Load/Transform processes
Support business users and clinicians in identifying the correct data sets and providing easy-to-use tools to pull data
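
To make the pairing of ETL work and continuous integration concrete, here is a minimal pandas-based sketch; the column names and cleaning rules are invented for illustration and are not from the posting.

# Minimal sketch: a small, testable ETL transform step.
# patient_id and visit_date are hypothetical columns, not from the employer.
import pandas as pd

def clean_visits(raw: pd.DataFrame) -> pd.DataFrame:
    """Drop rows missing a patient id and keep only parseable visit dates."""
    out = raw.dropna(subset=["patient_id"]).copy()
    out["visit_date"] = pd.to_datetime(out["visit_date"], errors="coerce")
    return out[out["visit_date"].notna()]

# A unit test that a CI pipeline (e.g., pytest on every commit) could run.
def test_clean_visits_drops_bad_rows():
    raw = pd.DataFrame({
        "patient_id": [1, None, 3],
        "visit_date": ["2020-09-01", "2020-09-02", "not-a-date"],
    })
    cleaned = clean_visits(raw)
    assert len(cleaned) == 1
    assert cleaned.iloc[0]["patient_id"] == 1

Running such tests on every commit is one common way teams satisfy the "unit tests for continuous integration" expectation before a pipeline ships.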
Required skills and competencies
Ability to write production-level code in one of the following languages: Python, Hive, Pig, Shell Scripting, SQL, Java or Scala
Ability to structure databases in one of the following platforms: Hadoop, Spark, Oracle/Teradata
Proficiency leveraging the following Big Data technologies to support downstream advanced analytical modeling: MapReduce, Spark, Airflow/Oozie, Kafka, HBase, Pig, NoSQL databases
Familiarity with data architecture, modeling, and security
Qualifications-required
3+ years structuring databases and working with big data
5+ years writing code in relevant languages
Qualifications-preferred
Experience working in an agile, sprint-based approach
1+ year working in healthcare or a related field
Familiarity with cloud computing clusters
Education
Bachelor's Degree in computer science, mathematics, statistics, or a related field with 2+ years of industry experience
Similar Jobs you may be interested in...
Sr ETL Architect (Snowflake & AWS) $88/hr
Windsor, CT
Oct-05-25
VALIANT TECHNOLOGIES LLC
Rate ($): $88
Role: Sr ETL Architect (Snowflake & AWS). Bill Rate: $88/hour C2C. Location: Windsor, CT. Duration: 12+ months/long-term. Interview Criteria: Telephonic + Zoom. Direct Client Requirement. Job Description: We are seeking an experienced ETL Architect with strong expertise in Snowflake and the AWS Cloud ecosystem to design, develop, and optimize data integration solutions. The ideal candidate will be responsible for architecting scalable ETL pipelines, enabling efficient data movement, transformation, and ...
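
As a rough illustration of the Snowflake side of a role like this (the account, credentials, and table names below are invented; this is not the client's code), a pushdown transformation via the snowflake-connector-python package might look like:

# Hypothetical sketch: run a transformation inside Snowflake from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
)
try:
    cur = conn.cursor()
    # Push the transform down to Snowflake rather than moving rows out.
    cur.execute(
        "CREATE OR REPLACE TABLE clean_orders AS "
        "SELECT order_id, amount FROM raw_orders WHERE amount IS NOT NULL"
    )
finally:
    conn.close()

Keeping transformations in-warehouse like this is a common design choice for scalable ETL on Snowflake.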
Big Data Engineer $75/hr Srinivasa Kandi
Phoenix, AZ
Oct-16-25
VALIANT TECHNOLOGIES LLC
Rate ($): $75
Role: Big Data Engineer. Bill Rate: $75/hour C2C. Location: Phoenix, AZ. Duration: 12+ months/long-term. Interview Criteria: Telephonic + Zoom. Direct Client Requirement. Job Title: Big Data Engineer. Job Description: We are looking for an experienced Big Data Engineer skilled in Google Cloud Platform (GCP) and BigQuery to design, develop, and optimize large-scale data pipelines and analytics solutions. The role involves working closely with data scientists, architects, and business ...
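
For the BigQuery piece, a minimal sketch using the official google-cloud-bigquery client (the project, dataset, and table names are placeholders) could be:

# Hypothetical sketch: a batch aggregation query against BigQuery.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials
query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my_project.analytics.events`  -- placeholder table
    GROUP BY user_id
"""
for row in client.query(query).result():
    print(row.user_id, row.events)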
Software Engineer (Java)
San Jose, CA
Oct-13-25
Robotics technology LLC
Rate ($): $80
Key Responsibilities: Develop and maintain a robust and scalable big data insights platform to drive AI-assisted process automation and deliver actionable insights. Develop solutions that combine state-of-the-art Generative AI techniques with rich data to deliver best-in-class automation and user experience. Design and implement batch and near real-time data pipelines using Spark, Flink, and BigQuery. Design and implement efficient data models in relational and NoSQL databases like MyS...
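
As a minimal sketch of the batch side of such pipelines (the S3 paths and the event_ts column are invented; the real stack may differ), a PySpark job could look like:

# Hypothetical sketch: a small PySpark batch aggregation pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-insights").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")  # placeholder path
daily = (
    events
    .groupBy(F.to_date("event_ts").alias("day"))  # event_ts is hypothetical
    .agg(F.count("*").alias("event_count"))
)
daily.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")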
Remote - Healthcare Data Engineer
Remote, Work from Home
Sep-29-25
Robotics technology LLC
Rate ($): Market
Job Responsibilities: Teradata knowledge, SQL, DWH/ETL processes, data management, and shell scripting. Collaborate with cross-functional teams to gather and analyze data requirements. Design, develop, and optimize complex SQL queries and scripts for data extraction and reporting. Work extensively with Teradata for querying and performance tuning. Analyze large datasets in Big Data environments (e.g., Hadoop, Hive, Spark). Design and support ETL workflows, data pipelines, and processes ...
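
To ground the Teradata/SQL part, here is a hedged sketch of a windowed extraction query using the official teradatasql driver; the connection settings and the claims table are placeholders, not from the ad:

# Hypothetical sketch: latest record per patient via Teradata's QUALIFY.
import teradatasql

SQL = """
SELECT patient_id, claim_date, amount,
       ROW_NUMBER() OVER (PARTITION BY patient_id
                          ORDER BY claim_date DESC) AS rn
FROM claims_db.claims  -- placeholder table
QUALIFY rn = 1
"""

con = teradatasql.connect(host="td-host", user="analyst", password="***")
try:
    cur = con.cursor()
    cur.execute(SQL)
    rows = cur.fetchall()
finally:
    con.close()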
Remote - BigData Engineer
Remote, Work from Home
Oct-17-25
Robotics technology LLC
Rate ($): Market
Job Description: Build a highly functional and efficient Big Data platform that brings together data from disparate sources and allows FinThrive to design and run complex algorithms providing insights to Healthcare business operations. Build ETL data pipelines in AWS Cloud using AWS ADF and Databricks using PySpark and Scala. Migrate ETL data pipelines from an on-prem Hadoop cluster to AWS Cloud. Build data ingestion pipelines in AWS to pull data from SQL Server. Perform automated an...
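
As a hedged sketch of the SQL Server ingestion step this describes (the JDBC URL, table, credentials, and output path are all invented, and the Delta output assumes a Databricks-style runtime):

# Hypothetical sketch: PySpark JDBC read from SQL Server, write to Delta.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver-ingest").getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example-host:1433;databaseName=claims")
    .option("dbtable", "dbo.encounters")  # placeholder table
    .option("user", "etl_user")
    .option("password", "***")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
df.write.format("delta").mode("append").save("/mnt/bronze/encounters")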
SQL Oracle Azure Developer
New York, NY
Oct-23-25
Sage IT INC
Rate ($): USD 80
Job Description: ALL CAPS, NO SPACES B/T UNDERSCORES: PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413. MSP Owner: Amy Johnson. Location: ~MOUNT LAUREL~. Duration: 6 months. GBaMS ReqID: 10286686. Azure Services: Expertise in key Azure services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Stream Analytics. Programming Languages: Proficiency in languages such as Python, Java, or Scala. Database Systems: SQL - Strong understanding o...
Developer
New York, NY
Oct-23-25
Sage IT INC
Rate ($): USD 70
Job Description: ALL CAPS, NO SPACES B/T UNDERSCORES: PTN_US_GBAMSREQID_CandidateBeelineID, i.e. PTN_US_9999999_SKIPJOHNSON0413. Bill Rate: $65-70/hr. MSP Owner: Kelly Gosciminski. Location: New York, NY (hybrid onsite). Duration: 6 months. GBaMS ReqID: 10272565. Experience: 10 years of experience in data engineering or a related role. Python Proficiency: Strong proficiency in Python programming, including experience with data manipulation libraries such as Pandas and NumPy. Data ...
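
A minimal sketch of the Pandas/NumPy data manipulation this calls for (the trades data, column names, and threshold are invented):

# Hypothetical sketch: impute a missing value and bucket rows with NumPy.
import numpy as np
import pandas as pd

df = pd.DataFrame({"trade_id": [1, 2, 3], "notional": [1e6, np.nan, 2.5e6]})
df["notional"] = df["notional"].fillna(df["notional"].median())
df["bucket"] = np.where(df["notional"] >= 2e6, "large", "small")
print(df.groupby("bucket")["notional"].agg(["count", "sum"]))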
Python Developer with MongoDB $90/hr Sri
Irving, TX
Oct-04-25
VALIANT TECHNOLOGIES LLC
Rate ($): $90
Role: Python Developer with MongoDB. Bill Rate: $90/hour C2C. Location: Irving, TX. Duration: 12+ months/long-term. Interview Criteria: Telephonic + Zoom. Direct Client Requirement. Responsibilities: Develop, enhance, and maintain Python-based banking applications with a focus on clean, efficient, and reliable code. Design and implement algorithms, data structures, and solutions to optimize application performance. Build and consume RESTful APIs to facilitate seamless data exchange and integrati...
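
A small sketch tying together two of the asks here, consuming a REST API and persisting to MongoDB; the API URL, Mongo URI, and field names are placeholders:

# Hypothetical sketch: fetch a record over REST and upsert it into MongoDB.
import requests
from pymongo import MongoClient

resp = requests.get("https://api.example.com/accounts/42")  # placeholder API
resp.raise_for_status()
account = resp.json()

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
client.bank.accounts.update_one(
    {"account_id": account["account_id"]},  # hypothetical key field
    {"$set": account},
    upsert=True,
)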
Remote - Python Core Developer $85/hr Sr
Remote, Work from Home
Sep-28-25
VALIANT TECHNOLOGIES LLC
Rate ($): $85
Role: Python Core Developer. Bill Rate: $85/hour C2C. Location: Remote. Duration: 12+ months/long-term. Interview Criteria: Telephonic + Skype. Direct Client Requirement. Responsibilities & Qualifications: The Software Developer works in a small team of software developers and automation architects to build and deliver integration, automation, and orchestration solutions. Internally, the person works with a virtual team of developers and testers; externally, the person may work wit...
Google Cloud Data Migration Lead
Dallas, TX
Oct-18-25
Robotics technology LLC
Rate ($): $70
Job Details: Position: Google Cloud Data Migration Team Lead. Duration: Long-term contract. Job Summary: Seeking a Google Cloud data engineer to design, build, and maintain scalable and efficient data processing systems on the Google Cloud platform. This engineer will be responsible for the entire data lifecycle, from ingestion and storage to processing, transformation, and analysis. Their work will enable client organizations to make data-driven decisions by providing cl...