BigData Developer
Chicago, IL 60701
Posted: Jan-05-2019
Work Authorization: US Citizen, GC, H1B, L2 EAD, H4 EAD
Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire
Job Details
Experience: Senior
Rate/Salary ($): DOE
Duration: 6 Months
Sp. Area: BigData, NoSQL
Sp. Skills: Hadoop
Job Type: Consulting / Contract (CONTRACTOR)
Required Skills: BigData, "Mark Logic", Agile
Preferred Skills:
Domain: HealthCare, Insurance
Vedainfo Inc
Torrance, CA
Job Description:
Experience in Agile methodologies
Knowledge of the Healthcare and Insurance industries
Must have knowledge of Mark Logic and BigData technology
Anticipate, understand, and respond to the needs of internal and external clients

Responsibilities:
Work cooperatively and effectively with the offshore team and the client
Work in an Agile framework
Similar Jobs you may be interested in:
Python Full Stack Developer (Locals to NY)
Chicago, IL
Apr-18-24
KE Staffing
Rate ($): open
6-8 years of Python development. Must be 10/10 in coding. Prior retail banking experience required; prior Capital One experience highly preferred. Duties: Working on IVR (telephony system for bank app customers; manual options for balance inquiry etc., with transfer to an agent). Full IVR plus call routing to agents. All applications and experience are managed by this team. App maintenance, additional workforce. Support full stack on AWS with a serverless-based architecture and lambdas; Python is the main language MU
Data Modeler
Pleasant Prairie, WI
Apr-26-24
Everest Consulting Group
Rate ($): Market
Analytics Data Modeler. Onsite 2 days a week / 3 days remote @ Pleasant Prairie, Wisconsin. Contract: 6 to 12 months + possibility of extension. Hands-on experience with ERWIN, SQL, and dimensional data modeling. Job Description Role: Analytics Data Modeler. Employment Type: Fulltime/Contract. Contract 6 to 12 months + possibility of extension. Work location: Remote/Onsite (mention no. of onsite days). Onsite 2 days a week / 3 days remote @ Pleasant
Enterprise Architect (.NET, Azure)
Chicago, IL
Apr-20-24
Maidentechnologies.com
Rate ($): Market
Role: Enterprise Architect (.NET, Azure, Cloud DevOps) - Healthcare or Pharma domain required. Location: Chicago, IL (Onsite). Contract. Job Function: Own the product and work with the technical & product leadership to convert ideas into a great product. Collaborate across Agile teams and operations. Work closely with stakeholders. Plan and deliver per customer timelines; set/manage expectations. Participate in the software development life cycle (e.g., gathering requirements, analy
Data Management Developer
Chicago, IL
Apr-10-24
tanishasystems
Rate ($): Based on experience
Role: Data Management Developer. Location: Chicago, IL (Fulltime only). JOB SUMMARY: The Data Management Specialist will be responsible for managing and optimizing our organization's data infrastructure, ensuring data quality, and implementing data solutions using PySpark, DataBricks, Snowflake, and/or Redshift. The ideal candidate will have a strong background in data management, programming, and cloud-based data platforms. Key Responsibilities: 1. Develop and maintain data pipelines using PySpark to