Hadoop developer
Deerfield, IL 60015
Date: Sep-19-17
Work Authorization: US Citizen, GC, H1B, GC EAD

Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire

Job Details
Experience: Architect, Senior, Midlevel
Rate/Salary ($): Market
Duration:
Sp. Area: BigData, NoSQL
Sp. Skills: Hadoop
Permanent Direct Hire | H1B Sponsorship Available | FULL_TIME | Direct Client Requirement
Required Skills: Hortonworks, Azure HDInsight Platform, Cloudera, Hive, HBase, Spark, Kafka, Oozie, Spark Streaming, Spark Batch
Preferred Skills: Hortonworks, Cloudera, Hive, HBase, Spark, Kafka, Oozie, Spark Streaming, Spark Batch, Java
Domain:
Eminence Technology Solutions LLC
Somerset, NJ
Job Description:
Position: Big Data Developer
Job Location: Deerfield, IL
Project Duration: Long-term
Technology Partner Client: Global Logic / Walgreen
Employment Type: Full Time

Domain | Technical Skills | Skills Level | Years of Experience
Hadoop development | Hortonworks, Cloudera, or Azure HDInsight Platform | Preferred: Hortonworks / Azure HDInsight | 3+ years full implementation experience
Hadoop Stack | Hive, HBase, Spark, Kafka, Oozie | Must have all of these skills | 3+ years of experience
Spark development | Spark Streaming & Spark Batch | Must have Spark Batch; nice to have Spark Streaming | 2+ years of experience
Languages | Java | | 3+ years

Client: Global Logic LLC, Walgreen
Similar Jobs you may be interested in:
Enterprise Architect (.NET, Azure)
Chicago, IL
Apr-20-24
Maidentechnologies.com
Rate ($): Market
Role: Enterprise Architect (.NET, Azure, Cloud DevOps) - Healthcare or Pharma domain required. Location: Chicago, IL (Onsite). Contract. Job Function: Own the product and work with the technical & product leadership to convert ideas into a great product. Collaborate across Agile teams and operations. Work closely with stakeholders. Plan and deliver per customer timelines; set and manage expectations. Participate in the software development life cycle (e.g., gathering requirements, analy…
Snowflake Data Engineer
Chicago, IL
Mar-28-24
Technogen, Inc
Rate ($): USD 70 / Hourly / C2
Please look for EST and CST time zone profiles. 253160. Need immediate support on the below role ASAP. Location: Chicago, IL (Hybrid, 3 days a week). Let me know in case of any questions. Responsibilities: Design, implement, and maintain data pipelines on Snowflake, ensuring scalability, reliability, and performance. Develop and optimize data ingestion processes from various sources, including Azure Blob Storage, Azure Data Lake, databases, APIs, and streaming data sources. Imple…
Senior Cassandra Consultant
Pleasant Prairie, WI
Mar-28-24
Everest Consulting Group
Rate ($): Market
DataStax Database Admin / Senior Cassandra Consultant. Employment Type: Contract, 1 year + possibility of extension. Work location: Onsite 5 days a week @ Pleasant Prairie, Wisconsin. About the Role: Looking for a DataStax Database Administrator with strong DBA experience to handle complex multi-cluster NoSQL database environments. Strong experience with Apache Cassandra, strong data modelling, and strong command of Cassandra tools. Job Responsibilities: Install…
Remote - Oracle DBA
Deerfield, IL
Apr-03-24
Intone Networks
Rate ($): USD 65 / Hourly / C2
ROLE: Oracle DBA. LOCATION: Hybrid in Deerfield, IL (this role is temporarily remote; the selected candidate must report in a hybrid capacity when the client confirms a return-to-office date). Is there flexibility to increase the vendor billing rate? Yes. Job Description: Must be capable of handling larger databases, ~40 TB (multiple databases). Migration experience from Oracle 12c to 19c. SQL Server experience in designing, implementing, and administering database systems on multiple MS SQL…
AWS Java Developer
Chicago, IL
Mar-28-24
Technogen, Inc
Rate ($): USD 80 / Hourly / C2
Job Title: IT Software Engineer 3 - AWS Java Developer. Location: Peoria, IL or Chicago, IL (Hybrid; 2 days in the office). Duration: 12 months. Top Skills: Strong Java development, Spring, Spring Boot, AWS. Self-starter and effective communicator. Development, testing, deployment, and bug fixing on enhancements. Gathering requirements from the product owners. API development on the backend. Cucumber for writing test scripts. Functional testing; some sprints are in design, coding, working with operation…
Data Management Developer
Chicago, IL
Apr-10-24
tanishasystems
Rate ($): Based on experience
Role: Data Management Developer. Location: Chicago, IL (Fulltime only). JOB SUMMARY: The Data Management Specialist will be responsible for managing and optimizing our organization's data infrastructure, ensuring data quality, and implementing data solutions using PySpark, Databricks, Snowflake, and/or Redshift. The ideal candidate will have a strong background in data management, programming, and cloud-based data platforms. Key Responsibilities: 1. Develop and maintain data pipelines using PySpark to…