Hadoop Developer
Bloomington, IL 61701
Date: Aug-18-17
Work Authorization: US Citizen, GC, H1B, GC EAD
Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire

Job Details
Experience: Senior
Rate/Salary ($): 48 to 50/hr max
Duration: 6 months
Sp. Area: BigData, NoSQL
Sp. Skills: Hadoop
Job Type: Consulting / Contract (CONTRACTOR)
Required Skills: Hadoop, Avro, Cluster, Hive, JAVA, Scala
Preferred Skills:
Domain: IT/Software
Pace Computer Solutions Inc
Columbia, MD
Job Description:
Skills: Hadoop Developer
Location: Bloomington, IL
Rate: $48-50/hr max
No. of openings: 2
Selected analysts should have 5+ years of Java programming experience, with at least two years of experience building Hadoop data pipelines in large enterprises. They should have deep knowledge of Hadoop architecture and should have used most of the tools listed below to develop and test data ingestion into, and data extraction from, Hadoop.
For the EOC (Enterprise Operations Cluster) environment:
Hive - provides the capability to write and run SQL-like statements to query data
Spark - distributed computing framework
Scala - programming language used to write Spark jobs
Parquet - column-oriented storage format
Avro - row-oriented storage format
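The row-oriented vs. column-oriented distinction the posting draws between Avro and Parquet can be sketched with plain Python structures. This is only an illustration of the two layouts, not the actual file formats, and the sample records are invented for the example:

```python
# Illustrative only: the same records arranged row-wise (Avro-style)
# versus column-wise (Parquet-style).
records = [
    {"id": 1, "skill": "Hadoop", "years": 5},
    {"id": 2, "skill": "Hive",   "years": 3},
    {"id": 3, "skill": "Scala",  "years": 2},
]

# Row-oriented: each record's fields are stored together, which suits
# record-at-a-time writes, as in data ingestion.
row_layout = [(r["id"], r["skill"], r["years"]) for r in records]

# Column-oriented: all values of one field are stored together, which
# suits analytical scans that touch only a few columns.
col_layout = {
    "id":    [r["id"] for r in records],
    "skill": [r["skill"] for r in records],
    "years": [r["years"] for r in records],
}

# Scanning one column touches a single contiguous list instead of
# every record:
total_years = sum(col_layout["years"])
print(total_years)  # 10
```

In a real cluster this trade-off is why ingestion pipelines often land data in Avro while analytical tables are stored as Parquet.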
For Data Ingestion:
Flume - streaming service for importing data
Kafka - a distributed streaming platform with queue capability
Oozie or Cron - to schedule jobs
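The ingestion pattern Flume and Kafka support is producer/consumer queuing. As a minimal conceptual sketch, with Python's standard-library queue standing in for a real broker (the event names are invented for the example):

```python
import queue
import threading

events = queue.Queue()  # stands in for a Kafka topic / Flume channel

def producer():
    # Stands in for a source pushing events into the pipeline.
    for i in range(3):
        events.put(f"event-{i}")
    events.put(None)  # sentinel: no more data

consumed = []

def consumer():
    # Stands in for an ingestion job draining the queue into Hadoop.
    while True:
        item = events.get()
        if item is None:
            break
        consumed.append(item)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(consumed)  # ['event-0', 'event-1', 'event-2']
```

The queue decouples the source from the sink, which is the property that lets a scheduler (Oozie or Cron) run extraction jobs independently of ingestion.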
Prior experience at State Farm doing Hadoop work will be very beneficial.
Similar Jobs you may be interested in:

Data Engineer
Peoria, IL
Apr-05-24
Technogen, Inc
Rate ($): USD 75 / Hourly / C2
Job Title: Data Engineer. Location: Peoria, IL. Duration: long-term contract role. Experience: 10+ years. Mandatory Skills: AWS Glue, Apache Spark / PySpark, Java, Neo4j Graph DB.