Job Description:
Skills: Hadoop Developer

Location: Bloomington, IL

Rate: $48-50/hr max

# of openings: 2

Selected analysts should have 5+ years of Java programming experience, with
at least two years of experience building Hadoop data pipelines in large
enterprises. They should have deep knowledge of Hadoop architecture and
should have used most of the tools listed below to develop and test data
ingestion into and data extraction from Hadoop.

For EOC (Enterprise Operations Cluster) environment:

Hive - provides the capability to write and run SQL-like statements to query data

Spark - distributed computing framework
Scala - programming language used to write Spark jobs
Parquet - Column-oriented storage format
Avro - Row-oriented storage format
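
As a sketch of the Hive skill listed above, candidates would be expected to write SQL-like queries of this kind (the table and column names here are hypothetical, purely for illustration):

```sql
-- Hypothetical HiveQL example: aggregate rows from a table
-- stored in a columnar format such as Parquet
SELECT policy_id,
       COUNT(*) AS claim_count
FROM   claims
WHERE  claim_date >= '2017-01-01'
GROUP  BY policy_id;
```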

For Data Ingestion:

Flume - distributed streaming service for importing data into Hadoop
Kafka - a distributed streaming platform with message-queue capability
Oozie or Cron - to schedule jobs
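
For the scheduling item above, a minimal crontab sketch (the script and log paths are hypothetical):

```
# Hypothetical crontab entry: run a data ingestion job daily at 2:30 AM
# Field order: minute hour day-of-month month day-of-week command
30 2 * * * /opt/etl/run_ingestion.sh >> /var/log/ingestion.log 2>&1
```

In practice Oozie is often preferred over cron on Hadoop clusters because it can express dependencies between jobs in a workflow, not just fixed times.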
Prior experience at State Farm doing Hadoop work will be very beneficial.