Job Description:
Job Title: Big Data Developer/Engineer
Duration: 1 Year
Location: Multiple Locations

This role may require relocation. Locations: Raleigh, NC; Cary, NC; Charlotte, NC; Seattle, WA; Bellevue, WA; Richmond, VA; Tysons Corner, VA; Sunnyvale, CA; San Jose, CA; Foster City, CA; Long Beach, CA; San Francisco, CA; Dallas Area, TX; Austin, TX; Chicago, IL; Moline, IL; Phoenix, AZ; Smithfield, RI; Atlanta, GA; Bentonville, AR; Mason, WI; Owings Mills, MD; Eagan, MN; Weehawken, NJ; Jersey City, NJ; Hillsboro, OR

Qualifications
Basic
Minimum of 4 years of design and development experience in Java/Core Java and related technologies
At least 2 years of hands-on design and development experience with the Hadoop ecosystem (Hadoop, HBase, Pig, Hive, and MapReduce), including one or more of the following Big Data technologies: Scala, Spark, Sqoop, Flume, Kafka, and Python
Strong understanding of Hadoop fundamentals
Should be a strong communicator and able to work independently with minimal involvement from client SMEs
Bachelor's degree or foreign equivalent required. Three years of relevant work experience will also be considered in lieu of each year of education
Preferred Skills:
Background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture
Strong programming knowledge of Core Java: objects and classes, data types, arrays and string operations, operators, control flow statements, inheritance and interfaces, exception handling, serialization, collections, and reading and writing files
Strong understanding of and hands-on programming/scripting experience with UNIX shell, Perl, and JavaScript
Knowledge of large data sets, with experience in performance tuning and troubleshooting
Experience in one or more domains: Financial, Manufacturing, Healthcare, or Retail
Experience with and desire to work in a global delivery environment
