Job Description:
Open to candidates on any visa/tax term.

Hadoop Developer

Location: San Jose, CA

Duration: 3 to 6 months, contract-to-hire (CTH)

Top skills:

1. Hadoop fundamentals
2. Hive and SQL
3. Java; Scala a big plus

Responsibilities:

Define recommended processes for data ingestion, management,
transformation, and egress
Design and develop ingest, transformation, and egress capabilities
leveraging current tools such as Kafka, Storm, Pig, Hive, Sqoop,
Flume, and Spark
Assign schemas and create Hive tables
Facilitate the selection of appropriate Hadoop and Hadoop
ecosystem tools
Establish the reference architecture, processes, standards, and
technical framework
Perform data cleanup and transformation in preparation for analysis
Provide design and development expertise for large-scale,
clustered data processing systems
Troubleshoot and debug Hadoop ecosystem run-time issues
Source large volumes of data from diverse data platforms into
the Hadoop platform
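To illustrate the data cleanup and transformation responsibility above, here is a minimal, framework-free Python sketch of a pre-analysis cleanup step. The record layout, field names, and validation rules are illustrative assumptions, not part of this job description; on the job this logic would typically run inside Spark, Hive, or a MapReduce pipeline.

```python
# Hypothetical cleanup step: normalize raw comma-delimited records
# before they are loaded for analysis. Field names and rules are
# assumptions made up for this sketch.

def clean_record(line):
    """Parse one comma-delimited record; return None if it is unusable."""
    parts = [p.strip() for p in line.split(",")]
    if len(parts) != 3:
        return None  # drop malformed rows
    user_id, event, amount = parts
    if not user_id or not amount.replace(".", "", 1).isdigit():
        return None  # drop rows with a missing id or non-numeric amount
    return {"user_id": user_id, "event": event.lower(), "amount": float(amount)}

def clean_batch(lines):
    """Apply clean_record to a batch, keeping only valid rows."""
    cleaned = (clean_record(line) for line in lines)
    return [r for r in cleaned if r is not None]

raw = [
    "u1, CLICK, 3.50",
    "u2, view",        # malformed: missing field
    ", purchase, 10",  # missing user id
    "u3, Purchase, 10",
]
print(clean_batch(raw))
```

The same filter-and-normalize shape maps directly onto a Spark `map`/`filter` chain or a streaming mapper, which is why it is shown here standalone.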

Requirements:

3+ years of experience in IT, including 2+ years working with HDFS
Familiarity with data loading tools such as Flume, Sqoop, and NiFi
Ability to write MapReduce jobs (Java, Python, etc.)
Experience with scripting languages such as Python, and/or with Spark
Hands-on experience with HiveQL
Knowledge of workflow schedulers such as Oozie
Working knowledge of SQL/NoSQL databases (MongoDB, HBase, Cassandra,
HAWQ, Impala, Presto)
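The MapReduce requirement above can be sketched without a cluster: the following is a minimal Python word-count showing the map, shuffle (group-by-key), and reduce phases. This simulates the pattern locally for illustration; in a real job the mapper and reducer would run under Hadoop (for example via Hadoop Streaming), and the framework would perform the shuffle.

```python
# Framework-free sketch of the MapReduce word-count pattern.
# The shuffle is simulated locally; Hadoop would do it on a cluster.
from collections import defaultdict

def mapper(line):
    """Map phase: emit a (word, 1) pair for every word in a line."""
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    """Reduce phase: sum all counts emitted for one word."""
    return word, sum(counts)

def run_job(lines):
    """Simulate map -> shuffle (group by key) -> reduce."""
    shuffled = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            shuffled[word].append(count)
    return dict(reducer(w, c) for w, c in sorted(shuffled.items()))

print(run_job(["big data", "Big plans"]))
```

Writing the mapper and reducer as plain generators/functions keeps them directly reusable as Hadoop Streaming scripts, where each phase reads lines from stdin and writes key/value pairs to stdout.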