Job Description:
Hadoop Developer

Location: San Jose, CA

Job Description:
The engineer should have the background of a senior Hadoop developer.
The position supports data curation, ingestion, management, and client
consumption.
The individual must be well versed in Big Data fundamentals such as HDFS
and YARN.
More than a working knowledge of Sqoop and Hive is required, including an
understanding of partitioning, data formats, compression, and performance tuning.
Strong knowledge of Spark in either Python or Scala is preferred; basic
Spark knowledge is required.
SQL for Teradata/Oracle is required. Knowledge of other industry ETL and NoSQL
tools such as Cassandra, Drill, and Impala is a plus.
The candidate should be comfortable with Unix and standard enterprise environment
tools and technologies such as ftp, scp, ssh, Java, Python, and SQL.
E: rahul.a@itscient.com
C: 510-516-7809


Client: L&T