Job Description :

We have an immediate requirement for a Hadoop Data Engineer with our direct client located in San Jose, CA. Please let me know ASAP if you have any suitable candidates.

Position: Hadoop Data Engineer
Location: San Jose, CA
Duration: Long Term Contract
Interview: Video/F2F

Job Description:
MS/BS in Computer Science or a related technical field with 10+ years (level 5) of strong hands-on experience in enterprise data warehousing / big data implementations and complex data solutions and frameworks
Strong SQL, ETL, scripting, and/or programming skills, with a preference for Python, Java, Scala, and shell scripting
Demonstrated ability to clearly form and communicate ideas to both technical and non-technical audiences.
Strong problem-solving skills with an ability to isolate, deconstruct and resolve complex data / engineering challenges
Results-driven with attention to detail, a strong sense of ownership, and a commitment to up-leveling the broader IDS engineering team through mentoring, innovation, and thought leadership

Desired skills:
Familiarity with streaming applications
Experience in development methodologies like Agile / Scrum
Strong experience with Hadoop ETL / data ingestion: Sqoop, Flume, Hive, Spark, HBase
Strong experience with SQL and PL/SQL
Nice to have: experience in real-time data ingestion using Kafka, Storm, Spark, or complex event processing
Experience in Hadoop data consumption and other components: Hive, Hue, HBase, Spark, Pig, Impala, Presto
Experience monitoring, troubleshooting, and tuning services and applications, plus operational expertise: strong troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks
Experience in design and development of API frameworks using Python/Java is a plus
Experience in developing BI dashboards and reports is a plus

Client : Confidential