Job Description:
Hadoop Data Engineer - Raleigh, NC
Duration: 3-6 months, contract to hire
Interview: Phone and face-to-face (will consider Skype for a strong candidate)
Location: Raleigh, NC


Job Description
Minimum Requirements

BS/MS degree in Computer Science, Engineering, Applied Mathematics, or a related field, or equivalent experience
5+ years of hands-on programming experience in Scala, Java, SQL, and Python
3+ years of experience with large datasets in Hadoop (Cloudera) and the Spark ecosystem
Hands-on experience with Hadoop data storage, data stores (HBase, Cassandra), and tools (Oozie, Sqoop, Flume, etc.)
Well versed in Cloudera (CDH 5.x) for managing security, metadata, lineage, job management, the Optimizer, RecordService, etc.
Expertise in Kafka (distributed logs) and Spark Streaming architecture and development
Experience in the design and development of SQL-on-Hadoop applications (Spark SQL, Impala) and query optimization
Passionate, self-motivated, and willing to learn

Nice to Have
Expertise in leading cloud technologies such as Amazon Web Services (AWS)
Certification in Hadoop and Spark is a plus
             
