Job Description:
Role: Big Data-Hadoop Developer (Qarth Pipeline)
Location: Sunnyvale, CA 94086
Duration: 3+ months (possible extension)

Develop Java and Scala programs for the Qarth Pipeline using Cassandra, Kafka, Apache Storm, REST APIs, and JSON.

Minimum Qualifications:
Bachelor’s Degree in Computer Science or a related field and 6+ years’ experience building scalable e-commerce applications
2+ years’ experience with big data methodologies involving Cassandra, Kafka and Apache Storm (MANDATORY)
5+ years’ experience building scalable, high-performing, and robust Java applications
5+ years’ experience developing using J2EE technologies such as Servlet/JSP/Filters, JNDI, JDBC, JMS, JMX, RMI, Java Web Services or related skill
5+ years’ experience developing with web/app containers such as WebLogic, WebSphere, Apache Tomcat, JBoss, or related skill
5+ years’ experience with advanced scripting skills in at least one of the following: Python, Perl or Shell and willingness to learn new technologies
Experience with Eclipse or other IDE development tools
Experience with Continuous Integration and related tools (e.g., Jenkins, Hudson, Maven)
Experience with Code Quality Governance related tools (Sonar, Gerrit, PMD, FindBugs, Checkstyle, Emma, Cobertura, JIRA, etc.)
Experience with Source Code Management Tools (GitHub, SVN, CVS, ClearCase)
Knowledge of standard tools for optimizing and testing code
Ability to operate effectively and independently in a dynamic, fluid environment