Job Description:
Position: Hadoop Developer

Location: Bloomfield, CT or Lafayette, LA or New York, NY

Duration: 12+ months



This Hadoop Developer will be a key contributor to our Services practice and will have the below responsibilities:



Work with product owners to understand desired application capabilities and testing scenarios
Continuously improve software engineering practices
Work within and across Agile teams to design, develop, test, implement, and support technical solutions across a full-stack of development tools and technologies
Write unit tests and automated acceptance tests
Work on a variety of development projects on a Hadoop platform
Work with business stakeholders and other SMEs to understand high-level business requirements
Work with the Solution Designers and contribute to the development of project plans by participating in the scoping and estimating of proposed projects
Apply technical background, business knowledge, and system knowledge to elicit system requirements for projects
Support other team members in translating requirements and use cases into test conditions and expected results for product, performance, user acceptance, and operational acceptance testing; participate in automation testing of developed systems/solutions
Adhere to existing processes/standards including the project development lifecycle, business technology architecture, risk and production capacity guidelines and escalate issues as required
Prioritize and manage own workload in order to deliver quality results and meet timelines
Support a positive work environment that promotes service to the business, quality, innovation, and teamwork, and ensure timely communication of issues/points of interest



Required qualifications to be successful in this role

This position requires:



5+ years of experience with Big Data tools and technologies, including work in a production environment on a Hadoop project.
3+ years of experience with SQL, Hive, Impala, Oozie, HDFS, Hue, Git, MapReduce, and Sqoop.
2+ years of programming experience in Python or another object-oriented language.
2+ years with Test-Driven Development (TDD) and/or Continuous Integration/Continuous Deployment (CI/CD) is a plus.
Big Data development using the Hadoop ecosystem, including Pig, Hive, and other Cloudera tools.
Analytical and problem-solving skills applied to a Big Data environment.
Experience with large-scale distributed applications.
Experience with Agile methodologies to iterate quickly on product changes, develop user stories, and work through the backlog.
Experience with Cloudera Hadoop distribution components and custom packages preferred.
Traditional Data Warehouse/ETL experience.
Excellent planning, organization, communication and thought leadership skills.
Ability to learn and apply new concepts quickly.
Proven ability to mentor and coach junior team members.
Strong leadership, communication and interpersonal skills.
Ability to adapt to constant change; a sense of innovation, creativity, organization, and autonomy, with quick adaptation to new technologies.



Education Requirement:

Bachelor’s degree in Computer Science or a related discipline; at least eight (typically ten or more) years of solid, diverse IT work experience, including a minimum of six years of application program development experience, or the equivalent in education and work experience.