Job Description:
Job Purpose: This position is responsible for developing, integrating, testing, and maintaining existing and new applications. It requires proficiency in one or more programming languages and familiarity with one or more development methodologies / delivery models.

Required Job Qualifications:

· Must-have qualifications:

o Bachelor's degree and 4 years of Information Technology experience, OR technical certification and/or college courses and 6 years of Information Technology experience, OR 8 years of Information Technology experience.

o Hands-on experience developing and maintaining software solutions on a Hadoop cluster.

o Strong experience with UNIX shell scripting, Sqoop, Eclipse, HCatalog, Pig scripts, HiveQL, and UDFs.

o Hands-on experience with Spark and Scala (a brief illustrative sketch follows this list).

o Experience with Kafka.

o Familiarity with Hadoop best practices, troubleshooting, and performance tuning.

o Experience with change management / DevOps tools (e.g., GitHub, Jenkins).

o Familiarity with SDLC methodologies (Agile / Scrum / Iterative Development).
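
For illustration only, here is a minimal sketch of the kind of Spark / Scala / HiveQL work the must-have list above describes: read a Hive table on the cluster, aggregate it, and write the result back. Every name in it (application, database, table, and column names) is a hypothetical assumption, not something taken from this posting.

    // Minimal Spark-on-Hadoop sketch in Scala; all names are hypothetical.
    import org.apache.spark.sql.SparkSession

    object ClaimsRollup {
      def main(args: Array[String]): Unit = {
        // enableHiveSupport lets Spark SQL run HiveQL against the cluster metastore.
        val spark = SparkSession.builder()
          .appName("ClaimsRollup")
          .enableHiveSupport()
          .getOrCreate()

        // HiveQL over a hypothetical table claims_db.claims.
        val rollup = spark.sql(
          """SELECT member_id, COUNT(*) AS claim_count
            |FROM claims_db.claims
            |GROUP BY member_id""".stripMargin)

        // Write the aggregate back to the cluster as a managed Hive table.
        rollup.write.mode("overwrite").saveAsTable("claims_db.claims_rollup")

        spark.stop()
      }
    }

A job of this shape is typically packaged as a jar and submitted to the cluster with spark-submit.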

· Nice-to-have qualifications:

o Working experience developing MapReduce programs that run on a Hadoop cluster, using Java or Python (an illustrative sketch follows this list).

o Experience with NoSQL databases such as HBase, MongoDB, or Cassandra.

o Experience using Talend with Hadoop technologies.

o Working experience with data warehousing and Business Intelligence systems.

o Experience with business requirements management and systems change / configuration management; familiarity with JIRA.

o Experience with ZENA (or another job-scheduling tool).

o Healthcare experience.
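
As referenced in the MapReduce item above, here is a sketch of a classic word-count MapReduce job. The listing names Java/Python; the same Hadoop API is shown in Scala here only to keep these examples in one language, and the word-count logic and command-line paths are illustrative assumptions, not the client's actual workload.

    // Illustrative word-count MapReduce job in Scala against the Hadoop API.
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.Path
    import org.apache.hadoop.io.{IntWritable, Text}
    import org.apache.hadoop.mapreduce.{Job, Mapper, Reducer}
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat

    // Emits (token, 1) for every whitespace-separated token in a line.
    class TokenMapper extends Mapper[Object, Text, Text, IntWritable] {
      private val one  = new IntWritable(1)
      private val word = new Text()
      override def map(key: Object, value: Text,
                       ctx: Mapper[Object, Text, Text, IntWritable]#Context): Unit =
        value.toString.split("\\s+").filter(_.nonEmpty).foreach { tok =>
          word.set(tok)
          ctx.write(word, one)
        }
    }

    // Sums the counts for each token; also reused as the combiner.
    class SumReducer extends Reducer[Text, IntWritable, Text, IntWritable] {
      override def reduce(key: Text, values: java.lang.Iterable[IntWritable],
                          ctx: Reducer[Text, IntWritable, Text, IntWritable]#Context): Unit = {
        var sum = 0
        val it = values.iterator()
        while (it.hasNext) sum += it.next().get()
        ctx.write(key, new IntWritable(sum))
      }
    }

    object WordCount {
      def main(args: Array[String]): Unit = {
        val job = Job.getInstance(new Configuration(), "word count")
        job.setJarByClass(WordCount.getClass)
        job.setMapperClass(classOf[TokenMapper])
        job.setCombinerClass(classOf[SumReducer])
        job.setReducerClass(classOf[SumReducer])
        job.setOutputKeyClass(classOf[Text])
        job.setOutputValueClass(classOf[IntWritable])
        FileInputFormat.addInputPath(job, new Path(args(0)))
        FileOutputFormat.setOutputPath(job, new Path(args(1)))
        System.exit(if (job.waitForCompletion(true)) 0 else 1)
      }
    }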

