Job Description:
Big Data/ETL Developer

Duties and Responsibilities:
Minimum of 4 years' experience in the architecture, design, and development of Big Data systems
Minimum of 3 years' experience with Hadoop technologies such as HDFS, Hive, and MapReduce, including developing ETL processes with these technologies
Expertise in data modelling and building data marts
Strong expertise in Spark
Advanced Data/SQL skills
Unix Shell Scripting
Expertise in Pig and Hive is a plus
The candidate will be responsible for designing, developing, testing, and supporting ETL jobs
Hands-on experience with IBM DataStage version 11.x
Hands-on experience with Oracle database technologies
Hands-on experience writing and modifying stored procedures, DDL, and DML statements in a relational database environment
Proven experience delivering software using an Agile/Scrum methodology
Experience using an Agile management tool such as CA Agile Central
Experience using job scheduling software such as TWS
Relevant Bachelor's degree or equivalent work experience in a related field


Client: DMI, Vantiv