Job Description:
Qualifications:
8+ years of experience in managing data lineage and performing impact analysis.
5+ years of development experience with any ETL tool.
4+ years of experience with the Hadoop ecosystem.
Experience working in Data Management projects.
Experience working with Hive or related tools on Hadoop, including performance tuning, file formats, designing and executing complex Hive HQL queries, and data migration/conversion.
Experience working with programming languages such as Java, Scala, or Python.
Experience working in an agile environment.
Experience working with Spark for data manipulation, preparation, and cleansing.
Experience working with ETL tools (Informatica/DS/SSIS) for data integration.
Experience designing and developing automated analytic software, techniques, and algorithms.
Ability to handle multiple tasks and adapt to a constantly changing environment.
Self-starter with the ability to work independently and take initiative.
Ability to translate ideas and business requirements into fully functioning ETL workflows.
Ability to apply expert knowledge of at least one relational database (DB2, MSSQL, Teradata, Oracle 8i/9i/10g/11i).
Expert-level, hands-on experience with SQL is a must.
Experience with Unix/Linux and shell scripting.
Strong analytical and problem-solving skills.
Excellent written and oral communication skills, with the ability to articulate and document processes and workflows.