Job Description:
Own critical data pipelines running in the production environment.
Work with the technology team to deliver data projects, including the migration of legacy workflows to new systems.
Develop and deploy ETL job workflows with reliable error/exception handling and rollback.
Develop and engineer a process for ensuring the integrity of all data assets.
Provide technical assistance in identifying, evaluating, and developing systems and procedures.
Work in close partnership with cross-functional teams across the organization to define business rules and cleansing requirements for the data transformation process.
Be the knowledge expert on the data stored in the databases maintained and/or consumed by the BI and Analytics team.
Partner with key business stakeholders to establish data goals and objectives and help achieve them.
Build and maintain metadata dictionaries to assist in the governance of metrics and flow of data between systems.
Perform in-depth exploration of causes, trends, opportunities, and specific actions required regarding key metrics, KPIs, and additional data elements.

3+ years of experience with various Big Data platforms and tools (Hadoop, Spark, Hive, Sqoop) for building ETL workflows.
3+ years of experience with performance tuning and optimization techniques.
5+ years of experience with various forms of data design patterns such as OLAP, NoSQL, EDW, data marts.
3+ years of experience performing data analysis in the insight discovery process.
Support the team in developing best practices across data life cycle management.
Experience with ETL tools (e.g., Pentaho) is a plus.
Support dashboard development in Tableau.

Client: Confidential