Job Description:
Job Title: Implement SAP HANA as a toolset to build a data warehouse solution
Location: NYC, NY
Duration: 6+ months

Job Description:

Implement SAP HANA as a toolset to build a data warehouse solution.
Consolidate various data sources (market data and reference data) to build analytics that support enterprise performance management goals, and establish strong data governance practices.

Skills Required:

15+ years’ experience in data warehousing, data integration, consolidation and governance.
10+ years' experience leading BI teams; experience in consulting and analyzing data for multiple purposes is mandatory.
Solid understanding of capital markets, the index business, and the data elements and structures associated with this business domain.
5+ years of experience implementing enterprise data management using SAP BI&A technologies, viz. SAP BODS, SAP BW / SAP HANA, and MDM.
Solid knowledge of SAP MDM, which drives replication and integration from non-SAP sources, and expertise in SAP metadata are critical to the success of this role.
Expertise in mapping data from non-SAP systems, designing load routines, designing all areas of data flows (ETL), or third normal form (3NF) data modeling.
Provides efficient design and construction of systems and subsystems to extract, condition, and load data from operational systems (SAP) and other sources into a defined physical data model, according to business and technical requirements.
Knowledge of one or more DM&A MDM tools such as IBM MDM, Oracle MDM, or cloud MDM (e.g., Reltio, TIBCO, Informatica).
Deep knowledge of one or more DM&A metadata and/or data quality tools.
Experience with multiple leading business intelligence tools and the ability to analyze solution decisions in terms of product evaluation and best practices.
Must have solid experience with or understanding of Big Data implementation projects, and proficiency with Hive, Pig, Sqoop, Flume, Oozie, MapReduce, YARN, Spark, Storm, and Kafka.

Tools / software:

SAP BI&A technologies: SAP BODS, SAP BW / SAP HANA
MDM architecture
Data governance
Market data (Bloomberg, Reuters)
Reference data

Preferred Skills:

Big Data implementation projects
Hive, Pig, Sqoop, Flume, Oozie, MapReduce, YARN, Spark, Storm, Kafka

Role & Responsibilities:

Customer-facing responsibility in post-sales activities: understand the business goals, build a scalable data architecture, and bring in best practices in data governance. Interface with developers on performance and data access requirements.
Interface with system stewards to gather requirements.
Define data element requirements, including history, timeliness, and frequency of data.
Define volume and timing requirements.
Define and map the ETL interface inputs, including technology choices.
Design, develop, and test processes for data acquisition from legacy systems or production databases.
Participate in performance, integration, and system testing.
Experience in designing, developing, and implementing ETL processes, including batch and real-time replication (Change Data Capture).
Hands-on experience with data replication tool configuration and metadata management tools.
             
