Job Description:
o Develop a functional understanding of the business and apply it toward building business-facing data marts.
o Design, develop, code, and document Data Marts; analyze the current Data Mart structure to identify existing tables and the functional design.
o Build data models using a combination of warehousing concepts such as Star Schema, Snowflake Schema, etc.
o Knowledge of the Big Data landscape, demonstrated through experience or certification, e.g., Hadoop, Spark, Hive, Pig, Impala.
o Working experience with ETL tools such as DataStage and Informatica, and with stored procedures, including PL/SQL.
o Build new aggregates based on current and future business needs.
o Build new tables based on current and future business needs and populate the data mart via queries, jobs, etc.
o Implement ETL processes for integrating data from disparate sources, leveraging open-source Hadoop frameworks.
o Design and implement a scalable, high-performance search engine for queries on unstructured data.
o Build distributed, reliable, and scalable data pipelines to ingest and process data in real time.
o Work cross-functionally to address issues and emerging needs in software systems.
o Rigorously test software in preparation for deployment.
o Work on problems of diverse scope where analysis of situations or data requires evaluation of a variety of identifiable factors.
o As a seasoned, experienced professional with a full understanding of the area of specialization, resolve a wide range of issues in creative ways.
o Demonstrate good judgment in selecting methods and techniques for obtaining solutions.
o Coordinate release of work product into the production environment.