Job Description:
Client is looking for a TS4 to support designing data models for an Operational Data Store and Data Marts used for operational and analytical business reporting. The data sources will include a variety of normalized relational and semi-structured data on both a Data Lake (Hadoop) and a Relational Database (Vertica). Data modeling will encompass normalized (Inmon), dimensional (Kimball), and other designs to support broad agency-centric use cases. The candidate must be proficient with data modeling/designing star schema structures and the use of modeling tools like Erwin in a data warehouse environment.

During a given day, this position will be expected to:
Work with Business Analysts and potentially business customers to verify requirements
Design/model tables to meet architectural requirements for dimensional/star schema and other modeling approaches
Construct mapping documents involving source and destination and extract/transform/load rules for the ETL team
Work with an agile BI team to facilitate ETL development and reporting for BI tools (Cognos, Tableau, etc.)
Mandatory Skills:
10 or more years of experience working as a data modeler in business intelligence, analytics, and Enterprise Data Warehousing environments
10 or more years of experience developing with SQL
Minimum 5 years of experience with Kimball (dimensional modeling) and Inmon methodologies
Minimum 5 years of experience with data profiling, data mining, data cleansing/scrubbing, data content analysis, and translating data via rules from one database to another
Minimum 5 years of experience with star schema structures, theories, principles, and best practices; Data Governance; Data Quality Management; Metadata Management; and conceptual and logical data design
Minimum 5 years of strong data modeling skills, including data quality, source systems analysis (DB2, Oracle, VSAM, etc.), business rules validation, source-to-target mapping design, and preparing sample data examples and validating data with the ETL team (SCD Type 1, SCD Type 2, and fact tables)
Minimum 5 years of experience with the Erwin modeling tool, including Model Mart Repository advanced features (e.g., check-in/check-out, complete compare, naming standards, and reverse engineering)
Desired Skills:
Experience designing for Hadoop/Spark
Experience with MPP databases (Vertica, Teradata, Netezza)
Knowledge of property and casualty insurance systems
Knowledge of Ohio business model and processes
Experience using SQL Development Tools
Skill in the use of organizational tools and methods (MS Excel, MS Word, MS Outlook, Visio, Clarity)
Ability to quickly adapt to changes, enhancements, and new technologies