Job Description:
On a typical day, this position will be expected to:
Work with Business Analysts and potentially business customers to verify requirements
Follow design documents involving source and destination databases
Interpret design documents to understand mapping and transformational requirements
Design, develop, and test ETL/ELT processes using Python/PySpark/SQL to move data to/from relational and star-schema-based data structures
Optimize ETL/ELT processes to complete within a desired time frame
Build batch flows with an Enterprise Batch Scheduler
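For candidates unfamiliar with the terminology above, the core work pattern is extract-transform-load into a star schema. The following is a minimal illustrative sketch only, using Python's standard-library sqlite3 in place of the enterprise databases named in this posting; all table and column names here are hypothetical, not part of the actual role.

```python
import sqlite3

# Hypothetical ETL sketch: move order rows from a source (OLTP-style)
# table into a simple Kimball-style star schema (one dimension, one fact).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# --- Extract: a source table as it might arrive from an operational system
cur.execute("CREATE TABLE src_orders (order_id INT, customer TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO src_orders VALUES (?, ?, ?)",
    [(1, "acme", 100.0), (2, "acme", 50.0), (3, "globex", 75.0)],
)

# --- Load targets: a dimension table with a surrogate key, plus a fact table
cur.execute(
    "CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer TEXT UNIQUE)"
)
cur.execute("CREATE TABLE fact_orders (order_id INT, customer_key INT, amount REAL)")

# --- Transform + Load: populate the dimension, then resolve each source row
# to its surrogate key while inserting facts
cur.execute("INSERT INTO dim_customer (customer) SELECT DISTINCT customer FROM src_orders")
cur.execute(
    """INSERT INTO fact_orders
       SELECT s.order_id, d.customer_key, s.amount
       FROM src_orders s JOIN dim_customer d ON s.customer = d.customer"""
)
conn.commit()

total = cur.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
print(total)  # 225.0
```

In production this same shape would run under an enterprise batch scheduler against Oracle/SQL Server/DB2 or a Spark cluster, with the surrogate-key lookup handled incrementally rather than rebuilt each run.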

Mandatory Skills:
10 or more years of experience in design, development, modification and testing of ETL processes
10 or more years of experience developing with SQL
10 or more years of experience building ETL processes using SQL/Python and/or ETL tool experience (such as Pentaho, Informatica, IBM InfoSphere DataStage, Talend)
5 or more years of software development experience with relational databases (e.g., Oracle, SQL Server, DB2)
5 or more years of experience developing ETL processes against dimensional/star-schema-based data structures, including fact tables (commonly referred to as Kimball-style)

Desired Skills:
Experience developing and modifying ETL/ELT processes with Hadoop/Spark
Experience with MPP databases (Vertica, Teradata, Netezza)
Experience scheduling ETL processes
Experience with Cognos/Tableau business intelligence tools
Experience developing against normalized data structures (commonly referred to as Inmon)
Knowledge of property and casualty insurance systems
Knowledge of Ohio business model and processes
Experience using SQL Development Tools
Experience with Red Hat Enterprise Linux
Experience developing and modifying scripts in a Linux or Unix environment
Experience in measuring and tuning the performance of batch systems in a Linux or Unix environment
Skill in the use of organizational tools & methods (MS Excel, MS Word, MS Outlook, Visio, Clarity)
Ability to quickly adapt to changes, enhancements, and new technologies

Client: Direct Client