Job Description :

The ETL Developer works directly with business users to develop data pipelines from source applications to corporate data platforms (Data Warehouse/Lake) and to optimize analytic solutions. This position uses Informatica development tools to create new integrations or modify existing integrations for projects handed off to the Data and Analytics Services group.
These accountabilities are not intended as a complete list of specific duties and responsibilities and do not limit or modify the right of any supervisor to assign, direct, and control the work of employees assigned to this job.
Primary Accountabilities
*Develops scalable, innovative solutions for extracting, transforming, and moving data.
*Performs data profiling and data analysis to assist in understanding the data.
*Optimizes data models through normalization, table analysis and review of queries.
*Presents outcomes of analytic models in a format easily understood by the business.
*Performs ETL/ELT functions for new or existing sources.
*Conducts requirements gathering with business stakeholders and users to plan a roadmap of project activities.
*Works with the business to assess the validity of the data and achieve minimum levels of quality.
*Gathers and compiles data using various data collection techniques in response to data requests.
*Creates code or GUI data mappings while following company data standards and approaches.
*Collaborates with others responsible for pipeline development.

Supervision
The requirements listed below are representative of the experience, knowledge, skills, and/or abilities required to perform this job.

Educational Requirements
*Bachelor's Degree with an emphasis in CIS, Software Development, Analytics, Operations Research, Computer Science, or another applicable field
Required Experience
*4 years of direct experience in a related field (coding, ETL development/engineering, analytics, or research)
*Experience with Informatica ETL/ELT tools, specifically the PowerCenter, BDM/DEI, and IICS products
*Experience working with "Big Data" environments such as Hadoop, Oracle Exadata, Azure, or Snowflake
*Proficiency in SQL
Preferred Experience
*Working Python experience, including developing ETL/ELT pipelines in Python
*Experience with the Snowflake Data Cloud
*Data modeling experience using the Erwin data modeling tool
*Experience with Tidal job scheduling software
*Development experience with Workday and/or HR domain data

Required Skills : Informatica, Snowflake
Basic Qualification :
Additional Skills :
Background Check :Yes
Drug Screen :Yes
Notes :
Selling points for candidate :
Project Verification Info :
Candidate must be your W2 Employee :No
Exclusive to Apex :No
Face to face interview required :No
Candidate must be local :No
Candidate must be authorized to work without sponsorship :No
Interview times set :No
Type of project :Development/Engineering
Master Job Title :ETL Developer
Branch Code :Madison
