Job Description:
Role: Python Developer
Location: Glendale, AZ

Description:
The Data Integration Developer is responsible for the design, build, and deployment of the project's data integration component. The role:
- Designs, analyzes, modifies, debugs and evaluates complex programs for application areas.
- Provides continuous software application evolution to drive and enable greater business value and increased productivity through more rapid delivery of high-quality applications.
- Is responsible for cost-effective delivery of software applications, including the optimal selection of implementation technologies, practices and skills for application delivery success.
- Receives no instruction on routine work and general instructions on new projects or assignments.
- Designs, codes, modifies, debugs and evaluates programs for business functional areas.
- Formulates logic for new systems, devises logic procedures, prepares flowcharts, and performs data analysis through the application of professional programming concepts.
- Codes, tests and delivers application enhancements and project build deliverables.
- Determines how existing complex applications, legacy systems, databases, web interfaces and/or hardware logic, which may currently operate on multiple platforms, work together to meet new and emerging enterprise requirements.
- Develops methods to efficiently reuse existing components.
- Recommends and implements changes in development, maintenance and system standards.
- Holds and participates in code walkthroughs to ensure that all code is production ready and complies with all standards, including but not limited to architectural, PCI, Model Audit Rule and internal audit standards.
- Develops design specifications and parameters for assigned applications or components of larger integrated solutions, in compliance with products' architectural blueprints.
- Ensures that code meets set quality standards, including SDLC, architecture, PCI compliance, Model Audit Rule compliance and internal audit guidelines.
- Develops complex test plans in conjunction with the QA Lead; reviews test results and develops fixes for bugs discovered.
- Develops conversion and system implementation plans.
- Gathers, analyzes, prepares and summarizes recommendations for approval of system and programming documentation. May assist in the development of product user manuals.
- Participates in component and data architecture design, software product evaluation, and buy-vs.-build recommendations for moderate- to high-complexity solutions.
- Works in an office environment, sitting at a desk, table or computer workstation for long periods of time. May travel by car, plane or other forms of transportation to attend business meetings or conferences. Approximately 50-80 percent of time on the job involves the use of a personal computer.

Responsibilities:
- Works closely with the business and IT teams to create scalable data integration processes.
- Designs, builds and maintains data integration processes in development, QA and production.
- Designs, builds and maintains Extract, Transform & Load (ETL) processes and designs data models.
- Works with system owners to resolve source data issues and refine transformation rules.
- Ensures performance metrics are met and tracked.
- Performs data analysis for both source and target tables/columns; provides technical documentation of source-to-target mappings.
- Supports the development and design of the internal data integration framework.
- Participates in design and development reviews.
- Writes and maintains unit tests; conducts QA reviews.
- Works with data modelers and DBAs to implement the dimensional and 3NF physical data models for the insurance data warehouse.
- Contributes to collaborative, multi-disciplinary project team efforts in an agile environment.

Knowledge, Skills & Abilities:

Required:
- Bachelor's degree in a related area (Computer Science, Information Systems, Engineering) or an equivalent combination of education and experience.
- 8+ years of design, testing and application development experience with processes for data warehousing and data integration.
- Experience extracting and transforming data using Python.
- Experience using Informatica or other ETL tools.
- Experience with AWS Cloud services such as S3 and EC2.
- Experience loading data and performing analytics in Snowflake.
- Experience with REST and SOAP API connections and data integrations using APIs.
- Familiar with ODI or OIC data integration tools.
- Familiar with Kafka queues and event-based integrations.
- Familiarity with extracting data from, and loading data to, flat-file and relational data sources.
- Highly proficient in handling large volumes of data in the insurance and financial domains.
- Skilled at tuning data pipelines and identifying and resolving performance bottlenecks at various levels.
- Experience with SQL and NoSQL databases.
- Practiced in data profiling and data quality assurance with ETL tools.
- Experience writing shell scripts.
- Proficient in creating and writing technical documentation.
- System knowledge: understands the business and customer requirements that drive the analysis and design of technical solutions.
- Excellent communication skills; highly motivated; able to work independently and in a team.
- At ease in high-stress environments requiring the ability to handle multiple levels of responsibility effectively.

Preferred:
- Understanding of industry practices and AAA NCNU policies and procedures relating to work assignments.
- Strong background working with data warehouses and data marts based on the IIA-IIW insurance framework.
- Knowledge of Enterprise Application Integration (EAI) methodologies.