Job Description:
Vertica/MPP
Performance tuning using SQL in MPP environments is a plus
Informatica
Data Pipeline
Shell scripting
Linux/Unix
AWS services required: S3, EMR
Python is a plus
Scheduling (general)
Communication skills
Knowledge of platform migrations
Advanced SQL scripting

Job Description:
Gather functional requirements, develop technical specifications, and plan projects
Align overall strategies and reconcile competing priorities across the organization
Design and develop ETL/ELT jobs across multiple platforms and tools including Informatica, Vertica, Hadoop, and Amazon Web Services (AWS)
Data modeling for data warehouse environments
Roughly 60-75% hands-on coding
Resolve defects/bugs during QA testing, pre-production, production, and post-release patches
Work cross-functionally with various Intuit teams: Product Development, Product Management, Project Management Office, Sales, Care, Data Architects, Shared Data Services, Data Scientists, Business Data Analysts and fellow Technical Data Analysts.

Qualifications
Minimum 5 years of experience developing and administering ETL/ELT processes with Informatica PowerCenter
Minimum 3 years of experience developing applications on MPP columnar analytics data warehouse environments such as Vertica, Greenplum, and Redshift
Strong SQL experience, including writing complex queries
Minimum 5 years of experience scheduling and monitoring jobs using enterprise schedulers such as Tidal and Autosys
Minimum 5 years of experience with shell scripting in Linux/Unix environments
Familiarity with NoSQL database environments such as Cassandra
Experience working on Amazon Web Services (AWS) is a plus
Solid communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences
Strong understanding of data warehousing principles and data modeling
BS/MS in Computer Science or equivalent work experience