Job Description:

Role: Snowflake Data Architect
Location: Remote (Central time zone)
Duration: 12 Months

We are looking for someone with 2-4 years of experience as a Snowflake Architect in an AWS environment. Candidates must have full-stack experience: ETL/ELT pipeline engineering, data modeling, and data architecture. This individual must have the SQL expertise to develop new data mart models and enhance existing ones.

Responsibilities:
- Design the enterprise warehouse, suggest improvements to the Residential Data Mart, and guide the data engineering team on leading technology practices.
- Lead the Scrum Team in developing and enhancing data models that meet business requirements and backlog priorities.
- Collaborate with the existing environment's Lead Data Architect and Scrum Team to iteratively develop new views and tables for curation to other product teams.
- Provide expertise in Snowflake configurations, schemas, and keys to continuously improve performance.
- Work closely with the Data Engineer, Scrum Team, and ETL Lead to consume and refine staged assets from AWS for curation.

Must Have:
- 2-4 years as a Snowflake Architect working in an Agile/Scrum environment, plus additional years of experience as a business intelligence or operational data store architect in any environment.
- SQL expertise: ability to develop new data mart models and enhance existing ones.
- AWS

Nice to Have:
- Matillion experience (designing and deploying data pipelines using Matillion)
- Python expertise
- Building a data warehouse on Snowflake and loading data in AWS S3 buckets (a minimal sketch follows at the end of this posting)

Notes:
Candidates will do a 30-minute Teams interview with the Architect at Waste Management, and offers will be made on that basis. This position is remote but requires the individual to work Central time zone hours.

Krishnasree | US IT Recruiter
Thought wave software and solutions
314 N Lake St, Suite 6, Aurora, IL 60506
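
For context on the Snowflake/S3 nice-to-have above, here is a minimal sketch of loading S3-staged files into a Snowflake warehouse. All database, schema, table, column, and bucket names here are hypothetical placeholders, and in practice a STORAGE INTEGRATION would normally replace inline credentials:

    -- Hypothetical database and schema for the warehouse build-out
    CREATE DATABASE IF NOT EXISTS analytics_dw;
    CREATE SCHEMA IF NOT EXISTS analytics_dw.staging;

    -- External stage over an S3 landing bucket (bucket path and credentials are placeholders)
    CREATE OR REPLACE STAGE analytics_dw.staging.s3_landing
      URL = 's3://example-landing-bucket/residential/'
      CREDENTIALS = (AWS_KEY_ID = '<aws_key_id>' AWS_SECRET_KEY = '<aws_secret_key>')
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"');

    -- Raw staging table to receive the files
    CREATE OR REPLACE TABLE analytics_dw.staging.residential_raw (
      account_id   NUMBER,
      service_date DATE,
      charge_amt   NUMBER(12,2)
    );

    -- Bulk-load every CSV currently staged in the bucket
    COPY INTO analytics_dw.staging.residential_raw
      FROM @analytics_dw.staging.s3_landing
      PATTERN = '.*[.]csv';

    -- Clustering key on a common filter column: the kind of schema/key
    -- tuning the Responsibilities section mentions
    ALTER TABLE analytics_dw.staging.residential_raw
      CLUSTER BY (service_date);

Downstream, curated data mart views would typically be layered on top of staging tables like this for consumption by other product teams.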

