Job Description:

Position: Sr. Data Analyst/Architect

Location: San Jose, CA & Raleigh, NC

 

Remote until COVID restrictions are lifted

 

Job Description: 

ETL Data Analyst/Architect with Snowflake, Python, and GCP

- 8-10 years of experience in data analysis and data management

- Build and deploy data pipelines on Google Cloud to enable data standardization and data quality capabilities

- Demonstrate expertise in Snowflake data modeling and ETL using Snowflake SQL, implementing complex stored procedures and applying data warehouse and ETL best practices

- Responsible for implementation and ongoing support of Big Data and Cassandra infrastructure

- Work with data delivery teams to set up new Big Data pipelines, including setting up users, creating Kerberos principals, and testing data pipelines

ü  Lead & drive the development of cloud-based data warehouses & data management platforms

- Design, build, and manage data pipelines to ingest structured and semi-structured data

- Experience leading data warehousing, data ingestion, and data profiling activities

- Collaborate with various business and technical teams to gather requirements around data management rules and processes

- Propose optimizations to these rules where applicable, then design and implement them in ETL logic

- Understand the business requirements and downstream process requirements

- Deliver ETL solution implementation and support

- Provide impact analysis whenever changes or production events occur that may affect data quality

- Monitor the project throughout the delivery life cycle in order to manage scope, schedule, and cost for IT projects
