Job Description:
Job Role: Enterprise Cloud Architect
100% Remote Role
Job type: C2C


Job Description:

Looking for an Enterprise Cloud Architect with deep expertise in creating and optimizing data pipeline workloads for specific use cases. Should have experience building data pipelines, including data ingestion, preparation, integration, and operationalization techniques that optimally address the data needs of each use case.

Requirements:

Demonstrable experience architecting data workload solutions using cloud-based technologies such as Azure Data Factory, Databricks, and Snowflake
Must have experience with Microsoft technologies: .NET, SSIS, SSRS, and Azure
Very strong knowledge of Apache Spark and Python programming
A consultative approach to addressing business imperatives
Partner with Scrum Masters, Product Owners, and team members in a dynamic agile environment to develop creative, data-driven ETL pipeline solutions that meet business and technical objectives
Drive automation through effective use of modern tools, techniques, and architectures, fully automating repeatable, tedious data preparation and integration tasks to improve productivity
Train counterparts in data pipelining and preparation techniques, making it easier for them to integrate and consume the data they need for their own use cases
Promote the available data and analytics capabilities and expertise to business unit leaders, and educate them on leveraging these capabilities to achieve their business goals
Accountable for the operational effectiveness (performance, uptime, and release management) of the enterprise data platform

Experience Expectations:

12+ years of overall experience

Azure Cloud and related PaaS: 4+ years

Hadoop ecosystem (Spark): 4+ years

Databricks: 2+ years (desirable)

Snowflake: 1+ years (desirable)

Client: SVB