- Provide technical leadership in modernizing legacy data ingestion, ETL/ELT, and databases by migrating them to cloud technologies (AWS/Azure).
- Demonstrate a self-driven ownership mindset to navigate ambiguity, resolve constraints, and mitigate risks with minimal supervision.
- Implement data access, classification, and security patterns that comply with regulatory standards and contractual obligations (PII, location data, etc.).
- Build strong relationships with technical teams through effective communication, presentation, and collaboration skills.
- Collaborate with stakeholders, business analysts, and SMEs to translate business requirements into scalable solutions.
- Integrate data from multiple sources into cloud-based architectures, collaborating with cross-functional teams.
- Work closely with data scientists, analysts, and stakeholders to meet data requirements with high-quality solutions.
- Function within a matrixed team environment, sharing responsibilities across various teams.
- Perform data profiling and analysis on both structured and unstructured data.
- Design and map ETL/ELT pipelines for new or modified data streams, ensuring integration into on-prem or cloud-based data storage.
- Automate, validate, and maintain ETL/ELT processes using technologies such as Databricks, ADF, SSIS, Spark, Python, and Scala.
- Proactively identify design, scope, or development issues and provide recommendations for improvement.
- Conduct unit, system, and integration testing for ETL/ELT solutions, ensuring defects are resolved.
- Create detailed documentation for data processes, architectures, and workflows.
- Monitor and optimize the performance of data pipelines and databases.