Job Description:

Data Solution Architect
The ideal candidate will have a deep understanding of Microsoft data services, including Microsoft Fabric, Azure Data Factory (ADF), Azure Synapse, and ETL/ELT processes. This role focuses on designing, developing, and maintaining cloud-based data pipelines and solutions that drive our analytics and business intelligence capabilities.
Key Responsibilities:
      • Provide technical leadership in modernizing legacy data ingestion, ETL/ELT, and databases to cloud technologies (AWS/Azure).
      • Demonstrate a self-driven, ownership mindset to navigate ambiguity, resolve constraints, and mitigate risks with minimal supervision.
      • Implement data access, classification, and security patterns that comply with regulatory standards (PII, location data, contractual obligations, etc.).
      • Build strong relationships with technical teams through effective communication, presentation, and collaboration skills.
      • Collaborate with stakeholders, business analysts, and SMEs to translate business requirements into scalable solutions.
      • Integrate data from multiple sources into cloud-based architectures, collaborating with cross-functional teams.
      • Work closely with data scientists, analysts, and stakeholders to meet data requirements with high-quality solutions.
      • Function within a matrixed team environment, sharing responsibilities across various teams.
      • Perform data profiling and analysis on both structured and unstructured data.
      • Design and map ETL/ELT pipelines for new or modified data streams, ensuring integration into on-prem or cloud-based data storage.
      • Automate, validate, and maintain ETL/ELT processes using technologies such as Databricks, ADF, SSIS, Spark, Python, and Scala.
      • Proactively identify design, scope, or development issues and provide recommendations for improvement.
      • Conduct unit, system, and integration testing for ETL/ELT solutions, ensuring defects are resolved.
      • Create detailed documentation for data processes, architectures, and workflows.
      • Monitor and optimize the performance of data pipelines and databases.
Required Skills and Qualifications:
    • 4+ years of experience in designing and implementing data warehouse and analytics solutions (on-premises and cloud).
    • 3+ years of expertise in data warehousing concepts (ETL/ELT, data quality management, privacy/security, MDM) with hands-on experience using Azure Data Factory (ADF), SSIS, and related tools.
    • 3+ years of experience with cloud data platforms and cloud-native data lakes/warehouses, including Microsoft Azure services (Fabric Lakehouse, Azure Data Factory, Synapse, etc.).
    • 2+ years of experience in Python, Scala, or Java for distributed processing and analytics frameworks such as Spark.
We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.
