Job Description:
Data & Analytics Sr. Consultant / Technical Architect
6 months +

Overall industry experience: 12+ Years

The candidate is expected to be a Sr. Consultant / Technical Architect who can work with customers and teams (local or remote), lead technical and business discussions, and provide technical guidance to clients and team members delivering initiatives. An ideal candidate must have held a Senior Architect/Consultant/Engineer role in which she/he led the design and implementation of several complex data-oriented solutions on the Azure Data Platform, with multiple integrations and reporting needs. Additional experience with on-premises and other cloud data technologies will be helpful.

The role combines solutioning, consulting, and hands-on work. The ideal candidate will envision, design, and implement the technical solution, working with the customer and the internal technical team. The candidate will lead multiple Data and Analytics projects, providing design and implementation guidance to teams.

The candidate is expected to have the following expertise/experience.

Minimum 10+ years' experience in the architecture and design of highly scalable, performant, resilient data and analytics solutions

Modern Data Platform

  1. Excellent written and verbal communication
  2. Must have worked on several (5+) projects building Data Pipeline, Data Warehousing, and Data Modelling solutions on the Azure Data Platform (AWS, GCP, or other appliances and DW technologies are good to have)
  3. Expertise in the design and development of data pipelines and ETL using ADF and Databricks to move relational, structured, and unstructured data from source systems to the data lake and data warehouse on Azure
    1. Creating pipelines and integrations of multiple data sources using Azure Data Factory and other integration technologies
    2. Batch and Streaming (Kafka, Spark Streaming, Stream Analytics)
    3. Big data processing and transformation using Databricks (Scala/Python)
    4. Expertise in optimizing cost and performance of data pipelines
  4. Experience working with Data Lake, Synapse DW on Azure
    1. Designing Data Lake storage layer (Landing/staging, raw, trusted/curated zones)
    2. Designing data storage DB schema, data models and processing on Azure Synapse DW
  5. Experience in Data Modelling
    1. Building Data Models from structured and unstructured data (Data Model, Common Data Model etc.)
    2. Performance optimization of data models for high-performance analytical and reporting workloads
  6. Expertise with SQL Server and extensive experience with SQL programming
    1. Programming and optimizing DB objects (views, stored procedures, functions)
  7. Experience with Big Data processing, using Big Data technologies on Azure platform
  8. Experience integrating AI skills/ML models with Data and reporting solutions
  9. Hands on experience working with
    1. Spark In-memory capabilities and its modules: Spark Core, Spark SQL, Spark Streaming (Databricks / Apache Spark)
    2. Sqoop, Kafka and Hive
    3. Big Data streaming applications using Kafka and Spark Streaming
    4. Scala and/or Python
  10. Good experience working with Python (Pandas, NumPy, Matplotlib, etc.) to perform
    1. Data cleaning
    2. Data analysis and visualization
  11. Experience in building an API layer for downstream consumption
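As an illustration of the Python data-cleaning and analysis skills listed above, here is a minimal sketch using Pandas; the column names and sample values are hypothetical, not from any specific project:

```python
import pandas as pd
import numpy as np

# Hypothetical raw extract with common quality issues:
# inconsistent casing, missing values, and duplicate rows.
raw = pd.DataFrame({
    "customer": ["Alice", "alice", "Bob", None, "Carol"],
    "amount": [120.0, 120.0, np.nan, 75.5, 60.0],
})

clean = (
    raw.assign(customer=raw["customer"].str.title())      # normalize casing
       .dropna(subset=["customer"])                       # drop rows missing the key
       .assign(amount=lambda d: d["amount"].fillna(d["amount"].median()))  # impute amounts
       .drop_duplicates()                                 # remove exact duplicates
)

print(clean)
```

After cleaning, the frame has no missing values and one row per distinct customer record, ready for downstream analysis or loading into a warehouse staging table.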

Reporting (Good to have)

  1. Expertise in building scalable, high-performance reporting solutions using Power BI, in both shared and dedicated capacity
      • Designing Reports against cloud and on-prem data sources
      • Power BI Dataflows
      • Performance analysis and optimization of slow performing reports/visuals
      • Import models and Direct Query
      • Expertise in integrating Power BI reports with custom applications using Power BI Embedded or Premium
      • Experience working with Power BI Paginated reports
      • Building data models using SSAS Tabular or Azure Analysis Services
      • Building data models in Power BI
      • Hands-on with DAX
  2. Experience with C# development for web applications and services

Others (Good to have)

  1. MS Certified Azure Data Engineer Associate
  2. Experience implementing analytical workload using Snowflake
  3. Experience building data solutions on AWS and GCP platforms
  4. Experience working with other NoSQL databases (Cassandra, MongoDB, etc.)
  5. Experience with other BI and reporting tools (Qlik, Tableau, BOBJ, SSRS, Excel reporting, etc.)
  6. Experience developing Machine Learning models using Python (or R)
  7. Experience with Power Platform
