Job Description:

Title: Data & Analytics – Consultant / Technical Architect

Location: WinWire HQ, Santa Clara, CA (Anywhere in US)

Duration: Long-term Contract

Rate: Open

Overall industry experience: 12+ Years

The candidate is expected to be a Consultant / Technical Architect who can work with customers and teams (local or remote), lead technical and business discussions, and provide technical guidance to clients and team members delivering initiatives. The ideal candidate has held an Architect, Consultant, or Engineer role in which she/he led the design and implementation of several complex data-oriented solutions on Informatica and the Azure Data Platform, with multiple integrations and reporting needs. Additional experience with on-premises and other cloud data technologies is helpful.

The role combines solutioning, consulting, and hands-on work. The ideal candidate will envision, design, and implement the technical solution in partnership with the customer and the internal technical team.

The candidate is expected to have the following expertise/experience.

Minimum of 5 years' experience in the architecture and design of highly scalable, performant, and resilient data and analytics solutions

Azure Data Platform

1.        Excellent written and verbal communication

2.        Must have worked on several projects building solutions for data pipelines, data warehousing, and data modeling on the Azure Data Platform

3.        Expertise in designing and developing data pipelines and ETL using Azure Data Factory (ADF) and Databricks to move relational, structured, and unstructured data from source systems to the data lake and data warehouse on Azure

4.        Experience working with Azure Data Lake and Synapse data warehouse

5.        Expertise with SQL Server and extensive experience with SQL programming

6.        Experience with Big Data processing, using Big Data technologies on Azure platform

7.        Experience building an API layer for downstream consumption


Informatica

1.        4–6 years of experience with Informatica PowerCenter 9.x ETL – Designer, Repository Manager, Workflow Manager, and job schedulers

2.        Experience developing mappings in Informatica PowerCenter, BDM, and IICS; should have good knowledge of writing SQL queries

3.        Must have experience writing complex SQL queries and shell scripts

4.        Strong on DW fundamentals and concepts; understands and reviews technical requirements, designs ETL process flows and walkthroughs, and performs data profiling of required source systems. Excellent understanding of various RDBMSs (Teradata would be an added advantage)

5.        Experience designing metadata-driven mappings

6.        Understanding of Teradata as a source/destination and its integration using Informatica would be an added advantage

7.        Experience with Azure DevOps and CI/CD implementation

Others – Good to have

1.        Microsoft Certified: Azure Data Engineer Associate

2.        Hands-on experience working with:

a.         Spark In-memory capabilities and its modules: Spark Core, Spark SQL, Spark Streaming (Databricks / Apache Spark)

b.        Sqoop, Kafka and Hive

c.         Big Data streaming applications using Kafka and Spark Streaming

d.        Scala and/or Python

3.        Good experience working with Python (Pandas, NumPy, Matplotlib, etc.) to perform:

a.         Data cleaning

b.        Data analysis

4.        Experience integrating AI/ML models with data and reporting solutions

5.        Experience building data solutions on AWS and GCP platforms

6.        Experience working with other NoSQL databases (Cassandra, MongoDB, etc.)

7.        Experience in other BI and reporting tools – Qlik, Tableau, BOBJ, SSRS, Excel reporting etc.

8.        Experience developing machine learning models using Python (or R)

9.        Experience with Power Platform

