Job Description :

Keys are:
- Azure Data Factory
- Azure Databricks
- PySpark & SparkSQL

Please DO NOT send candidates who do not have all of the required experience. The interview panel consists of three people, each an expert in one of PySpark, Databricks, or ADF, and they will grill candidates on these areas.

Client: Retail Business Services (Ahold)
Location: 100% Remote
Start Date: 2-3 weeks from offer
Duration: 6 Months+
- They will consider candidates with only a few years' experience at a lower rate, but those candidates still need PySpark, Databricks, and ADF experience.
Interview Process: Apex tech screening followed by a single ("one and done") interview with the client team.


Position Summary
Plays a critical role in the design, development, and roll-out of the Data & Analytics Platform, and partners with team members to develop the data consumption tools for the platform. Engages through the entire lifecycle of all projects within the Business Intelligence & Analytics department, from data modeling to data engineering and finally data consumption. Becomes a subject matter expert on the Data & Analytics Platform. Delivers reporting and analysis to the enterprise.
Principal Duties and Responsibilities
Develops data engineering processes in Azure using components such as Azure Data Factory, Azure Data Lake Analytics, HDInsight, and Databricks.
Works with business users to design, develop, test, and implement business intelligence solutions in the Data & Analytics Platform.
Resolves moderate to complex application errors, designs solutions to application problems, and follows up in a timely manner with all appropriate customers and IT personnel.
Documents all phases of work, including requirements gathering, entity relationship diagrams, database diagrams, report layouts, and other program technical specifications, using currently specified design standards for new or revised solutions.
Assists in assessing alternative software solutions for workability and technical feasibility.
Relates information from various sources to draw logical conclusions.
Assists in identifying the impact of proposed application development/enhancements projects.
Conducts data analysis (SQL, Excel, Data Discovery, etc.) on legacy systems and new data sources.
Performs data profiling and process analysis to understand key source systems and uses knowledge of application features and functions to assess scope and impact of business needs.
Reviews and validates QA test plans and supports QA team during test execution.
Participates in designing data transformations and data pipelines.
Ensures change control and change management procedures are followed within the program/project as they relate to requirements.
Documents reporting requirements through engagement with business process SMEs.
Creates source to target data mappings for data pipelines and integration activities.
Designs and develops reports and dashboards to support decision-making within the business and IT.
Conducts unit testing on ELT and report development.
Conducts data lineage and impact analysis as a part of the change management process.
Delivers SDLC Deliverables as defined.
Contributes to Business Case Development through information gathering and analysis.
Basic Qualification
Bachelor's degree in Computer Science, Information Systems, Business Administration, or other related field.
Minimum of 7 years of IT experience encompassing the following:
- Working experience with the SDLC and the deliverables associated with each phase
- Experience in data modeling and advanced SQL techniques
- Experience engineering data pipelines using the latest technologies and techniques
- Experience in data visualization best practices
- Experience working with industry-leading Business Intelligence tools
- Experience working with industry-leading database technologies
- Experience in Azure cloud technologies
- Experience in Big Data / Hadoop
Technical Competencies
Develop Business Intelligence applications, dashboards, and reports in MicroStrategy and Power BI.
Develop complex SQL/U-SQL/PySpark code for data engineering pipelines in Azure Data Lake Analytics and Azure Data Factory.
Develop complex data science algorithms using the R programming language and operationalize them for production use.
Develop and implement big data applications using Hadoop components (HDInsight on Microsoft Azure).
Create requirements documents and review the functional design documents created by vendor partners as part of the software development life cycle.
Work closely with business teams at every stage, from requirements gathering to project closure.
Work with the key stakeholders to create requirements for each business solution, ensure the requirements are developed in concert with and agreed upon by the business partner, and that the solution delivers the agreed upon business needs.


Required Skills :Azure Data Factory, Azure Databricks, PySpark & SparkSQL
Basic Qualification :
Additional Skills :
Background Check :Yes
Notes :
Selling points for candidate :
Project Verification Info :Apex Letter Client Letter Exhibit A
Candidate must be your W2 Employee :No
Exclusive to Apex :No
Face to face interview required :No
Candidate must be local :No
Candidate must be authorized to work without sponsorship :No
Interview times set :No
Type of project :Architecture/Design
Master Job Title :Data Analyst
Branch Code :Philadelphia
             
