Job Description:
RQ00371 - Task-Based I&IT Consultant Senior
Toronto, ON
Start Date: 2021-02-15
End Date: 2021-09-30
Assignment Type: Remote
Security Level: No Clearance Required
# Business Days: 188.00

Description

MSP Notes:
Shortlisting Date: Monday, January 25th at 2:00 pm EST
Maximum Number of Candidate Submissions: 2 (Two)

Must-Haves:
- 5+ years of demonstrated knowledge of master data management methods in real-world implementations
- Demonstrated experience automating data pipelines using Azure and/or other technologies (Python, Databricks, Azure Storage, Azure Data Lake, Azure SQL DB)

Note on Assignment Type: This position is currently listed as "Onsite" due to COVID-19-related work-from-home direction. Once OPS staff are required to return to the office, the resource under this request will be required to work onsite as well.

Responsibilities:
The consultant will design, implement, and perform the required knowledge transfer for the deliverables noted below, as part of the MLTSD360 Self-Serve Analytics and Golden Record work approved and budgeted under the IT Investment Plan. Specific deliverables include bringing the required data into the cloud environment, making the required connections between the data sets, and making the data available in an easy-to-consume format.
Details of these deliverables are below:

Data Ingestion:
- Design and implement raw data storage and an ingestion mechanism
- Build data pipelines to ingest raw transactional data from the source system

Data Mapping:
- Design and implement business rules to create an official Golden Record

Data Pipelines and Semantic Model:
- To be performed in logical iterations
- Data pipelines and the semantic model will be based on set requirements
- Augment data pipelines to transform and move raw data into the semantic model
- Semantic modelling will determine how data structures will be made available, combined, processed, pre-calculated, and stored

Knowledge Transfer:
- Knowledge transfer sessions and documentation for technical staff related to designing and implementing the above end-to-end analytics solutions

Skills

Experience and Skill Set Requirements:

Master Data Management - 35%
- The candidate must demonstrate experience with master data management methods in real-world implementations

Data Pipelines - 25%
- The candidate must demonstrate experience automating data pipelines using appropriate Microsoft Azure platform technologies (Python, Databricks, Azure Storage, Azure Data Lake, Azure SQL DB)

Data Transformations - 25%
- The candidate must demonstrate experience with complex data transformations

Knowledge Transfer - 15%
- The candidate must demonstrate experience conducting knowledge transfer sessions and building documentation for technical staff related to architecting, designing, and implementing end-to-end analytics solutions
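The Golden Record deliverable above is, at its core, a survivorship rule applied to duplicate records matched across source systems. A minimal sketch of one common rule (most-recent-wins with field coalescing) is shown below in plain Python; the `ClientRecord` fields, the matching key, and the rule itself are illustrative assumptions, not details from this posting, and a real implementation would run in Databricks over the ingested raw data.

```python
from dataclasses import dataclass
from datetime import date
from itertools import groupby
from typing import List, Optional

# Hypothetical record shape; field names are illustrative only.
@dataclass
class ClientRecord:
    client_id: str            # matching key shared across source systems
    name: Optional[str]
    email: Optional[str]
    updated: date             # recency drives survivorship

def golden_record(records: List[ClientRecord]) -> ClientRecord:
    """Merge duplicates for one client: newest non-null value wins per field."""
    ordered = sorted(records, key=lambda r: r.updated, reverse=True)
    merged = ClientRecord(ordered[0].client_id, None, None, ordered[0].updated)
    for r in ordered:  # walk newest-to-oldest, filling gaps from older records
        merged.name = merged.name or r.name
        merged.email = merged.email or r.email
    return merged

def build_golden_records(raw: List[ClientRecord]) -> List[ClientRecord]:
    """Group raw records by matching key and emit one golden record per key."""
    keyed = sorted(raw, key=lambda r: r.client_id)
    return [golden_record(list(g)) for _, g in groupby(keyed, key=lambda r: r.client_id)]
```

In a production pipeline the same logic would typically be expressed as a window-function ranking or a rules-driven merge inside the transformation layer, but the survivorship idea is unchanged.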