Job Description :

Overview: The project began in January 2022, shifting from waterfall to agile methodology. There is no production release yet; this role will assist a current employee who wears five different hats. The product will be released in a couple of months, at which point the team structure will be reassigned. This role sits on the cloud engineering (Azure DevOps) side rather than the reporting side, and will focus on migration and building out the cloud environment. Senior-level experience would be a plus, but the team is open to all levels.

JOB SUMMARY: The Data Engineer I is responsible for designing, developing, implementing, and supporting data warehouse and cloud storage solutions that support company analytics.

Must Haves:

  • Azure DevOps
  • Databricks
  • Data Factory

Nice to Have:

KEY SELECTION/CRITERIA - Minimum qualifications include:

  • Bachelor's degree or equivalent work experience in Computer Science, Management Information Systems (MIS), Information Technology (IT), or related field
  • 2+ years of experience in Structured Query Language (SQL) programming; experience with other programming languages preferred
  • 2+ years of experience with cloud services (Azure preferred; AWS or Google Cloud also considered)
  • Familiarity with business intelligence tools such as Power BI, Tableau, MicroStrategy, Business Objects, DAX, and Power Query preferred
  • Strong analytical skills, detail oriented, and organized
  • Strong communication skills and self-motivated


  • Consult with business counterparts to understand new data requirements. Design data models to support the requirements.
  • Perform data profiling and source system analysis, presenting insights to peers and business partners to support the end use of data
  • Collaborate with senior engineers and architects to ensure data models fit within the company data and systems architecture.
  • Develop, test, and implement Extract, Transform, and Load (ETL) processes to acquire and load data from internal and external sources into the data lake or data warehouse for analytical purposes.
  • Design, build and test data products based on feeds from multiple systems using a range of different storage technologies and/or access methods
  • Monitor and support ETL jobs. Research poorly performing ETL jobs and collaborate with database administrators and other resources to improve the overall efficiency and stability of the data warehouse environment.
  • Support and partner with business analytics users by identifying relevant data and delivering views, cubes, models, and other semantic objects to ensure ease of access to data for non-technical individuals.
  • Deliver data solutions in accordance with agreed organizational standards that ensure services are resilient, scalable and future-proof
  • Provide technical and project documentation, utilizing agile project management methodologies.


  • Actively pursue personal continuous learning and the development of skills and knowledge in job-related technical and professional areas
  • Support corporate efforts for safety and government compliance
  • Support and follow all corporate policies and procedures
  • Perform other related duties as required and assigned
  • Promote the values of a diverse workforce

Required Skills : Azure DevOps, Databricks, Data Factory
Basic Qualification :
Additional Skills :
Background Check :Yes
Notes :This role is remote.
Selling points for candidate :
Project Verification Info :
Candidate must be your W2 Employee :No
Exclusive to Apex :No
Face to face interview required :No
Candidate must be local :No
Candidate must be authorized to work without sponsorship :No
Interview times set :No
Type of project :Migration
Master Job Title :Data Scientist
Branch Code :Milwaukee