Job Description:

Position: Data Engineer

Location: 100% REMOTE

Duration: 12+ months

Role Description

As a data engineer, you will bring software engineering best practices to the production and maintenance of analytics code, and an engineering mindset to discussions about how data is modeled from its source to its use in the data warehouse as business and reporting data. You will be responsible for designing and implementing new AWS-based data solutions: new data processing, datasets, and systems to support various advanced analytics needs. This involves working with the existing engineering team, data scientists, analysts, and the business to understand requirements, data needs, and definitions, while thinking creatively about how data can best be exploited to solve a wide array of business problems. You will create data flows that integrate multiple internal and external sources, including streaming, APIs, database connections, and flat files. You will liaise with members of the wider Data & Analytics teams and business teams to ensure alignment with existing systems and consistency with internal standards and best practices. If you thrive in a fast-paced environment and want to help build a best-in-class data platform practice from the ground up, then this is the role for you.

Tasks and Responsibilities:

  • Capture business requirements for analytics and translate complex ones into technical requirements. Collaborate with teams to design and implement end-to-end solutions.
  • Design and build well-engineered data systems and services to support data analytics using AWS cloud services and Snowflake DWH.
  • Implement data pipelines and automate ELT workflows using modern orchestration tools.
  • Own the data model and test the data produced to ensure it is of high quality.
  • Take part in discussions with product managers and analysts to guide their understanding of the data in the data lake, help shape product solutions, and better grasp the context of incoming requirements.
  • Use SQL to transform data in our data lake, moving it from raw records into reliable business entities and then into reporting aggregates (a brief sketch follows this list). Identify dependencies between these transformations, schedule them on our platform, and investigate data discrepancies.
  • Ensure the accuracy of data processing and outputs through consistently strong software development skills, adherence to best practice, thorough testing, and peer reviews.
  • Provide production support for data warehouse issues such as data load failures and transformation discrepancies.
  • Lead refactoring of our data warehouse where needed, to make data more consistent and better documented and pipelines more resource-efficient.
  • Document analytics datasets and any associated business logic.
  • Habitually approach problem solving with creativity and resourcefulness; carefully evaluate risks and determine correct courses of action when completing tasks.
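
As a rough illustration of the transformation work described in the list above, the sketch below promotes raw data into a reporting aggregate on Snowflake using the snowflake-connector-python library. The table names, aggregate logic, and connection parameters are hypothetical placeholders, not details of our platform.

    # Minimal sketch: promote raw data to a reporting aggregate in Snowflake.
    # Table names and connection parameters are hypothetical placeholders.
    import snowflake.connector

    RAW_TO_REPORTING_SQL = """
    CREATE OR REPLACE TABLE analytics.orders_daily AS
    SELECT
        CAST(order_ts AS DATE) AS order_date,
        customer_id,
        COUNT(*)         AS order_count,
        SUM(order_total) AS revenue
    FROM raw.orders
    WHERE order_ts IS NOT NULL  -- basic quality gate on the raw layer
    GROUP BY 1, 2;
    """

    def run_transformation() -> None:
        # In practice, credentials come from a secrets manager, not literals.
        conn = snowflake.connector.connect(
            account="example_account",    # hypothetical
            user="example_user",          # hypothetical
            password="example_password",  # hypothetical
            warehouse="ANALYTICS_WH",     # hypothetical
            database="ANALYTICS_DB",      # hypothetical
        )
        try:
            conn.cursor().execute(RAW_TO_REPORTING_SQL)
        finally:
            conn.close()

    if __name__ == "__main__":
        run_transformation()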

Experience and Skill Requirements:

  • Bachelor’s degree in Computer Science, Engineering, Math, or a closely related discipline.
  • 5+ years of demonstrable, hands-on professional software development experience using Java or Python.
  • Demonstrable professional experience designing, building, and maintaining data systems and processes on cloud platforms such as GCP and AWS, including working knowledge of Unix/Linux, shell scripting, and related tools.
  • Solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on a cloud data warehouse such as Snowflake or Google BigQuery is a must.
  • Expertise in using AWS cloud-based systems and services to acquire and deliver data.
  • Excellent SQL knowledge and hands-on experience, with the ability to create efficient data models (particularly as applied to data warehousing).
  • Experienced with ELT processes to transform data, and with setting up and scheduling jobs using dbt, Python, Airflow, and cron (a brief sketch follows this list).
  • Experienced in building web services with REST, GraphQL, or gRPC.
  • Experienced in CI/CD practices, including unit testing, automated testing, data migration, code quality, performance, and integration/system testing.
  • Demonstrated willingness and ability to work effectively with various team members when gathering requirements, delivering solutions, and eliciting suggestions and feedback.
  • An extremely quick learner, both in acquiring new technical skills and domain knowledge.
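
In the same illustrative spirit, the ELT scheduling mentioned in the list above might be sketched as a minimal Airflow DAG that runs dbt on a cron schedule. The DAG id, schedule, and dbt project path are assumptions for the example only, not our actual configuration.

    # Minimal Airflow sketch: run dbt models nightly, then test them.
    # The DAG id, schedule, and project path are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="nightly_dbt_run",       # hypothetical
        start_date=datetime(2024, 1, 1),
        schedule_interval="0 2 * * *",  # 02:00 daily, standard cron syntax
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/analytics && dbt run",   # assumed path
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt/analytics && dbt test",  # assumed path
        )
        dbt_run >> dbt_test  # build the models before testing them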


**Excellent communication is a MUST**

========================================================

eDataForce consulting LLC is an Equal Opportunity Employer
