Job Description :

As part of the Data Management area, the Data Engineer will provide leadership in the conceptualization and design of innovative solutions for data acquisition, movement, security, lineage, and data governance. They will be responsible for the design and deployment of data architecture frameworks and data pipelines that ensure the integration of governed data across the enterprise, which will act as the foundational layer for Edward Jones analytics for years to come. The Data Engineer works with business analysts, software engineers, data analytics teams, data scientists, and data warehouse architects to enable enterprise analytic capabilities by implementing data warehouse/analytic hub solutions. This position requires excellent communication skills and the ability to engage and present to varying levels of leadership.

  • The Data Engineer will focus on the design of the Analytic Hub architecture and data pipelines based on business need and technology efficiency.
  • Leads projects and small teams of ETL developers/engineers on complex efforts
  • Assists in troubleshooting complex issues/efforts with other team members or cross divisional teams, leveraging seasoned judgment, technical and business acumen.
  • Can easily assume team leadership responsibilities.
  • Ability to quickly establish themselves as a credible resource for understanding new technologies and architectures related to data management solutions.
  • Work closely with QA, business/product analysts and customers to design and implement new feature requests
  • Ensures standards and directional adherence across the various technical domains.
  • Work with internal and external technology experts to ascertain system functional capacity, constraints, and support lifecycles. Evaluate and make recommendations to ensure technology investments are optimized.
  • Proactively identify unnecessary complexity and potential failure points, assist with creating plans to reduce or eliminate where appropriate.
  • Build and maintain strong relationships with technology teams and business partners to facilitate improved alignment of technology initiatives to business strategies
  • Design and build Analytic Hub and BI solutions using technologies such as Snowflake, Oracle, SQL Server, RDS PostgreSQL, ETL/ELT tools, BI servers, report development tools, and data event streaming pipelines

Qualifications Required in the Job:

  • Advanced working SQL/PL/SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of database structures (Hadoop, MongoDB, and dimensionally designed solutions and structures)
  • Experience building and optimizing data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets and recommending transformations to optimal architectures.
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
  • Experience supporting and working with cross-functional teams in a dynamic environment with an agile mindset: agile methodology, sprint planning, writing user stories, and delivering data pipelines iteratively.
  • Candidates should also have 5+ years of experience using the following software/tools:
    • Cloud-based data storage and data pipeline solutions
    • API development (specifically connecting to SaaS solutions)
    • Hadoop, Spark, Kafka, Linux scripting, etc.
    • AWS/Azure cloud services and platforms
    • Snowflake data platform
    • Matillion, Databricks, or equivalent
    • Object-oriented/object function scripting languages: Python, Java, etc.
  • A degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
  • Data modeling experience using either SAP Power Designer or Erwin tools.
  • Experience with the Snowflake cloud-based data warehousing platform.
  • Experience using agile development methodology and working within a scrum team.

Required Skills :ETL, SQL, cloud, Informatica, Python
Basic Qualification :
Additional Skills :
Background Check :Yes
Drug Screen :Yes
Notes :
Selling points for candidate :
Project Verification Info :
Candidate must be your W2 Employee :Yes
Exclusive to Apex :No
Face to face interview required :No
Candidate must be local :No
Candidate must be authorized to work without sponsorship :No
Interview times set :No
Type of project :Development/Engineering
Master Job Title :ETL Developer
Branch Code :St. Louis
             
