Job Description:

Job Summary:
The Senior Data Engineer plays a pivotal role in driving engineering innovation and supporting key business initiatives. Acting as a bridge between IT, business, analytics, and data science teams, this role is responsible for building scalable data pipelines, optimizing data environments, and ensuring data governance. The ideal candidate will bring strong technical expertise, excellent communication skills, and a passion for simplifying and modernizing data processes at scale.
Core Responsibilities:
Lead the design and development of scalable ETL solutions and data architecture in Snowflake.
Demonstrate strong thought leadership and execution in pursuit of modern data architecture principles and technology modernization.
Engage in architectural and deployment discussions to ensure solutions are designed for scale, security, and high availability in the cloud or on-premises.
Define and lead technology proof of concepts to ensure feasibility of new data and cloud technology solutions.
Support and enhance data architecture and define database schemas in Snowflake and other analytic environments.
Manage performance tuning of data processes and troubleshoot data processing issues.
Apply rapid prototyping and design processes to deliver solutions to the business quickly.
Partner with leadership, product, and compliance to ensure data structures meet evolving business and regulatory requirements.
Assist in ensuring data governance policies are deployed within company data warehouse systems in accordance with business standards and best practices.
Mentor, motivate, and coach team members on technical best practices (e.g., data modeling, database design, ETL design, job scheduling and monitoring) and inspire professional development.
Success Metrics:
Improved Snowflake query performance and pipeline runtimes.
Controlled Snowflake spend with predictable costs during peak usage periods.
Successful implementation of role-based access and security policies for sensitive data.
Clear documentation, standards, and reusable data products adopted across the data team.
Reduction in downstream data defects and reprocessing events.
Required Qualifications:
Education & Experience:
Bachelor's degree in Computer Science, Data Science, or a related field.
8+ years of experience as a Data Engineer, ETL Developer, or Data Warehouse DBA.
Experience in finance, operations, software, or supply chain environments.
Experience designing and implementing big data and analytics solutions in financial services is highly preferred.
Skills & Competencies:
Proficiency with SQL and Python.
Experience with cloud data tools and architecture (Snowflake, Azure Data Factory, Data Lake, Synapse).
Experience managing and optimizing cloud data warehouse costs.
Familiarity with analytic tools (Power BI, Tableau, Cognos).
Experience with traditional ETL tools (IBM DataStage, SSIS, Informatica).
Knowledge of distributed architectures (SOA, RESTful APIs, data integration).
Strong critical thinking, analytical, and problem-solving skills.
Highly organized and detail-oriented.
Ability to manage multiple tasks in a fast-paced environment.
Excellent interpersonal and communication skills.

We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.
