Job Description:

KEY SKILLS: 

  • Strong experience (at least 3 years) with Databricks, Hadoop, Data Factory, Data Fusion, Kafka, and Python
  • Experience with BigQuery and the Snowflake platform.
  • Strong data modeling experience
  • Azure, Snowflake, and AWS certifications are preferred

Responsibilities:

• Determines strategy for and leads projects to build and maintain batch and event-driven ETL/ELT solutions, creating and maintaining data flows and data catalogs for optimal ingestion, transformation, and loading of data from a wide variety of data sources, at times guiding other team members to accomplish project objectives.

• Responsible for the design and implementation of scalable, complex, cloud-native data ingestion frameworks and API integrations for consuming real-time, high-volume streaming data, leveraging big-data technologies on various cloud platforms.

• Leads projects to design, implement and maintain enterprise data lake, warehouse and BI solutions leveraging technologies like SQL, Hadoop, and MPP data platforms.

• Evaluates data threat models (both cloud and on-premises) and security postures, ensures all security considerations are accounted for (including access controls, encryption, masking, and classification), and coaches others on the team on data security best practices.

• Leads the design and implementation of advanced analytics solutions, including tools for analytics, data science and AI/Machine Learning, working closely with stakeholders at senior levels in the company to help them optimize their data and BI strategies.

• Guides software development team members during design and development, partnering on the data architecture to enable the application's success.

• Drives continuous improvement in team operations and delivery by designing and supporting CI/CD processes, designing and implementing methods to improve data reliability and quality, and preparing data for prescriptive modeling. Identifies and implements industry best practices for the team to follow.

Qualifications:

• The team member should be customer centric, self-driven and comfortable in a fast-paced environment with a large and diverse technical portfolio.

• The Engineer is a recognized subject matter expert in data modeling and building analytical applications on on-premises and cloud platforms, with expertise in integrating and migrating data between infrastructures of varying types.

• Expertise in both cloud and on-premises infrastructure components, with the knowledge and ability to achieve optimal efficiency, performance, flexibility, and cost effectiveness.

• Practical, hands-on expertise with major big-data cloud platforms such as Azure, GCP, and AWS.

• Strong experience in building data pipelines leveraging technologies such as ADF, Databricks, Data Fusion, Kafka, and Event Hubs.

• Advanced expertise in SQL and strong skills in two or more programming languages, such as Python or Java.

• Demonstrated expertise in building modern data lake solutions across multiple providers, such as Azure ADLS, Google Cloud Storage, Hadoop/HDFS, and S3.

• Proven expertise in data/dimensional modeling and segmentation on MPP platforms (Snowflake, BigQuery) as well as traditional relational databases.

• Strong written and verbal communication skills, with demonstrated experience partnering with executive level business customers and guiding the work of others.

• Ability to identify, capture and influence user/business needs and translate them into technical requirements.

• Expertise in automated test/validation solutions.

• Accountable for creating appropriate documentation such as data catalogs, data flows, and data architecture diagrams.

Experience:

• Bachelor’s degree in computer science, software/computer engineering, applied mathematics, physics, or statistics, or equivalent experience

• 10+ years of prior relevant experience

• Work samples demonstrating the ability to solve complex data challenges in novel and innovative ways, balance multiple considerations to drive business value, and lead organizational change.

• Certifications: Data Engineering certification in Azure and Snowflake (AWS or GCP good to have)
