Job Description:
The Role:
The SBG Data Analytics & Science team is looking for a seasoned Data Engineer/Technical Analyst with AWS experience to wear two hats: developing data ingestion & processing pipelines, and building automated frameworks for development, deployment & testing for analysts.
As a Data Engineer/Technical Analyst, you’ll work with analysts on their raw-data processing needs: understand their requirements, then build and maintain the processing pipelines.
You’ll also help analysts migrate to AWS seamlessly by working alongside them and maintaining an inventory of analytics jobs, scripts and queries to ensure a smooth migration.
You’ll also help build frameworks and utility tools that make analysts’ lives easier in AWS.
The ability to not only code and configure tools, but also assemble & integrate technology across different platforms, is critical to success.
This includes a deep understanding of AWS services, scripting and enterprise-level languages, as well as open-source tools that can and should be leveraged to solve problems.
You’ll work in a fast-paced environment where there aren’t always clear specifications or rules about how something should be done, and you’ll be responsible for figuring things out and keeping things moving.

Primary Responsibilities:
Develop and maintain existing ingestion & processing pipelines that process raw data into shareable data layers for wider analyst consumption
Work with analysts and our larger data engineering group to ensure smooth migration of data, analytics tools and our jobs and scripts to AWS
Design and build automated code development & deployment systems that simplify analyst work and make our work more consistent and predictable, e.g., maintaining GitHub repositories and using Jenkins to automate code deployment in AWS
Analyze data-related systems-integration opportunities and challenges, validate data models from a data-integration perspective for current and other impacted projects, and drive quality improvements
Help build proofs of concept for tools & technologies that can ease analysts’ work
Train and guide the larger analytics community on best practices and the usage of tools & technologies
Communicate progress frequently; escalate critical issues that block progress and resolve them in a timely manner

Qualifications:
Experience working with the AWS platform and tools such as EMR, S3, AWS Data Pipeline, etc.
Advanced knowledge of SQL, Python or other scripting languages, and shell scripting
Knowledge of Big Data and related technologies: HDFS, Hive, etc. (Spark, Presto, etc. would be a plus!)
Understanding of Big Data tech stack and architecture
Great product/project management skills
Great communication skills – both at a technical level and with leadership
Strong negotiation skills
The right attitude to get things done