Job Description:
Key skills:
Expertise in coding in Python and SQL
Experience scripting in Spark, Airflow, etc.
AWS Redshift experience is desired
Tableau nice to have
Understanding of open-source tools / Kafka nice to have

Day-to-day
Design and build robust data pipelines using Spark, Airflow, Python, and SQL
Design data warehouses/data marts in AWS Redshift and other databases as appropriate
Apply optimization techniques to data loading and query processing
Build and validate audit, balance, and control processes for mission-critical data pipelines
Develop visualizations using Tableau and other open-source visualization tools as needed
Identify the best data sources among multiple candidates for data pipelines to improve trust in the data
Fix bugs and work collaboratively with team members

What you bring
Master's degree or equivalent in CS/Engineering or another comparable discipline
At least 6 years of technical experience with strong data warehouse and data modeling skills
Very strong skills in Python, SQL, Spark, Redshift, Airflow, and AWS
Familiarity with Agile methods (we use agile tools)
Experience with reporting tools like Tableau is a plus
Team player: agile, highly accountable, curious, and willing to learn, implement, and teach
Ability to juggle multiple responsibilities and deliver to timelines
Experience in the consumer lending industry highly desired

Bonus
Experience with open-source tools such as Kafka is a plus
Experience in any JVM-based language