Job Description:
Prior experience working in the engineering division of a product company or start-up is a must.
Master's degree in Computer Science, or a bachelor's degree with significant relevant experience, is required.
Excellent background in computer science fundamentals, data structures, and algorithms
Strong background in building backend or infrastructure software using languages such as Python, Core Java, or Ruby
Good knowledge of concurrency and multi-threading concepts
Essential skills:
1. 5+ years of programming experience in Python, Core Java, or Ruby
2. Experience with the Apache Hadoop ecosystem, along with deep knowledge of how Hadoop internals work
Secondary skills:
1. 2-3 years of SQL experience (Oracle, Hive, etc.); NoSQL experience is a plus
2. Experience building S3/EMR/Glue-based data ingestion pipelines on AWS is a plus