Job Description:
Job Requirements:
- 5+ years of hands-on experience in data warehousing, data architecture, and/or data engineering environments at enterprise scale
- 5+ years of SQL development experience (e.g., MySQL, Amazon Redshift, Hive, Snowflake) is required; NoSQL experience is a major plus
- 4+ years of Python or Ruby programming experience
- 3+ years of working experience with MapReduce and other big-data processing frameworks
- Strong experience in custom or tool-based (e.g., Informatica, Talend, Pentaho, Fivetran) ETL design, implementation, and maintenance
- 2+ years of AWS experience (e.g., EMR, Lambda)
- Experience implementing and working with workflow schedulers such as Airflow, Luigi, and Oozie
- Strong experience writing complex SQL queries
- Experience implementing operational best practices such as monitoring, alerting, and metadata management
- Excellent written and verbal communication and interpersonal skills, with the ability to collaborate effectively with technical and business partners
- BS or MS degree in Computer Science or a related technical field
- Must be strong in Python, SQL, and data modeling