Job Description:
As part of the International eCommerce Data Engineering team, you'll be responsible for the design, development, and operation of large-scale data systems operating at petabyte scale. You will focus on real-time data pipelines, streaming analytics, distributed big data, and machine learning infrastructure. You'll interact with engineers, product managers, BI developers, and architects to provide scalable, robust technical solutions.
Essential Job Functions & Responsibilities:
Minimum of 6-8 years of big data development experience
Demonstrated up-to-date expertise in data engineering and complex data pipeline development
Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built
Experience with Java and Python for writing data pipelines and data processing layers
Experience writing MapReduce jobs
Demonstrated expertise in writing complex, highly optimized queries across large data sets
Proven, working expertise with big data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase
Highly proficient in SQL
Experience with cloud technologies (GCP, Azure)
Experience with relational, NoSQL, and in-memory data stores is a big plus (Oracle, Cassandra, Druid)
Provides and supports the implementation and operation of data pipelines and analytical solutions
Performance-tuning experience with systems that work with large data sets
Experience with clickstream data processing
Experience with metadata management tools such as MITI and monitoring tools such as Ambari
Experience developing REST API data services
Retail experience is a huge plus