Job Description:

We are looking for an experienced senior data engineer with strong software engineering skills to help us build critical data pipelines at massive scale, architect and build our big data warehouse and data mart systems, and develop models that support real-time or near real-time applications as well as faster operational and historical analytics.

Responsibilities


Participate in the full development life cycle of Data Warehouse and Data Mart systems
Design and implement ETL frameworks for Data Warehouse systems using technologies such as Hive, Spark, and Java
Design, build, and launch scalable, highly efficient, and reliable data pipelines to move data of both large and small volumes from diverse sources into and out of Data Warehouse and Data Mart systems
Design and implement the data warehouse query engine, and implement our proprietary, cutting-edge Hadoop-based data warehouse systems
Gather and document data mart reporting and other requirements to meet business needs
Define and promote best practices and design principles for data warehousing techniques and architecture, while improving data organization and processing accuracy through a data governance framework
Monitor and troubleshoot performance issues on data warehouse and data mart servers

Desired Skills & Experience


5+ years of experience in data warehouse development and architecture; 4+ years of experience in Big Data, Data Warehouse, or large-scale cloud systems
5+ years of ETL development, operations, and reporting from multiple sources, using appropriate tools
5+ years of advanced SQL skills, including stored procedures, functions, indexes, and views
A track record of crafting, implementing and delivering scalable, performant data pipelines and data services
Expert level software development experience using Java and/or Python
Expertise with dimensional warehouse data models (star, snowflake schemas)
Deep knowledge of and hands-on experience with relational databases (e.g., Greenplum, Oracle, MySQL, PostgreSQL) and database schema design
Extensive experience with Hadoop, MapReduce/YARN, Hive, and HBase
Knowledge of automation and orchestration platforms such as Airflow
BS/MS Degree in Computer Science or related field
Excellent interpersonal and teamwork skills; experience working with overseas teams is a plus

Candidate Details: Rate is flexible depending on experience; please submit competitively.