Job Description:
Position: Data Engineer (AWS & GCP)
Location: Palo Alto, CA
Duration: Long-term

RESPONSIBILITIES
Design, build, and maintain data pipelines in multi-cloud infrastructure (AWS and GCP)
Design and develop big data processing systems optimized for scale (Apache Spark)
Develop and maintain real-time data pipelines
Build software libraries, tools, serverless applications, and workflows (Java and Python)
Design mission-critical dashboards and reports using BI tools
Drive internal process improvements, such as automating manual processes, building alerting systems, and improving tooling and DevOps practices
Collaborate closely with product teams to build tools, frameworks, and reports to run experiments and analyze A/B test results
Work with analysts and data scientists to extract actionable insights from data that shape the direction of the company
Actively engage in design and code reviews - learn from your peers and teach your peers
Lead initiatives to research, analyze, and propose new technologies and tooling for our stack
REQUIREMENTS
5+ years of software development or data engineering experience
Experience with Big Data, ETL, and data modeling
Experience in developing and operating high-volume, high-availability environments
Previous experience with Linux, AWS, Docker and Kubernetes
Strong coding skills in Java, Python, and Bash