Data Engineer
Remote
If you share our values, you should have:
At least 9 years' experience in software engineering
At least 2 years' experience with Go
Proven experience (at least 2 years) building and maintaining data-intensive RESTful APIs
Experience with stream processing using Apache Kafka
Comfort with unit testing and test-driven development (TDD)
Familiarity with creating and maintaining containerized application deployments with a platform like Docker
A proven ability to build and maintain cloud-based infrastructure on a major cloud provider such as AWS, Azure, or Google Cloud Platform
Experience with data modeling for large-scale databases, whether relational or NoSQL
Bonus points for:
Experience with protocol buffers and gRPC
Experience with Google Cloud Platform, Apache Beam and/or Google Cloud Dataflow, and Google Kubernetes Engine or Kubernetes
Experience working with scientific datasets, or a background in the application of quantitative science to business problems
Bioinformatics experience, especially large-scale storage and data mining of variant data, variant annotation, and genotype-to-phenotype correlation