Job Description:

We are seeking an experienced Data Engineer with expertise in Python and in building data platforms with real-time streaming and batch ingestion and ETL pipelines, preferably on Google Cloud Platform. Our technology stack includes Google Cloud Dataflow, BigQuery, Pub/Sub, Kafka, Cassandra, Spring, Micronaut, OpenShift, and Kubernetes.

Skills Overview:
What are the top five skills and number of years of experience required to perform this job?
Five to ten years of relevant systems software engineering experience and a BS/MS in computer science or engineering (or equivalent professional experience).
Experience designing and building high-performance, highly resilient, and scalable distributed systems.
Experience building complex data pipelines using GCP Dataflow or other streaming frameworks.
Experience with service development, SQL and NoSQL databases, and Kafka or other queuing technologies.
Experience with TDD, Agile, pair programming, and aggressive refactoring.
The ability to prioritize effectively and communicate clearly, with strong attention to detail and a high level of commitment.

What are some preferred/nice-to-have skills the manager is looking for?
Any level of experience with programming languages like Groovy or Python is a plus.
Any level of experience with building real-time stream processing applications is a plus.
Any level of experience with Google Cloud Platform is a plus.
Any level of experience with NoSQL data stores (Redis, Cassandra, etc.) is a plus.
Knowledge of modern deployment tools such as Docker and OpenShift/Kubernetes.
The ability and aptitude to dig into and solve challenging problems, and to proactively avoid them.
Enthusiastic and excited about technology.
Open, honest, and willing to share and accept feedback and ideas.
Not afraid to roll up your sleeves and get your hands dirty with the rest of the team.
Striving to constantly improve yourself and your team.
