Job Description :

Data Engineer

Remote (open to remote; must work Central Time hours)

We have 2-3 positions. High-level description of the skillset needed:

  • Strong experience with Spark, Java, or Scala
  • Experience with Kafka
  • Experience with a cloud platform (GCP or Azure)
  • Experience with a NoSQL database (Cassandra or similar)

Main skillset: Spark and Java. The Data Engineer will build and run Spark jobs.

  • Familiar with Kafka (used for the target pipeline)
  • Any cloud: they use both GCP and Azure; AWS experience is also acceptable
  • Database: any NoSQL will be fine; Cassandra is what they use
  • 6-month engagement

Hybrid right now; open to remote within the US, but must work Central Time hours.

Project Description: EDF (event-driven fulfillment)

  • New features
  • Business initiative: a working product being expanded with new capabilities and new ventures

Background

  • Background does not matter, as long as they have the skillset

Interview Process: send candidates directly

  • 1 hour interview
  • Hands-on coding, Spark and Java technical questions, and a problem-solving question
  • Mostly walking through the candidate's thought process: how they arrive at a solution and the pros and cons of that solution
Requirement: Bachelor's degree or Associate in Applied Science degree specific to Computer Science, or a minimum of 5 years of related work experience, or an equivalent combination of education and work experience
