Job Description:

Job Title: Spark Scala Developer/Data Engineer
Location: Bloomfield, Connecticut
Duration: Long Term

Responsibilities:

The Spark Scala Engineer will be a key contributor to our Services practice and will have the following responsibilities:
  • Work with product owners to understand desired application capabilities and testing scenarios
  • Continuously improve software engineering practices
  • Work within and across Agile teams to design, develop, test, implement, and support technical solutions across a full stack of development tools and technologies
  • Write unit tests and automated acceptance tests
  • Work on a variety of development projects on a Spark Scala platform and provide architecture guidance
  • Work with business stakeholders and other SMEs to understand high-level business requirements
  • Work with the Solution Designers and contribute to the development of project plans by participating in the scoping and estimating of proposed projects
  • Apply technical understanding, business knowledge, and system knowledge when eliciting systems requirements for projects
  • Support other team members in translating requirements and use cases into test conditions and expected results for product, performance, user acceptance, and operational acceptance testing; participate in automation testing of developed systems/solutions
  • Adhere to existing processes/standards including the project development lifecycle, business technology architecture, risk and production capacity guidelines and escalate issues as required
  • Prioritize and manage your own workload to deliver quality results and meet timelines
  • Support a positive work environment that promotes service to the business, quality, innovation, and teamwork, and ensure timely communication of issues and points of interest

If you are looking for a new challenge and want to make a difference in the healthcare industry, this role is for you.

Requirements / Qualifications:

  • Strong Scala skills and hands-on experience writing Spark jobs; a minimal sketch of a typical job follows this list
  • Knowledge of the Spark framework and experience coding Scala on Databricks
  • Good SQL skills
  • Strong data modeling and data transformation expertise, including the performance of transformations
  • Nice to have: 2+ years of experience with cloud technologies; AWS experience is preferred
  • Nice to have: 2+ years with Test-Driven Development (TDD) and/or Continuous Integration/Continuous Deployment (CI/CD)
  • Nice to have: experience with build tools such as Maven and SBT
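
For context, here is a minimal sketch of the kind of Spark Scala job this role involves. The object name, input and output paths, and column names (member_id, claim_amount) are hypothetical, for illustration only:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ClaimsAggregation {
      def main(args: Array[String]): Unit = {
        // On a cluster, master and other settings come from the environment;
        // on Databricks the platform supplies a preconfigured session.
        val spark = SparkSession.builder()
          .appName("ClaimsAggregation")
          .getOrCreate()

        // Read raw claims data; the path and options are placeholders.
        val claims = spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("s3://example-bucket/claims/")

        // Typical transformation: total claim amount per member.
        val totals = claims
          .groupBy("member_id")
          .agg(sum("claim_amount").alias("total_claim_amount"))

        // Write results as Parquet for downstream consumption.
        totals.write.mode("overwrite").parquet("s3://example-bucket/claims-totals/")

        spark.stop()
      }
    }

On Databricks, code like this would typically run inside a notebook or a scheduled job, with the SparkSession provided by the platform.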

Skill Set       Years of Experience     Proficiency Level
Spark Scala     3

Skill Set       Years of Experience
Scala/Python
Spark
AWS
Databricks
