Job Description :
Hi ,

Hope you are doing great. Please go through the requirement below; if you are interested, please share your updated resume with contact details.

Role : Big Data Engineer

Location : Seattle, WA

Duration : 6+ months

Visa: Any

Interview Mode: Phone and Face to Face

Note: Need a consultant with strong AWS data migration experience using Terraform, Spark, and Kafka. We are looking for someone who can be part of a lift-and-shift AWS migration project moving from a legacy system to a new production system.

What you'll need:
5+ years of software engineering experience with Scala.
Will also consider Java, C# or Python experience.
Loves to write code and is a fast learner of new technologies
An understanding of high-performance, scalable distributed systems (e.g., Apache Spark, Apache Mesos, Kubernetes, Hadoop/YARN)
Comfortable in a highly collaborative agile working environment

Experience managing and deploying cloud infrastructure
History of contributing to open source projects
Experience creating APIs in Scala using the Play framework
Experience working with and orchestrating containers (Docker)
Experience with Kafka, Elasticsearch, Logstash, Kibana, and Flume

The Senior Software Engineer will deliver enterprise data solutions using the latest technologies. They will grow our data platform by establishing new data pipelines and will reveal key insights to our business stakeholders.
Who You Are:
You have a passion for the world of data. You are an avid technologist who is fascinated with distributed computing, machine learning, and data mining techniques. You leverage the latest tools to create innovative and cutting-edge data solutions. You strive to understand your customers' needs and pride yourself on your speed of delivery. You flourish in a team environment, collaborating with other engineers and product development to understand business problems and drive technology solutions.
Your Next Challenge:
Deliver enterprise data solutions using the latest technologies
Grow the data platform by establishing new data pipelines and reveal key insights to our business stakeholders
Focus on delivery with speed and quality
Assess the impact of your features to iterate and improve