Job Description:
We are looking for a Big Data DevOps Engineer who will be responsible for managing the big data infrastructure that supports our proprietary trading systems, risk management systems, and portfolio management reporting applications. The DevOps Engineer will build enterprise monitoring solutions, implement configuration management and runbook automation, standardize processes, and manage big data applications and their underlying infrastructructure, driving scalability and stability improvements early in the development lifecycle while ensuring operational best practices are followed. The ideal candidate is an energetic self-starter and team player with a passion for software engineering and building automation, who works well under pressure and has strong communication and technical skills.

POSITION REQUIREMENTS

Minimum Bachelor’s degree in computer science or a related field.
Minimum of 3 years’ experience with Big Data technologies, with expertise in HDFS, YARN, Spark, Hive/Impala, Kafka, and Oozie.
Minimum of 4 years of combined DevOps or development and operations experience.
Hands-on experience managing distributed systems and clusters.
Expertise in scripting languages such as Python.
Understanding of C++/Java and microservice/SOA architecture.
Data mining and machine learning experience with excellent quantitative analytics skills is a bonus.
Excellent communication skills; strong problem-solving, analytical, and time-management skills.
Experience analyzing and resolving performance, scalability and reliability issues.