Job Description:
Qualifications:
Bachelor's degree or foreign equivalent from an accredited institution required. Three years of progressive experience in the specialty may be considered in lieu of each year of education.
At least 4 years of experience in Information Technology

Preferred:
At least 3 years of experience with the software development life cycle.
At least 3 years of experience in project life cycle activities on development and maintenance projects.
At least 2 years of experience with the Hadoop ecosystem, e.g. Hadoop, HBase, Hive, Scala, Spark, Sqoop, Flume, Kafka, and Python
Experience in design and architecture review is a plus.
Background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture
Strong understanding of, and hands-on programming/scripting experience with, UNIX shell, Perl, and JavaScript
Experience designing and implementing ETL/ELT frameworks for complex warehouses and data marts. Knowledge of large data sets, with experience in performance tuning and troubleshooting
Hands-on development mentality, with a willingness to troubleshoot and solve complex problems
High degree of competency in Python and Java programming
CI/CD exposure
Strong written and oral communication skills
Ability to work in a team within a diverse, multi-stakeholder environment
Strong analytical skills
Experience with, and desire to work in, a global delivery environment