Job Description:
Qualifications

Bachelor’s degree or foreign equivalent required. Three years of relevant work experience will also be considered in lieu of each year of education.

At least 4 years of design and development experience in Java/Core Java-related technologies.

At least 1 year of hands-on design and development experience with Big Data technologies: Hadoop, Pig, Hive, MapReduce.

Should be a strong communicator and able to work independently with minimal involvement from client SMEs.

Preferred Skills:

Background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture.

Must have strong programming knowledge of Core Java or Scala: Objects and Classes, Data Types, Arrays and String Operations, Operators, Control Flow Statements, Inheritance and Interfaces, Exception Handling, Serialization, Collections, and Reading and Writing Files.

Must have hands-on experience in designing, implementing, and building applications or solutions using Core Java/Scala.

Strong understanding of Hadoop fundamentals.

Strong understanding of RDBMS concepts; must have good knowledge of writing SQL and of interacting programmatically with RDBMS and NoSQL databases (HBase).

Strong understanding of file formats: Parquet and other Hadoop file formats.

Proficient with application build and continuous integration tools: Maven, SBT, Jenkins, SVN, Git.

Experience working in Agile environments and with the Rally tool is a plus.

Strong understanding of and hands-on programming/scripting experience with UNIX shell, Python, Perl, and JavaScript.

Should have worked on large data sets and have experience with performance tuning and troubleshooting.

Knowledge of JavaBeans, Annotations, Logging (log4j), and Generics is a plus.

Knowledge of Design Patterns (Java and/or GoF) is a plus.

Knowledge of Spark, Spark Streaming, Spark SQL, and Kafka is a plus.

Experience in the financial domain is preferred.

Experience in and a desire to work in a global delivery environment.