Job Description:
For one of our ongoing projects, we are looking for a Software Engineer - Hadoop & Big Data.

Qualifications:
6 to 8 years of experience in IT, including 4 to 5 years in Big Data, with strong knowledge of Hadoop, Spark, and Java.
2 to 3 years of experience developing RESTful services using Java, Spring, and Spring Boot.
Hands-on experience with tools such as Sqoop, Hive, HBase, Pig, and Impala.
Strong experience in data ingestion and Big Data.
Experience developing MapReduce jobs in Hadoop for a variety of data processing tasks.
Data integration using Apache Spark in any programming language (Java or Python preferred).
Experience in real-time data processing using Kafka (an illustrative sketch follows this list).
The candidate should have good communication skills and experience in client-facing roles.
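
To illustrate the kind of Spark and Kafka work this role involves, below is a minimal sketch (in Java) of a Spark Structured Streaming job that consumes events from a Kafka topic and lands them as Parquet. The broker address, topic name, and paths are hypothetical placeholders, not details from this posting, and the job assumes the Spark Kafka connector is on the classpath.

// Illustrative sketch only: broker, topic, and paths are placeholders.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.StreamingQueryException;
import java.util.concurrent.TimeoutException;

public class KafkaIngestionSketch {
    public static void main(String[] args) throws StreamingQueryException, TimeoutException {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-ingestion-sketch")
                .getOrCreate();

        // Read a stream of events from a Kafka topic (names are placeholders).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker-1:9092")
                .option("subscribe", "events-topic")
                .load()
                .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

        // Land the raw events as Parquet, with checkpointing for fault tolerance.
        StreamingQuery query = events.writeStream()
                .format("parquet")
                .option("path", "/data/landing/events")
                .option("checkpointLocation", "/data/checkpoints/events")
                .start();

        query.awaitTermination();
    }
}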

Responsibilities:
Develop complex code that implements business rules to load data into target systems using various Big Data components.
Recommend design alternatives for data ingestion from a variety of sources into different types of target databases.
Translate functional requirements into technical design.
Develop and maintain Big Data integration jobs.
Design data integration jobs.
Create Big Data jobs to migrate data from heterogeneous sources such as SQL Server, Oracle, and Teradata (see the sketch after this list).
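
As an illustration of the migration responsibility above, here is a minimal Spark sketch (in Java) that pulls a table from a relational source over JDBC and writes it into the Big Data platform as a partitioned Hive table. The JDBC URL, credentials, table and column names are hypothetical placeholders; Oracle or Teradata sources would differ only in the driver and URL.

// Illustrative sketch only: URL, credentials, and table/column names are placeholders.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class JdbcMigrationSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("jdbc-migration-sketch")
                .enableHiveSupport()
                .getOrCreate();

        // Pull the source table over JDBC (SQL Server shown as an example source).
        Dataset<Row> source = spark.read()
                .format("jdbc")
                .option("url", "jdbc:sqlserver://source-host:1433;databaseName=sales")
                .option("dbtable", "dbo.orders")
                .option("user", "etl_user")
                .option("password", "etl_password")
                .load();

        // Write the data into the target platform as a partitioned Hive table.
        source.write()
                .mode(SaveMode.Overwrite)
                .partitionBy("order_date")
                .format("parquet")
                .saveAsTable("staging.orders");
    }
}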