Job Description:
Mandatory Skills:
Java, Spring, Hadoop (Pig, Hive, MapReduce), Python, Scala

Software Engineering Specialist:
Responsible for designing, developing, modifying, debugging, and/or maintaining software systems. Serves as an expert on specific modules, applications, or technologies, and handles complex assignments throughout the software development life cycle.

What will your job look like?
You will take ownership of and accountability for specific modules within an application, providing technical support and guidance during solution design for new requirements and problem resolution for critical/complex issues.
You will ensure code is maintainable, scalable, and supportable.
You will present demos of the software products to stakeholders and internal/external customers, using knowledge of the product/solution and technologies to influence the direction and evolution of the product/solution.
You will investigate issues by reviewing and debugging code, provide fixes (analyzing and fixing bugs) and workarounds, review changes for operability to maintain existing software solutions, and highlight and help mitigate technical risks.
You will bring continuous improvements/efficiencies to the software or business processes by utilizing software engineering tools, various innovative techniques, and reuse of existing solutions. Through automation, you will reduce design complexity, shorten response time, and simplify the client/end-user experience.
You will represent/lead discussions related to the product/application/modules/team (for example, leading technical design reviews) and build relationships with internal customers/stakeholders.

All you need is:
Bachelor's degree in Science/IT/Computing or equivalent.
5-7+ years of total development experience, mainly with Java and related technologies in the Java stack (e.g., Spring).
3+ years of in-depth knowledge of and experience with the Hadoop ecosystem tools, such as MapReduce, Hive, and Pig.
Expertise in running Spark 2.0 applications developed in Scala or Python.
Good understanding of and experience with performance tuning in a Hadoop environment for complex software projects, mainly around large scale and low latency.
Strong experience in shell scripting and SQL-based data analysis.
Hadoop/Spark/Java certifications are an advantage.
Experience with AWS, Elasticsearch and Logstash is a plus.
Excellent written and verbal communication skills, to communicate with Development and Project Management leadership.
Excellent collaboration and teamwork skills, to work with the client and other third-party vendors.