Job Description:

Kindly share resumes at or call on

Job Responsibilities


Translate complex business requirements into a scalable and highly efficient data platform.

Own several data pipelines end to end.

Work with business teams/engineers to define instrumentation and data requirements.

Apply strong analytical and problem-solving skills.

Work closely with Engineering and DevOps teams to analyze and resolve capacity, scaling, and performance issues.

Collaborate with multiple cross-functional teams on solutions that impact the business.

Lead projects independently and mentor other offshore engineers on the team.

Prototype ideas quickly using cutting-edge, next-generation technologies.

Create and maintain documentation.

Qualifications And Requirements

Bachelor’s or Master’s degree (or equivalent) in computer science or a related field, with a minimum of 8 years of directly related work experience.

Hands-on programming/scripting experience in Java, Scala, or Python.

Hands-on experience with big data technologies such as Hadoop, MapReduce, Pig, Hive, and Spark.

Experience with Unix/Linux and shell scripting.

Experience with cloud platforms such as AWS, Google Cloud, or Azure; Google Cloud preferred.

Knowledge of and experience working with various data sources, including unstructured data files, flat files, message queues, XML-based events, and databases.

Excellent verbal and written communication skills.

Lead design and code reviews.

Drive architecture discussions and propose solutions to system and product changes.

Motivated and interested in delivering results, especially in writing high-performance, reliable, and maintainable code.

Ability to adapt to new development environments, changing business requirements, and new systems is highly desired.

Good team player, able to work effectively across multiple teams on solutions with complex dependencies and requirements in a fast-paced environment.
