Job Description:

Role: Senior Big Data Engineer
Work location: Texas / New York / Boston, MA
Contract duration: 12 months

Client: Capgemini

Performs research, design, implementation, and support tasks as a member of the team. Works in accordance with project guidelines, quality standards, and code conventions.
• Responsible for one or more areas within the team's area of responsibility (AOR). One of the team's current AORs is improving the Big Data Platform used by one of the world's largest social media platforms, which ingests several petabytes of data daily.
• Investigates, creates, and implements solutions for existing technical challenges, including building and enhancing the frameworks and tools used by other development teams.

• Obtains tasks from the Project Lead or Team Lead (TL), prepares functional and design specifications, and gets them approved by all stakeholders.
• Ensures that assigned areas are delivered within set deadlines and to the required quality objectives.
• Provides estimations, agrees on task durations with the manager, and contributes to the project plan for the assigned area.
• Analyzes the scope of alternative solutions and makes decisions about area implementation based on his/her experience and technical expertise.
• Leads functional and architectural design of assigned areas. Makes sure design decisions on the project meet architectural and design requirements.
• Addresses area-level risks; provides and implements a mitigation plan.
• Reports on area readiness/quality, and raises red flags in crisis situations that are beyond his/her AOR.
• Responsible for resolving crisis situations within his/her AOR.
• Initiates and conducts code reviews, creates code standards, conventions and guidelines.
• Suggests technical and functional improvements to add value to the product.
• Continuously improves his/her professional level.
• Supervises and coaches newcomers and more junior team members.
• Collaborates with other teams.
• If required, is available for visits to the client location.




Must have:
• University degree in Computer Science or a related field
• 5+ years of commercial development experience, including Python
• 2+ years of experience with one or more open-source Big Data technologies (Hadoop, Spark, Hive, Presto)
• ETL or pipeline design/implementation with large distributed databases
• Experience with big data workflow orchestration engines for ETL jobs, such as Airflow
• Rigor in high code quality, automated testing, and other engineering best practices
• Strong OOP skills
• Strong communication, collaboration, and interpersonal skills
• Results-oriented approach
• Good English (oral and written) and communication skills in general
Would be a plus:
• Experience with AWS
• Development experience with highly loaded, distributed systems using Java

