Job Description:
The Senior Hadoop/AWS Admin will be responsible for architecting, designing, and leading the development team to implement a mission-critical data processing Hadoop platform at Our Client in an AWS environment. The position will work with executive leadership to provide technical guidance, enterprise architecture, and data strategies, and to implement short- and long-term goals for the company. This role will also supervise direct reports to ensure their direction and goals are in concert with the objectives of the company's IT/Ops department.
Responsibilities:
Design, configure, implement and manage the Hadoop HBase platform that processes Point of Sale data with optimal performance and ease of maintenance.
Coordinate efforts with IT and Operations to ensure that the Hadoop architecture, data structures and processes meet the business requirements and objectives.
Provide technical leadership as a hands-on technical manager who can lead, manage, and coach other team members and delegate effectively.
Troubleshoot and resolve various process- or data-related issues. Will be on call and provide off-hours support as needed.
Assist in the ongoing development and documentation of the standards for the system and data processes.
Create project plans, manage milestones, create and distribute reports, and manage risks.
Communicate effectively with senior management, direct reports and customers.
Plan system capacity and design advanced features for future business growth and new business opportunities.
Design, implement and maintain all AWS infrastructure and services within a managed service environment.
Design, deploy, and maintain enterprise-class security, network, and systems management applications within an AWS environment.
Design and implement availability, scalability, and performance plans for the AWS managed service environment.
Continually re-evaluate the existing stack and infrastructure to maintain optimal performance, availability, and security.
Implement process and quality improvements through task automation. Institute infrastructure as code, security automation, and automation of routine maintenance tasks (see the sketch after this list).
Perform data migration from on-premises environments into AWS.
Support the business development lifecycle (Business Development, Capture, Solution Architecture, Pricing, and Proposal Development).
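To give a flavor of the routine-maintenance automation described above, the minimal Python/boto3 sketch below prunes aged EBS snapshots. The Environment=prod tag filter and the 30-day retention window are assumptions made for illustration only; they are not details from this posting.

    """Prune aged EBS snapshots (illustrative routine-maintenance automation).

    The Environment=prod tag filter and the 30-day retention window are
    hypothetical values chosen for this sketch.
    """
    from datetime import datetime, timedelta, timezone

    import boto3

    RETENTION_DAYS = 30  # hypothetical retention window


    def prune_old_snapshots(dry_run: bool = True) -> None:
        ec2 = boto3.client("ec2")
        cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)

        # Walk only snapshots owned by this account and tagged for the platform.
        paginator = ec2.get_paginator("describe_snapshots")
        pages = paginator.paginate(
            OwnerIds=["self"],
            Filters=[{"Name": "tag:Environment", "Values": ["prod"]}],
        )
        for page in pages:
            for snap in page["Snapshots"]:
                if snap["StartTime"] < cutoff:
                    print(f"Pruning {snap['SnapshotId']} created {snap['StartTime']:%Y-%m-%d}")
                    if not dry_run:
                        ec2.delete_snapshot(SnapshotId=snap["SnapshotId"])


    if __name__ == "__main__":
        prune_old_snapshots(dry_run=True)  # flip to False once the output looks right

The dry-run default keeps the job safe to schedule (for example from cron or an AWS Lambda trigger) before destructive deletes are enabled.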
Qualifications:
Expertise with Hadoop HBase, NoSQL database, and Hive designs and implementations. Any experience with other applications in the Hadoop ecosystem is a big plus.
8+ years of experience designing and implementing ETL or EDW processes utilizing tools such as Ab Initio, Hadoop or SAS.
8+ years of experience working in an Oracle, SQL Server, or Sybase environment as an architect/developer, with expertise in SQL and performance tuning.
Proficient in data modeling and database design concepts
Strong experience with AWS (AWS CloudFormation, AWS EC2, S3, VPC, etc.).
Strong knowledge of Amazon Kinesis, AWS Lambda, Amazon Simple Queue Service (Amazon SQS), Amazon Simple Notification Service (Amazon SNS), and Amazon Simple Workflow Service (Amazon SWF); a small ingestion sketch follows this list.
Experience with automation/configuration management using Puppet, Chef, Ansible or similar.
8+ years of experience working on the UNIX platform, with expertise in shell scripting and performance monitoring on UNIX servers.
Experience with large data volume and data performance optimization.
Proven leadership ability
Excellent analytical/problem solving and troubleshooting skills.
Excellent verbal and written communication, presentation and interpersonal skills
Bachelor's or Master's degree in Computer Science, Math, Physics, or Engineering.
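As a hedged illustration of the Amazon Kinesis knowledge called out above, the short Python/boto3 sketch below pushes a Point of Sale event onto a stream. The stream name "pos-transactions", the region, and the record fields are hypothetical and only show the pattern of partitioning by store.

    """Publish a Point of Sale event to Amazon Kinesis (illustrative sketch).

    The stream name, region, and record fields are hypothetical.
    """
    import json
    from datetime import datetime, timezone

    import boto3

    STREAM_NAME = "pos-transactions"  # hypothetical stream name


    def publish_sale(store_id: str, amount_cents: int) -> None:
        kinesis = boto3.client("kinesis", region_name="us-east-1")
        record = {
            "store_id": store_id,
            "amount_cents": amount_cents,
            "captured_at": datetime.now(timezone.utc).isoformat(),
        }
        # Partition by store so each store's events stay ordered within a shard.
        kinesis.put_record(
            StreamName=STREAM_NAME,
            Data=json.dumps(record).encode("utf-8"),
            PartitionKey=store_id,
        )


    if __name__ == "__main__":
        publish_sale(store_id="store-0042", amount_cents=1999)

Keying the partition on store_id is one common choice; a real platform might instead partition on terminal or transaction ID depending on throughput and ordering needs.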