Job Description:
Role: Hadoop Admin
Duration: 12 months contract
Location: The sponsor is open to candidates local to Littleton, CO or Monroe, LA. The sponsor sits in Monroe.

Candidate Review and Interview Process: The sponsor plans to make a hiring decision after two or three interviews: a first 1:1 interview with the sponsor, a second interview with the sponsor and a member of the team, and a final panel interview. The sponsor is open to phone or in-person interviews but would prefer that at least one interview be in person.

Description:
Bachelor’s degree required, master’s degree preferred; 8+ years of experience required. Assignment may include some limited travel (10% or less) to other CTL regions/facilities.
Schedule: First shift (days), Monday through Friday, though some after-hours and weekend work may be required with advance notice. The team shares an on-call rotation.


Primary Job Responsibilities:
- Responsible for researching, designing, building, testing, deploying, analyzing, administering, and maintaining Hadoop environments and the associated Hadoop hardware and software technology components to meet current and future business needs.
- Monitor and control the performance and status of technology components, and provide technology component support and problem resolution.
- Direct the security work of more junior engineers, serving as Subject Matter Expert for security requirements within areas of responsibility.
- Utilize proven systems, scripting, and developer skills to execute highly complex tasks related to hardware and software technology component analysis, integration, and incident and problem resolution.
- Manage efforts to test, debug, support, and analyze performance, and document hardware and software technology components.

Required Skills, Systems, Experience:
- Bachelor’s Degree in Computer Science, Information Systems, Business, or a related field
- 8+ years of relevant industry experience
- 3+ years of experience with big data, Hadoop, and the Hadoop ecosystem.
- Cloudera experience.
- Deployment and administration of different Hadoop distributions for leading organizations, including enterprise-scale solution architecture and implementation.
- Experience anticipating problems and taking decisive action to resolve issues without impacting development or production clusters.
- Implementation and ongoing administration of Hadoop infrastructure.
- Setup, configuration and management of security for Hadoop clusters using Kerberos and integration with LDAP/AD at an Enterprise level.
- Working with data delivery teams to set up new Hadoop users.
- Setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for new users.
- Cluster maintenance, including creation and removal of nodes, using tools such as Ganglia, Nagios, and Cloudera Manager Enterprise.
- Screening Hadoop cluster job performance and capacity planning.
- Monitoring Hadoop cluster connectivity and security.
- Managing and reviewing Hadoop log files.
- File system management and monitoring.
- HDFS support and maintenance.
- Database backup and recovery.

Preferred Skills, Systems, Experience: Master’s Degree in Computer Science, Information Systems, Business, or a related field

Soft Skills: Excellent oral and written communication required; candidates will be expected to collaborate and communicate with members of the team.
             
