Job Description:
Hi,

It is a pleasure to reach out to you. I came across your profile on a job portal and wanted to touch base with you regarding an opportunity with one of our global implementation partners/clients.
Please go through the requirement below and let me know if you are comfortable with the position.
Please send me your updated resume along with your best hourly rate, visa status, and availability.
An early response would be greatly appreciated.

Job Title: Hadoop Administrator
Location: Fort Worth, TX
Duration: Long Term Contract

Job Responsibilities
Responsible for implementation and support of the enterprise Hadoop environment.
Involves design, capacity planning, cluster setup, performance tuning, monitoring, structure planning, scaling, and administration.
The administrator consultant will work closely with the infrastructure, network, database, business intelligence, and application teams to ensure business applications are highly available and performing within agreed-upon service levels.
Implement components of the Hadoop ecosystem such as YARN, MapReduce, HDFS, HBase, ZooKeeper, Pig, and Hive.
In charge of installing, administering, and supporting Windows and Linux operating systems in an enterprise environment.
Accountable for storage, performance tuning, and volume management of Hadoop clusters and MapReduce routines.
Responsible for the setup, configuration, and security of Hadoop clusters using Kerberos.
Monitor Hadoop cluster connectivity and performance.
Manage and analyze Hadoop log files.
File system management and monitoring.
Develop and document best practices.
HDFS support and maintenance.
Setting up new Hadoop users (a brief illustrative sketch of this task appears after this list).
Responsible for the administration of new and existing Hadoop infrastructure.
Includes DBA responsibilities such as data modeling, design and implementation, software installation and configuration, database backup and recovery, database connectivity, and security.
Possess good Linux and Hadoop system administration skills, networking, shell scripting, and familiarity with open-source configuration management and deployment tools such as Puppet or Chef.
Build data platforms, pipelines, and storage systems using Apache Kafka, Apache Storm, and search technologies such as Elasticsearch.
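
For illustration only (this is not part of the client's stated requirement), below is a minimal Python sketch of the "setting up new Hadoop users" and basic cluster health-check duties listed above. It assumes the standard hdfs command-line tool is on the PATH and that the script is run by an account with HDFS superuser rights; the user name "alice" is purely hypothetical.

#!/usr/bin/env python3
"""Illustrative sketch only: provision an HDFS home directory for a new user
and run a basic cluster health check. Assumes the standard `hdfs` CLI is
installed and the script runs with HDFS superuser rights; the user name
used below is hypothetical."""

import subprocess
import sys


def run(cmd):
    # Echo the command, then run it and fail loudly on a non-zero exit code.
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


def provision_user(username, group=None):
    # Create /user/<username> in HDFS, hand ownership to the new user,
    # and restrict access to the user and their group.
    group = group or username
    home = f"/user/{username}"
    run(["hdfs", "dfs", "-mkdir", "-p", home])
    run(["hdfs", "dfs", "-chown", f"{username}:{group}", home])
    run(["hdfs", "dfs", "-chmod", "750", home])


def health_check():
    # Report NameNode safemode status, datanode availability, and capacity.
    run(["hdfs", "dfsadmin", "-safemode", "get"])
    run(["hdfs", "dfsadmin", "-report"])


if __name__ == "__main__":
    new_user = sys.argv[1] if len(sys.argv) > 1 else "alice"  # hypothetical user name
    provision_user(new_user)
    health_check()

In practice, onboarding a user on a secured cluster would also involve creating the corresponding OS/LDAP account and a matching Kerberos principal, which are outside the scope of this sketch.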
             
