Hadoop Admin
Charlotte, NC 28299
Date: May-22-19
Work Authorization: US Citizen, GC, H1B, GC EAD, L2 EAD, H4 EAD, TN EAD
Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire

Job Details
Experience: Architect
Rate/Salary ($): DOE
Duration: Long Term
Sp. Area: BigData, NoSQL
Sp. Skills: Hadoop
Consulting / Contract; Third Party OK; CONTRACTOR; Direct Client Requirement
Required Skills: Hadoop, Linux, Big Data, JAVA, hadoop admin, shell scripting, perl scripting
Preferred Skills:
Domain: IT/Software, Financial
EUCLID INNOVATIONS INC
Charlotte, NC
Job Description / Responsibilities
This is a production-support Hadoop administrator role. We are looking for a highly competent and highly motivated individual; the role will require a commitment on your part as well as from the Wells Fargo Securities team.
Owning, tracking, and resolving Hadoop-related incidents and requests.
Fulfilling requests and resolving incidents within SLAs.
Reviewing service-related reports (e.g., Hadoop configuration, maintenance, monitoring) daily to ensure service-related issues are identified and resolved within established SLAs (a minimal scripted example of such a check follows this list).
Responding to Hadoop-related alerts and escalations, and working with database engineering on strategic solutions to recurring problems.
Manage scalable Hadoop cluster environments.
Manage the backup and disaster recovery for Hadoop data.
Optimize and tune the Hadoop environments to meet performance requirements.
Install and configure monitoring tools.
Work with big data developers in designing scalable, supportable infrastructure.
Work with the Linux server admin team in administering the server hardware and operating system.
Assist with developing and maintaining the system runbooks.
Create and publish various production metrics including system performance and reliability information to systems owners and management.
Perform ongoing capacity management forecasts including timing and budget considerations.
Coordinate root cause analysis (RCA) efforts to minimize future system issues.
Mentor, develop and train junior staff members as needed.
Provide off hours support on a rotational basis.
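Several of the duties above (daily report review, monitoring, shell scripting) are typically automated. Purely as a minimal sketch, not part of the original posting: a daily HDFS health check in shell, assuming the standard hdfs CLI is available on a cluster edge node; the report path is an assumed location.

#!/usr/bin/env bash
# Illustrative daily HDFS health-check sketch (assumed paths, not from the posting).
set -euo pipefail

REPORT="/var/log/hadoop/hdfs_health_$(date +%F).txt"

# Capacity, live/dead DataNodes, and under-replicated block counts.
hdfs dfsadmin -report > "$REPORT"

# Flag the cluster if it is stuck in safe mode.
if hdfs dfsadmin -safemode get | grep -q "ON"; then
  echo "WARNING: HDFS is in safe mode" >> "$REPORT"
fi

# Surface missing or corrupt blocks for the on-call admin; fsck exits
# non-zero on an unhealthy filesystem, so record that rather than abort.
hdfs fsck / -list-corruptfileblocks >> "$REPORT" 2>&1 || echo "WARNING: fsck reported problems" >> "$REPORT"

A cron entry on the edge node would run a script like this each morning, and the on-call admin would review the report against the SLAs mentioned above.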
Basic Qualifications
Job seekers must meet Basic and Minimum Qualifications to be considered qualified and eligible applicants.
Minimum Qualifications
We are looking for a person who has:
Experience with backups, restores and recovery models.
Experience working with Linux/Sun servers.
Demonstrated experience with database concepts (e.g., normalization, referential integrity, modeling).
Demonstrated ability to identify problems, analyze possible solutions, and determine best course of action to resolve.
Ability to multi-task.
BS Degree in Computer Science/Engineering required.
4+ years of IT experience.
3-4 years of overall experience with Linux systems.
1-2 years of experience in deploying and administering Hadoop clusters.
Well versed in installing and managing Hadoop distributions (MapR, Cloudera, etc.).
Knowledge of performance troubleshooting and tuning of Hadoop clusters.
Good knowledge of Hadoop cluster connectivity and security.
Development experience in Java, Hive, Pig, Sqoop, Flume, and HBase desired (an illustrative Sqoop command follows this list).
Excellent customer service attitude, communication skills (written and verbal), and interpersonal skills.
Experience working in cross-functional, multi-location teams.
Excellent analytical and problem-solving skills; strong interpersonal skills.
Excellent written and verbal communication skills.
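To make the Sqoop/Hive item above concrete, here is a minimal sketch of a typical import of a relational table into Hive; the host, database, table, and user are hypothetical placeholders, not from the posting.

#!/usr/bin/env bash
# Hypothetical Sqoop import into Hive; all connection details are placeholders.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username etl_user -P \
  --table transactions \
  --hive-import \
  --hive-table staging.transactions \
  --num-mappers 4

The --hive-import flag loads the imported data into the named Hive table, and --num-mappers controls the parallelism of the import.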
Client: Banking client
Similar Jobs you may be interested in:
Actimize Developer
Charlotte, NC
Jun-16-25
Intone Networks
Rate/Salary ($): Market
Actimize Developer - 45507-1. Location: Charlotte, North Carolina - onsite daily. Note from HM: candidates who have also worked at TCS, Infosys or Wipro would be ideal. Must Haves: Extensive experience in NICE Actimize IFM-X. Design and implement custom models within AIS (Actimize Intelligence Server) and RCM (Risk Case Manager). Hands-on experience in Actimize solution version upgrades and service pack deployment. Roles and Responsibilities (8-12 yrs): Extensive experience in NICE Actimize IFM-X
Performance Engineer w/Database Tuning
Charlotte, NC
Jun-30-25
Intone Networks
Rate/Salary ($): USD 60 / Hourly / C2
Role: Performance Engineer w/Database Tuning. Location: Onsite in Austin, TX or Sunnyvale, CA. Required Qualifications: Minimum of 5 years of proven hands-on experience in performance engineering, with some exposure to SAP or similar packaged products/DBs. Strong experience in DB performance tuning and engineering. In-depth knowledge of performance-tuning SQL queries and stored procedures. Experience in doing in-depth root cause analysis to identify performance bottlenecks. Prof