Big Data Developer
Phoenix, AZ 85023
Date: Oct-05-2018
Work Authorization: US Citizen, GC, H1B, GC EAD
Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire
Job Details
Experience: Architect
Rate/Salary ($): Open
Duration: 6 months
Sp. Area: BigData, NoSQL
Sp. Skills: Hadoop
Job Type: Consulting / Contract (CONTRACTOR); Third Party OK; Certification Preferred
Required Skills: Hadoop, Big Data
Preferred Skills: MapReduce, HDFS, OOZIE, HIVE, SQOOP, PIG and SPARK
Domain: IT/Software, Financial
Value Info Tech, Phoenix, AZ
Job Description:
8+ years of experience in IT, including 4+ years of experience with Hadoop technologies in a Linux environment.
Hands-on experience setting up and configuring Hadoop ecosystem components such as Hadoop, MapReduce, HDFS, OOZIE, HIVE, SQOOP, and PIG.
Sound knowledge of Hadoop MapReduce, Hive, and Spark.
Extensive SQL experience.
Experience applying Hadoop patches and upgrades and troubleshooting Hadoop job failures.
Experience resolving issues by coordinating with dependent and support teams according to priority.
Experience tuning and monitoring the performance of the Hadoop ecosystem.
Experience analyzing log files to identify root causes and taking or recommending corrective action.
Similar Jobs you may be interested in:

GCP Data Engineer, Phoenix, AZ (Mar-30-24), Technogen, Inc
Rate ($): USD 72 / Hourly / C2
GCP Data Engineer, Phoenix, AZ. Must Have Qualifications: Bachelor's degree in Engineering or Computer Science or equivalent, OR Master's in Computer Applications or equivalent. 5+ years of software development experience, including leading teams of engineers and scrum teams. 3+ years of hands-on experience working with MapReduce, Hive, and Spark (core, SQL, and PySpark). Hands-on experience writing and understanding complex SQL (Hive/PySpark dataframes) and optimizing joins while processing huge amounts of