Big data engineer
Durham, NC 27717
Date: Apr-30-2020 (listing valid through Apr-30-2021)
Work Authorization: US Citizen, GC, H1B, GC EAD, L2 EAD, H4 EAD
Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire
Job Details
Experience: Architect
Rate/Salary ($): $120k
Duration:
Sp. Area: BigData, NoSQL
Sp. Skills: Hadoop
Permanent Direct Hire (FULL_TIME)
Direct Client Requirement
Required Skills: Hadoop, Agile, Big Data, Business Intelligence, Data Scientist, Data Warehousing, DevOps, Informatica, JAVA, Kanban, Oracle, Performance Tuning, SCRUM
Preferred Skills:
Domain:
USM Systems
Chantilly, VA
Job Description:
Full-time. Open to US Citizens, GC holders, and H1B.
We are looking for a Developer to be a core member of a Data Science Emerging Research Team working closely with Data Scientists to deliver outstanding business intelligence applications!
Education and Experience
Bachelor’s degree or higher in a technology-related field (e.g. Engineering, Computer Science) required; Master’s degree a plus
10+ years of hands-on experience in architecting, designing, and developing highly scalable distributed data processing systems
3+ years of experience in the Hadoop ecosystem (Spark, Hive, Sqoop, etc.) and experience in one or more modern object-oriented programming languages (Java, Scala, Python)
Proficient in Oracle, PL/SQL, and performance tuning; hands-on experience with Informatica
Experience with DevOps, Continuous Integration, and Continuous Delivery (Jenkins, Stash); must have executed projects in Agile environments (Kanban and Scrum)
You enjoy analyzing data, identifying gaps, issues, patterns and trends and can analyze application dependencies and conduct impact assessment of changes
You have a good understanding of database design concepts – Transactional / Data mart / Data warehouse etc.
Experience with the Cloudera distribution of Hadoop; experience with Snowflake is a plus; experience with shell scripting and Control-M, and willingness to participate in a weekend on-call rotation
Client: investment