Hadoop Developer
Location: Charlotte, NC 28299
Date: Feb-19-18
Work Authorization: US Citizen, GC, H1B, EAD (OPT/CPT/GC/H4)
Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire
Job Details
Experience: Architect, Senior, Midlevel, Junior
Rate/Salary ($): Market
Duration:
Sp. Area: BigData, NoSQL
Sp. Skills: Hadoop
Permanent Direct Hire (FULL_TIME)
Direct Client Requirement
Required Skills: Hadoop, Java, Map reduce, Hive, Pig, Hbase, Sqoop
Preferred Skills:
Domain: IT/Software, Financial, Government, HealthCare, Retail, Dot Com, Insurance, Pharmaceuticals, Manufacturing, Telecom
Avance Consulting
Somerset, NJ
Job Description:
Job Title: Big data Hadoop Developer
Location: Jersey City / Pennington, NJ; Charlotte, NC
Duration: Full/Permanent
Qualifications

Basic:
Bachelor’s degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
At least 7 years of experience with IT Big Data skills

Preferred:
At least 4 years of experience in technology consulting, enterprise and solutions architecture, and architectural frameworks
At least 2 years of experience in Hadoop, Hive, and Hbase
At least 3 years of experience in project execution
Experience in defining new architectures and the ability to drive an independent project from an architectural standpoint
Analytical skills
At least 2 years of experience in thought leadership, white papers, and leadership/mentoring of staff and internal consulting teams
Client: Avance Consulting
Similar Jobs you may be interested in:
Python Developer - Pyspark
Charlotte, NC
Apr-25-24
APN Consulting Inc
Rate/Salary ($): BASED ON EXPERIENCE
APN Consulting has an immediate need for a direct client requirement: Python Developer - Pyspark. Location - Charlotte, NC (Hybrid). Long-term Contract. Key Responsibilities: She was involved in data manipulation using Python scripts Spark Scala which will be useful for faster data processing. Created Parameterized Queries generated Tabular reports Sub reports Cross Tabs Drill down reports using Expressions Functions Charts Maps Sorting the data Defining Data sources and Sub
Sr. Java Developer - Financial Services
Charlotte, NC
Apr-25-24
Accord Tecnologies.Inc
Rate/Salary ($): Market
Sr Java Developer - financial services. Location: Charlotte, NC / Minneapolis, Minnesota (Hybrid, 3 days onsite mandatory). Duration: long-term contract. Position type: W2 contract. Client requesting in-person interview in Charlotte, NC / Minneapolis, Minnesota, so only locals / those who can come in person at their own expense should apply. Required Qualifications: 8-10 years of Java 8, Spring Boot, Cloud, API model, Token model development experience. At least 5-7 full-scale development projects wi
DevOps Engineer with Java Coding Experience
Charlotte, NC
Apr-05-24
Technogen, Inc
Rate/Salary ($): USD 50 / Hourly / C2
Job Title: DevOps Engineer with Java Coding Experience. Location: Charlotte, NC (Onsite). Duration: 12+ months. Desired: A BS/BA degree or higher in science or technology (Computer Science Engineering Discipline preferred). Enterprise experience with one or more CI/CD implementation, configuration management, and orchestration tools: Jenkins, Maven, Gradle, GitHub Actions, Terraform. Understanding of Pipeline as code and Infrastructure as 1+ year of experience in one or a combination of the fo
Remote - Full Stack Web Developer
Charlotte, NC
Apr-05-24
Technogen, Inc
Rate/Salary ($): USD 75 / Hourly / C2
Title: Sr. full stack web application developer. Location: Charlotte, NC (Hybrid). Must have experience with Typescript, Javascript, angular, react, NodeJS
Full Stack Developer / ReactJS
Charlotte, NC
Apr-05-24
Technogen, Inc
Rate/Salary ($): Market
Full Stack Developer w/ ReactJS. Pay Rate: Market / Flexible. Location: Charlotte Only (Hybrid 3 days a week, Tues to Thurs). Will look at remote for an absolute rockstar. Contract Length: Through 1/5/2025, possibility of extension. Client: Banking. Top 3 requirements: Capital Markets Experience; TypeScript/HTML/CS; React expertise (Next.js is preferred); State Management and Unit Testing; Java/Spring Boot; Previous Banking Experience working with Business Analysts and creating automated processes. Stro
Data Engineer
Charlotte, NC
Mar-29-24
Technogen, Inc
Rate/Salary ($): USD 80 / Hourly / C2
Role: Data Engineer. Location: Charlotte, NC. LPL Financial. Job Responsibilities: Work with development teams and other project leaders/stakeholders to provide technical solutions that enable business capabilities. Design and develop data applications using big data technologies (AWS, Spark) to ingest, process, and analyze large disparate datasets. Build robust data pipelines on the Cloud using AWS Glue, Aurora Postgres, EKS, Redshift, PySpark, Lambda, and Snowflake. Build Rest-