
Hadoop jobs in California
San Jose, CA
Rate: Market
Position: Big Data Developer
Location: San Jose, CA
Duration: Long Term
Required Skills:
- Experience with big data techniques (Hadoop, MapReduce, Spark) and querying tools (such as Hive, Pig)
- Strong knowledge of cloud platforms (such as AWS) and extensive experience developing applications on cloud platforms using various cloud services
- Experience with relational (SQL) and NoSQL databases, Apache NiFi
- Strong analytical and quantitative problem-solving ability
- Excellent …
Jan-15-18
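As an illustration of the Spark-plus-cloud skills listed above, here is a minimal Scala sketch of a batch job that reads CSV from S3 and aggregates with the DataFrame API. The bucket, path, and column names are hypothetical, not from the posting, and the job assumes hadoop-aws and AWS credentials are configured.

```scala
import org.apache.spark.sql.SparkSession

object S3ClickCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("s3-batch-sketch")
      .getOrCreate()

    // Read CSV from a hypothetical S3 bucket (requires hadoop-aws + credentials).
    val clicks = spark.read
      .option("header", "true")
      .csv("s3a://example-bucket/clicks/2018-01/")

    // Aggregate clicks per page -- the kind of query Hive or Pig would express.
    val counts = clicks.groupBy("page").count()

    // Write the results back to S3 as Parquet.
    counts.write.mode("overwrite").parquet("s3a://example-bucket/click-counts/")
    spark.stop()
  }
}
```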
Rate: Market
Job Title: Big Data Developer
Location: San Jose, CA
Duration: 12+ Months
Requisition Details:
- Experience with big data techniques (Hadoop, MapReduce, Spark) and querying tools (such as Hive, Pig)
- Strong knowledge of cloud platforms (such as AWS) and extensive experience developing applications on cloud platforms using various cloud services
- Experience with relational (SQL) and NoSQL databases, Apache NiFi
- Strong analytical and quantitative problem-solving ability
Jan-15-18
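The Hive querying requirement in this posting can be sketched the same way: Spark SQL with Hive support running a HiveQL-style aggregation. The database, table, and columns here are invented for illustration.

```scala
import org.apache.spark.sql.SparkSession

object HiveQuerySketch {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport lets spark.sql resolve tables in the Hive metastore.
    val spark = SparkSession.builder()
      .appName("hive-query-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // A HiveQL-style aggregation over a hypothetical logs.page_views table.
    val daily = spark.sql(
      """SELECT dt, COUNT(*) AS events
        |FROM logs.page_views
        |WHERE dt >= '2018-01-01'
        |GROUP BY dt
        |ORDER BY dt""".stripMargin)

    daily.show()
    spark.stop()
  }
}
```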
Hello, greetings from Enshire Inc. Hope you are doing well.
Role: Hadoop Developer
Duration: Long Term
Location: San Jose, CA
Skills:
- 5+ years of experience as a Hadoop Developer
- Familiarity with data loading tools like Flume, Sqoop
- Ability to write MapReduce jobs (Java/Python/Perl, etc.)
- Experience writing Pig Latin scripts
- Hands-on experience with HiveQL
- Knowledge of workflow schedulers like Oozie
- Working knowledge of NoSQL databases (MongoDB, HBase, Cassandra)
Jan-10-18
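The "write MapReduce jobs" line above can be illustrated with the classic word count against the Hadoop MapReduce API, written in Scala to keep one language across these sketches; input and output paths come from the command line.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.{IntWritable, LongWritable, Text}
import org.apache.hadoop.mapreduce.{Job, Mapper, Reducer}
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat

// Mapper: emit (word, 1) for every whitespace-separated token.
class TokenMapper extends Mapper[LongWritable, Text, Text, IntWritable] {
  private val one = new IntWritable(1)
  private val word = new Text()
  override def map(key: LongWritable, value: Text,
      ctx: Mapper[LongWritable, Text, Text, IntWritable]#Context): Unit =
    value.toString.split("\\s+").filter(_.nonEmpty).foreach { w =>
      word.set(w.toLowerCase)
      ctx.write(word, one)
    }
}

// Reducer (also used as combiner): sum the counts for each word.
class SumReducer extends Reducer[Text, IntWritable, Text, IntWritable] {
  override def reduce(key: Text, values: java.lang.Iterable[IntWritable],
      ctx: Reducer[Text, IntWritable, Text, IntWritable]#Context): Unit = {
    var sum = 0
    val it = values.iterator()
    while (it.hasNext) sum += it.next().get
    ctx.write(key, new IntWritable(sum))
  }
}

object WordCount {
  def main(args: Array[String]): Unit = {
    val job = Job.getInstance(new Configuration(), "word count")
    job.setJarByClass(classOf[TokenMapper])
    job.setMapperClass(classOf[TokenMapper])
    job.setCombinerClass(classOf[SumReducer])
    job.setReducerClass(classOf[SumReducer])
    job.setOutputKeyClass(classOf[Text])
    job.setOutputValueClass(classOf[IntWritable])
    FileInputFormat.addInputPath(job, new Path(args(0)))
    FileOutputFormat.setOutputPath(job, new Path(args(1)))
    System.exit(if (job.waitForCompletion(true)) 0 else 1)
  }
}
```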
Please find the job description below.
Role: Cassandra Spark Developer
Location: San Jose, CA
Duration: 12 months
Mandatory Technical Skills: Cassandra, Spark, Scala
Desirable Technical Skills: Cassandra, Spark, REST web services
Desirable Functional Skills: Good communication, technical design, and coding skills; a self-motivated, quick-learning team player who can also work independently
Jan-10-18
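Since this role names Cassandra, Spark, and Scala together, here is a small sketch using the DataStax spark-cassandra-connector's DataFrame API; the host, keyspace, and table names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object CassandraSparkSketch {
  def main(args: Array[String]): Unit = {
    // Assumes spark-cassandra-connector is on the classpath and a
    // Cassandra node is reachable at the configured (hypothetical) host.
    val spark = SparkSession.builder()
      .appName("cassandra-spark-sketch")
      .config("spark.cassandra.connection.host", "127.0.0.1")
      .getOrCreate()

    // Read a hypothetical keyspace/table through the connector.
    val orders = spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "shop", "table" -> "orders"))
      .load()

    // Aggregate in Spark SQL, then append the result to another table.
    orders.createOrReplaceTempView("orders")
    val totals = spark.sql(
      "SELECT customer_id, SUM(amount) AS total FROM orders GROUP BY customer_id")

    totals.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "shop", "table" -> "customer_totals"))
      .mode("append")
      .save()

    spark.stop()
  }
}
```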
San Jose, CA
Rate: Market
Must-Have Skills (top 3 technical skills only):
1. Hive, Spark ETLs
2. MapReduce programming and performance tuning
3. Kafka
Detailed Job Description: Good knowledge of design, development, and testing.
Top 3 responsibilities the subcontractor is expected to shoulder and execute:
1. Work with the product owner on the user stories
2. Design and development
3. Support user acceptance testing
Jan-05-18
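Two of the must-have skills above, Kafka and Hive/Spark ETLs, combine naturally in a Structured Streaming sketch that reads a Kafka topic and lands Parquet files a Hive external table can point at. The broker, topic, and paths are hypothetical, and spark-sql-kafka must be on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object KafkaToParquetEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-etl-sketch")
      .getOrCreate()

    // Stream records from a hypothetical Kafka topic.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Kafka keys/values arrive as bytes; cast to strings for downstream parsing.
    val events = raw.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

    // Land micro-batches as Parquet that a Hive external table can point at.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/warehouse/events")
      .option("checkpointLocation", "/tmp/checkpoints/events")
      .start()

    query.awaitTermination()
  }
}
```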