Job Description:
Big Data Architect

San Francisco, CA



" Minimum 7-9 years of total experience & 2-3 years in Hadoop

Thorough understanding of Hadoop distributions (Cloudera, Hortonworks, MapR) and ecosystem components

Thorough understanding of NoSQL databases such as HBase, MongoDB, Cassandra, etc.

Requirements gathering, design, and development of scalable big data solutions with Hadoop

Strong technical skills in Spark, HBase, Hive, Sqoop, Oozie, Flume, Pig, Java, Python, etc.

Good experience with distributed systems, large-scale non-relational data stores, MapReduce systems, performance tuning, and multi-terabyte data warehouses

Able to work independently and mentor team members

Hands-on development experience in Hadoop

Effective communication skills