Job Description:
Required Skills:
Management of a Hadoop cluster, including all associated services
Proficiency with Hadoop v2, MapReduce, HDFS
Experience building stream-processing systems using solutions such as Storm or Spark Streaming
Good knowledge of Big Data querying tools, such as Pig, Hive
Experience with Spark and NiFi
Experience with integration of data from multiple data sources
Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
Knowledge of various ETL techniques and frameworks, such as Flume
Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O
Experience with Cloudera/MapR/Hortonworks