Job Description :
Role: Big Data/Hadoop Engineer
Location: Columbus OH
Duration: 6 Months

Skills:
Strong programming knowledge in Java, Kafka, Spark, Scala, Python, and SQL.
Strong working experience in the Big Data/Hadoop ecosystem: HBase, HDFS, Hive, MapReduce (Hortonworks or Cloudera), etc.
Role and Responsibility:
Hands-on experience with Java and Hadoop is mandatory, along with experience in ETL modeling.
Master's or Bachelor's degree in Computer Science with 3+ years' experience building applications in Java.
Strong CS fundamentals (data structures, algorithms) with a good understanding of object-oriented design principles, architecture, and prevalent design patterns. Strong in object-oriented development and the Java platform.
Hands-on experience with big data technologies, including Storm or Spark, Hadoop, and Kafka, is a must.
Excellent communication skills are a must for this position.
Candidates with financial domain experience are highly preferred, but this is not mandatory.
             
