Job Description:
Experience with the full development lifecycle, from inception through implementation
Experience building large-scale big data applications
Experience building Data Lakes using the Cloudera or Hortonworks distributions
Hands-on experience with HDFS, MapReduce, YARN, and Hive
Extensive experience with Spark using Python, Scala, or R
In-depth, hands-on knowledge of Java 8 is required
Experience with one or more NoSQL databases such as Cassandra, HBase, MongoDB, DynamoDB, or Elasticsearch
Hands-on experience building CI/CD pipelines
Experience with private cloud platforms, specifically Pivotal Cloud Foundry (PCF)
Experience developing software solutions using Test-Driven Development (TDD)
Expertise in data governance and data quality
Experience working with PCI Data is a plus
Experience working with Data Scientists
Demonstrated experience successfully delivering big data projects using Kafka, Spark, Cassandra, and related technologies, on premises or in the cloud
Ability to tune big data solutions to improve performance
Excellent understanding of the Spring Framework
Experience with Oracle databases