Job Description:
Responsibilities
· 5+ years’ experience with the full development lifecycle, from inception through implementation
· 2+ years’ experience building large-scale big data applications
· Experience building data lakes using Cloudera or Hortonworks distributions
· Hands-on experience with HDFS, MapReduce, YARN, and Hive
· Extensive experience with Spark using Python, Scala, or R
· In-depth knowledge of Java 8
· Experience with one or more NoSQL databases such as Cassandra, HBase, MongoDB, DynamoDB, or Elasticsearch
· Hands-on experience building CI/CD pipelines
· Experience with private cloud platforms such as Pivotal Cloud Foundry (PCF)
· Experience developing software solutions using Test-Driven Development (TDD)
· Expertise in data governance and data quality
· Experience working with PCI Data is a plus
· Experience working with Data Scientists
· In-depth knowledge of OO and SOLID design principles
· Demonstrable experience successfully delivering big data projects using Kafka, Spark, Cassandra, and related technologies, on premises or in the cloud
· Ability to tune big data solutions to improve performance
· Excellent understanding of the Spring Framework
· Experience with Oracle databases