Job Description:
Key Responsibilities

Minimum 5 years of experience in the design and development of large-scale information products and services with the following technologies: Big Data Hadoop platform technologies, Java, Scala, RDBMS
Experience with the Hadoop platform and its ecosystem, including Hive, HBase, AtScale, Spark, Sqoop, HDFS, and related tools for building analytical applications
Experience with both Cloudera and Hortonworks Hadoop distributions is a plus
Strong working experience with object-oriented design and development using Java is necessary
Solid understanding of the principles and APIs of MapReduce, RDDs, DataFrames, and Datasets
Strong working experience implementing Big Data processing using MapReduce algorithms and Hadoop/Spark APIs
Experience building workflows to perform predictive analysis, multi-dimensional analysis, data enrichment, etc.
Experience with database fundamentals, RDBMS data modeling, and NoSQL database modeling and programming
Knowledge of secure coding practices and frameworks is a plus
Experience in Agile methodologies such as Scrum
Experience in supporting large enterprise applications
An aptitude for learning new technologies and taking on challenges
Bachelor's or Master's degree in Computer Science or a related technology discipline
Knowledge of security principles and accessibility best practices