Job Description:
Basic Qualifications
Bachelor’s degree or foreign equivalent required. One year of relevant work experience may be considered in lieu of each year of education.
At least 6 years of experience working with Big Data technologies
At least 11 years of experience in data warehousing
At least 4 years of experience in core Java and its ecosystem
At least 3 years of experience with the Oracle big data stack (Cloudera, ODI, Fusion, etc.)

At least 6 years of Big Data project implementation experience, including requirements gathering, architecture, and functional and technical design
At least 3 years of experience leading technical teams, with strong client-facing skills
Experience understanding new architectures and the ability to drive an independent project from an architectural standpoint
Analytical and problem-solving skills
Ability to learn new technologies and business requirements
Strong communication skills, both written and oral; ability to build and deliver presentations to all levels of the business and to explain complex issues and concepts in simple, understandable language
Extensive experience in solution architecture, design, and development across one or more projects
Should be well versed in the Big Data landscape, data warehousing, and business intelligence
Should have hands-on experience with big data technologies such as MapReduce, Hadoop, Hive, Spark, Scala, Flume, Sqoop, HBase, etc., in order to complete POCs, review code, and guide the team
Should have served as lead architect on at least 2-3 Big Data implementations
Should have in-depth experience implementing Big Data platforms on one of the leading distributions (Cloudera, Hortonworks, or MapR)
Should also have good knowledge of Big Data architecture patterns, design patterns, estimation techniques, performance tuning, and troubleshooting

Client: Confidential