Job Description:
Responsibilities
Participate as member of an engineering team focused on end-to-end delivery of customer-focused solutions using Agile and Lean methodologies
Define the architecture and framework within an enterprise for Big Data, including structured data, unstructured data, real-time data, and analytics
Extend the industry-recognized best practices within the Enterprise Data Architecture to include Hadoop
Build, test, deploy, and monitor solutions focused on Hadoop related technologies
Define, review, and collaborate on technical architecture of solutions
Collaborate daily with project stakeholders including IT operations and infrastructure staff, product owner(s), end users, and other business leadership
Contribute forward-thinking innovation and ideas to projects that introduce new technology and processes to increase business value
Mentor other engineers and the team in best practices
Occasional travel possible


Requirements
7-10 years of hands-on relevant development experience
2+ years' experience building solutions with Big Data technologies like Hadoop, HDFS, Hive, Pig, Oozie, Sqoop, MapReduce, Spark, Kafka, Storm, etc.
Java programming experience, including build tools like Maven
Some experience with cloud IaaS and PaaS platforms (one or more of the following: Azure, AWS, Rackspace, Heroku, CloudFoundry, OpenStack)
Experience with ETL tools like Informatica, SSIS, Talend, etc.
Experience with distributed systems, including developing, administering, or operating solutions
Experience with analytical programming and the ability to work with other Data Architects to bridge the gap between a traditional DB architecture and a Hadoop-centric architecture
Experience managing the full life cycle of a Big Data solution, including requirements gathering, analysis, technical architecture design, development, testing, and deployment
Some experience with test automation tools (unit, integration, and acceptance tests)
Some experience with infrastructure automation tools (one or more of the following: Chef, Puppet, Saltstack, CloudFormation, Azure Resource Manager, Docker)
Hands-on experience with Continuous Integration tools (e.g., Jenkins, TeamCity, Maven, Ant, Bamboo, CircleCI, TravisCI)
Dependability
Self-motivation
Communication skills, verbal and written, to both technical and non-technical audiences
Client-focused attitude
Team player attitude with strong inclination to collaboration
Detail-oriented
Problem-solving skills
Bachelor's degree; advanced technical degree desired

Nice to Have
Experience with Microsoft Azure cloud platform
Experience with Azure HDInsight product
Experience working with other Apache open-source projects in the Hadoop ecosystem, or other open source projects within an enterprise environment
If you are comfortable with the requirements, please forward your profile to teja at keylent dot com, or you can reach me at 4 0 7 4 8 2 1 4 9 3.