Job Description:
Job Title: Hadoop Developer
Relevant Experience (in Yrs): 6 to 8 years

Technical/Functional Skills

Must have: Big Data, Hadoop, and related technologies
Good to have: SQL Server and Data Warehouse background
Experience Required
• Develop software using Hadoop technologies such as HBase, Spark, Sqoop, NiFi, and Kafka
• Use programming languages such as Java, Scala, and Python, with an emphasis on tuning, optimization, and best practices for application developers
Roles & Responsibilities
We are looking for a data developer with Hadoop development and implementation experience. This individual will be involved in the design and implementation of fault-tolerant, scalable pipelines for diverse data sources. The position will develop solutions to process data from a variety of sources, which may include mobile devices, web click-stream, various third-party feeds, and operational system data. Processes and applications will move data both in batch and in near-real-time.
As part of the big data team, the developer will assist in developing the architecture and platforming of data so that analysts and business users can access data for their analytics and insights. The developer will work directly and indirectly with analysts and business users to develop appropriate solutions that bring business value.
• Analyze requirements to successfully support design activities
• Design and build integration components and interfaces in collaboration with Architects and Infrastructure Engineers
• Perform all technical aspects of software development (write, test, support)
• Perform unit, component, integration testing of software components including the design, implementation, evaluation, and execution
• Conduct code reviews and tests of automated build scripts
• Debug software components; identify, fix, and verify remediation of code defects in both your own work and the work of others
• Work with product owners to prioritize features for ongoing sprints
• Manage a list of technical requirements based on industry trends, new technologies, known defects, and issues
• Develop custom data pipelines (cloud and locally hosted), working heavily within the Hadoop ecosystem
• Experience within Insurance, Financial Services, or other regulated industries