Job Description:
The right candidate will be a knowledgeable, hands-on technologist with a strong background in Hadoop- or Apache Spark-based big data platform development. Additional expertise in developing OLAP cubes on big data and business intelligence solutions is a plus. The candidate will also be part of the architecture office, responsible for establishing, driving, and implementing architectural principles and guidelines for information technology systems in accordance with business needs and corporate technology standards.
* Working with database architects, developers, business analysts, and subject matter experts to understand complex technological systems and migrate mainframe technologies to distributed and cloud platforms.
* Developing software using Hadoop-ecosystem technologies such as HBase, Spark, Storm, NiFi, and Kafka.
* Working on complex and varied Big Data projects including tasks such as collecting, parsing, managing, analyzing, and visualizing very large datasets.
* Translating complex functional and technical requirements into detailed designs.
* Writing high-performance, reliable and maintainable code.
* Performing data processing requirements analysis.
* Performance tuning for batch and real-time data processing.
* Securing components of clients' Big Data platforms.
* Diagnostics and troubleshooting of operational issues.
* Health-checks and configuration reviews.
* Data pipeline development: ingestion, transformation, and cleansing.
* Data flow integration with external systems.
* Integration with data access tools and products.
* Assisting application developers and advising on efficient data access and manipulations.
* Defining and implementing efficient operational processes.
* Keeping up to date on new technologies, standards, protocols, and tools relevant to the rapidly changing digital environment.
* Monitoring emerging technology trends and managing an innovation lab that conducts proof-of-concept experiments with new and emerging technologies.