Job Description:
This role is responsible for all activities related to the design, implementation, operation, and 24x7 production support of United's complex, highly available Kafka and Hadoop ecosystem. Responsibilities include defining database standards and best practices for Kafka, Elastic, and Hadoop, and programming the data access layer. This role will align with the architecture, engineering, and application portfolios to deliver best-in-class systems in a hybrid data center environment and achieve business goals.

This is a hands-on role with team lead responsibilities. It requires a strong drive to find optimal solutions, as well as to mentor team members and developers and evangelize Kafka and Elastic technology with the client.

Job responsibilities
Design, operationalize, maintain, and scale production Kafka clusters.
Perform Kafka tuning, capacity planning, disaster recovery, replication, and troubleshooting.
Implement Kafka security, limit bandwidth usage, enforce client quotas, and handle backup and restoration.
Apply an in-depth understanding of Kafka internals: cluster management, ZooKeeper, partitioning, the schema registry, topic replication, and mirroring.
Architect, configure, deploy and maintain Elasticsearch clusters.
Architect and deploy Elastic Cloud Enterprise.
Configure Logstash and Beats to collect the data necessary to meet client requirements.
Configure X-Pack plugins, including Security, Watcher, Machine Learning, Monitoring, Graph, and Reporting.
Design, implement, and configure Kibana visualizations and dashboards.
Modernize ETL pipelines using Kafka and Elastic.
Maintain Kafka connectors that move data between systems.
Work with enterprise architects to advise on best practices for Kafka and Elasticsearch, both on premises and in the cloud.
Experience training, mentoring, and leading an emerging team is beneficial.
Knowledge of CloudFormation/Terraform scripts.
Work closely with Big Data and AWS cloud technology groups.
Experience with open-source Kafka distributions as well as enterprise Kafka products is preferred.
Familiarity with both cloud-native Kafka (on AWS) and on-premises architectures.
Manage Kafka and Elasticsearch clusters and create tools to automate and improve cluster reliability and performance.
Design and implement disaster recovery (DR) solutions for Confluent Kafka, Apache Kafka, and Elasticsearch.
Lead and participate in the on-call rotation for production support.

Required
BS in Computer Science and/or equivalent work experience required
Strong data analysis skills; ability to independently write scripts/code to parse and analyze complex data
Excellent verbal and written communication skills
Communicate and work effectively in a team environment
3+ years of experience architecting and implementing Kafka clusters in a large-scale enterprise environment.
3+ years of experience with Elastic (Elastic Search, Logstash, Kibana) in a large-scale enterprise environment.
3+ years of experience supporting Linux environments.
3+ years of experience with Shell scripting.
Must be legally authorized to work in the United States for any employer without sponsorship
Successful completion of an interview is required to meet job qualifications
Reliable, punctual attendance is an essential function of the position