Job Description:
Position: Kafka Administrator
Location: Houston, TX
Duration: Full Time/Permanent
Salary: $150K + client-paid benefits

Job Description:
The Kafka Streaming Platform Administrator is accountable for setting up and managing an enterprise Confluent Kafka environment, on premise and in the cloud, based on business and IT requirements for security, performance, supportability, and auditability. This role is also responsible for ongoing monitoring and support of the streaming environment. The position reports to the Director of Analytic Platforms and is based in Houston, TX.

Responsibilities may include:
- Solution consulting and building proofs of concept for real-time solutions, message-based integration, and event-based architectures
- Establishing, documenting, and evangelizing development standards
- Monitoring the performance of the systems, ensuring high uptime, and providing 24/7 support
- Deploying new and maintaining existing Kafka cluster environments in the cloud or on-prem
- Handling all Kafka environment builds, including design, capacity planning, cluster setup, performance tuning, and ongoing monitoring
- Researching and recommending innovative and, where possible, automated approaches to system administration tasks
- Creating key performance metrics measuring the utilization, performance, and overall health of the cluster
- Proactively monitoring and setting up alerting mechanisms for the Kafka cluster and supporting hardware to ensure system health and maximum availability
- Leading multiple high-priority initiatives with aggressive timelines, leveraging an agile framework
- Working closely with the various teams to make sure all Kafka topics are highly available and performing as expected
- Working with the Analytics Innovations COE and business analysts to design and run technology proofs of concept on Kafka platforms
- Security management of the platforms
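The monitoring and alerting responsibilities above can be illustrated with a minimal sketch. Everything here is a hypothetical example, not part of the role's actual tooling: the function name, thresholds, and sample data are invented, and a real deployment would pull lag figures from JMX metrics or the `kafka-consumer-groups` CLI rather than a hard-coded dict.

```python
# Minimal sketch of a consumer-lag alert check. All names, thresholds, and
# sample figures are illustrative assumptions; real lag values would come
# from JMX or the kafka-consumer-groups tool, not a literal dict.

def partitions_over_threshold(lag_by_partition, max_lag):
    """Return (topic, partition) entries whose lag exceeds max_lag, worst first."""
    breaches = {p: lag for p, lag in lag_by_partition.items() if lag > max_lag}
    return sorted(breaches.items(), key=lambda item: item[1], reverse=True)

# Example: lag per (topic, partition) as sampled by a monitoring job.
sample_lag = {
    ("orders", 0): 120,
    ("orders", 1): 15_400,
    ("payments", 0): 90,
}
alerts = partitions_over_threshold(sample_lag, max_lag=10_000)
print(alerts)  # [(('orders', 1), 15400)]
```

In practice the returned list would feed an alerting hook (email, pager, or a ServiceNow webhook as mentioned in the requirements below) rather than a `print`.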
Basic/Required:
- Legally authorized to work in the United States
- 5+ years of solid hands-on Kafka administration experience managing critical 24/7 applications
- Ability to design, build, assemble, and configure application or technical architecture components from business requirements
- Proficiency in Confluent Kafka platform design, installation, operation, and best practices for the following components:
  - Kafka brokers
  - Kafka ZooKeeper
  - Confluent Kafka Connect/connectors
  - Confluent REST Proxy
  - Confluent Schema Registry
  - Confluent KSQL/ksqlDB
  - Confluent Control Center
  - Confluent security settings
  - Confluent RBAC with Active Directory authentication/authorization
- Experience using JMX for Kafka monitoring and performance tuning
- Familiarity with Kafka message-level, transport-level, and data-at-rest security
- Experience implementing security and (permission-based) authorization on a Kafka cluster
- Experience with open-source Kafka, ZooKeeper, Kafka Connect, Schema Registry, and Avro schemas
- High-availability cluster setup, maintenance, and ongoing support
- Creating topics, setting up redundancy clusters, and deploying monitoring tools and alerts
- Good knowledge of best practices
- Hands-on experience with backup and mirroring of Kafka cluster brokers; broker, topic, and hardware sizing; performance monitoring; broker and topic security; and consumer/producer access management (ACLs)
- Knowledge of the Kafka API (development experience is a plus)
- Kafka DR/HA cluster setup experience, including cluster replication settings, is a plus
- Hands-on experience with Elastic (ELK) Stack cluster setup and administration for Kafka cluster monitoring, including integration with other systems such as ServiceNow using Elastic webhooks
- Open to learning new technologies at a fast pace
- Ability to document complex technical details about Kafka and other technologies as part of solution delivery
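The broker, topic, and hardware sizing called out above reduces to back-of-the-envelope arithmetic: sustained ingest rate, times retention period, times replication factor, plus headroom. A minimal sketch, with all figures and the overhead factor being illustrative assumptions rather than sizing guidance for any particular cluster:

```python
# Back-of-the-envelope Kafka storage sizing, as referenced in the sizing
# requirements above. All inputs and the 1.3 overhead factor are
# illustrative assumptions, not recommendations for a real cluster.

def required_storage_gb(write_mb_per_sec, retention_days, replication_factor,
                        overhead=1.3):
    """Total cluster disk needed: ingest rate x retention x replication,
    padded by an overhead factor for indexes, segment files, and headroom."""
    seconds = retention_days * 24 * 60 * 60
    total_mb = write_mb_per_sec * seconds * replication_factor * overhead
    return total_mb / 1024  # convert MB to GB

# Example: 10 MB/s ingest, 7-day retention, replication factor 3.
total_gb = required_storage_gb(10, 7, 3)
per_broker_gb = total_gb / 6  # spread across a hypothetical 6-broker cluster
print(round(total_gb), round(per_broker_gb))  # prints: 23034 3839
```

The per-broker figure is what drives hardware sizing decisions; the same multiply-out approach applies per topic when estimating topic-level retention budgets.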
Preferred:
- Bachelor's degree in Computer Science, MIS, Information Technology, or another related technical discipline
- 1+ years of experience with the AWS analytics stack
- 2+ years of experience architecting streaming applications/solutions
- 1+ years of oil and gas industry experience
- Confluent Kafka certification is a plus
- Knowledge of enterprise databases and/or data warehouse platforms such as Oracle, SQL Server, or Teradata
- Automation experience with Python, PowerShell, or a similar technology
- Experience with source control and automated deployment using CI/CD technologies; useful technologies include Git, Jenkins, and Ansible
- Experience with complex networking infrastructure, including firewalls, VLANs, and load balancers
- Ability to work independently with the customer in a fast-paced environment with minimal supervision
- Ability to work with business and technology users to define and gather reporting and analytics requirements
- Strong analytical, troubleshooting, and problem-solving skills
- Takes ownership of actions and follows through on commitments by courageously dealing with important problems, holding others accountable, and standing up for what is right
- Delivers results through realistic planning to accomplish goals
- Generates effective solutions based on available information and makes timely decisions that are safe and ethical