Job Description:
Role: Java / Kafka Developer

Location: Houston, TX



Detailed Requirements:



Very good understanding of event-processing pipelines using Kafka, Java, ZooKeeper, Hadoop, Spark, Scala, S3, and Spark Streaming.
Hands-on experience writing code for producers, consumers, and event processing with Kafka and Spark Streaming. Good hands-on experience building applications on an event-driven framework with Kafka (a minimal producer sketch follows the requirements list below).
Able to install new Kafka clusters and troubleshoot Kafka-related issues in production environments within given SLAs.
Work in a big data environment and be familiar with big data tools such as Spark, Hive, and HBase.
Familiar with cloud deployments and other AWS tools such as S3, Kinesis Streams, Kinesis Firehose, and AWS Connect.
8+ years of experience building large-scale enterprise integration implementations, web services, and microservices.
Experience using and developing event-driven frameworks and REST services backed by both RDBMS and NoSQL databases (e.g., Hadoop, MongoDB); development experience with Java.
Hands-on experience with Java, REST APIs, Kafka, Elasticsearch, SQL, and AWS.
Hands-on, customer-facing experience supporting external developers in complex partner integration projects.
Work in a fast-paced agile development environment.
Fine-tune Hadoop applications for high performance and throughput.
Troubleshoot and debug Hadoop ecosystem runtime issues.
Understanding of enterprise Hadoop environments.
Ability to use JSON and/or XML message formats; able to work with Avro and Parquet formats.
Ability to work as part of a Scrum team, following SAFe agile practices.
Strong communication and collaboration skills; should be comfortable working in a rapidly transforming organization.
Knowledge of version control tools such as Git/Bitbucket and Jenkins for builds; understanding of how to integrate code into automated deployment pipelines with CI/CD.
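
As a rough illustration of the producer work listed above, below is a minimal Kafka producer sketch in Java. The broker address (localhost:9092), topic name (orders), key/value payload, and class name OrderEventProducer are illustrative assumptions, not details of this role.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Minimal sketch of a Kafka producer; broker, topic, and payload are illustrative.
public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish one JSON-encoded event; production code would batch and handle retries.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}");
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Sent to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}

A matching consumer would configure a KafkaConsumer with string deserializers, subscribe to the same topic, and poll records in a loop.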
