Job Description:
1. Very good understanding of event-processing pipelines built with Kafka, Java, ZooKeeper, Hadoop, Spark, Scala, S3, and Spark Streaming.
2. Hands-on experience writing producers, consumers, and event-processing logic with Kafka and Spark Streaming, and in building applications on an event-driven framework with Kafka (a minimal producer/consumer sketch follows this list).
3. Able to install new Kafka clusters and troubleshoot Kafka-related issues in production environments within given SLAs.
4. Comfortable working in a big data environment and familiar with big data tools such as Spark, Hive, and HBase.
5. Familiar with cloud deployments and AWS services such as S3, Kinesis Data Streams, Kinesis Data Firehose, and Amazon Connect.
6. 8+ years of experience building large-scale enterprise integrations, web services, and microservices.
7. Experience using and developing event-driven frameworks and REST services backed by both RDBMS and NoSQL stores (e.g., Hadoop, MongoDB), with development experience in Java.
8. Hands-on experience with Java, REST APIs, Kafka, Elasticsearch, SQL, and AWS.
9. Ability to use JSON and/or XML message formats, and to work with Avro and Parquet formats (see the Avro sketch after this list).
10. Knowledge of version control (Git/Bitbucket) and Jenkins for builds, and an understanding of how to integrate code into automated CI/CD deployment pipelines.
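
For item 2, the sketch below shows what a minimal Kafka producer and consumer look like in Java using the plain kafka-clients API. The broker address, the "orders" topic, the consumer group id, and the JSON payload are illustrative assumptions, not details from this posting.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class KafkaSketch {

        // Publish one illustrative JSON event to the hypothetical "orders" topic.
        static void runProducer() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(
                    new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace(); // production code would retry or alert
                        } else {
                            System.out.printf("sent to %s-%d@%d%n",
                                metadata.topic(), metadata.partition(), metadata.offset());
                        }
                    });
            } // close() flushes any pending records
        }

        // Poll the same topic and print each event; a real consumer would
        // deserialize the payload and hand it to downstream processing.
        static void runConsumer() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "order-processors"); // assumed consumer group
            props.put("auto.offset.reset", "earliest");
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders"));
                while (true) { // production code would install a shutdown hook
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("key=%s value=%s%n", record.key(), record.value());
                    }
                }
            }
        }

        public static void main(String[] args) {
            runProducer();
            runConsumer();
        }
    }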
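
For item 9, a minimal sketch of an Avro binary round trip in Java using GenericRecord; the inline "Order" schema and its field values are hypothetical, chosen only to make the example self-contained.

    import java.io.ByteArrayOutputStream;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryDecoder;
    import org.apache.avro.io.BinaryEncoder;
    import org.apache.avro.io.DatumReader;
    import org.apache.avro.io.DatumWriter;
    import org.apache.avro.io.DecoderFactory;
    import org.apache.avro.io.EncoderFactory;

    public class AvroRoundTrip {
        public static void main(String[] args) throws Exception {
            // Hypothetical "Order" schema, defined inline for the sketch.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"string\"},"
                + "{\"name\":\"amount\",\"type\":\"double\"}]}");

            GenericRecord order = new GenericData.Record(schema);
            order.put("id", "order-123");
            order.put("amount", 42.5);

            // Encode the record to Avro binary.
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);
            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
            writer.write(order, encoder);
            encoder.flush();

            // Decode it back and print, completing the round trip.
            DatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
            BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
            System.out.println(reader.read(null, decoder));
        }
    }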
             
