Job Description:
Job #164922
Big Data (Event Streaming components of Analytics)
Columbus - 266007
Duration: 6 months

Responsibilities:
Responsible for owning the Event Streaming components of Analytics.
Develop components that interact with big data systems such as Hadoop, Cassandra, and Kafka.
Own the Analytics event streaming DPS applications and architect and design new integrations.
Design, build, and maintain a highly scalable, low-latency, fault-tolerant streaming data platform.
Required Skills/Experiences:
Deep experience in at least one language such as Java, Scala, or Python.
Experience building batch, real-time, and streaming data pipelines from event streams, NoSQL stores, or APIs.
Deep experience with building and shipping highly scalable distributed systems on cloud platforms.
Preferred Skills/Experiences:
Deep experience with distributed stream processing frameworks such as Kafka, Kinesis, Flink, and Spark Streaming.
Experience with the Big Data ecosystem (Hadoop/Hive/Spark/Presto/Airflow).
Proven ability to learn new technologies quickly.

Client: technocraft