Job Description:
Qualifications
Bachelor’s degree or foreign equivalent from an accredited institution required. Three years of progressive experience in the specialty may be considered in lieu of each year of education.
At least 4 years of relevant experience in Information Technology.

Preferred
Flume - Basic understanding of core components: Event, Source, Sink, Channel, Agent, Client; Channel types; Source types; Interceptors; Selectors; Regular Expressions (regex)
Kafka - Basic understanding of core elements: Topic, Producer, Consumer, Broker; Role of Zookeeper; Partitioning; Replication; Offsets
Hadoop HDFS - Basic understanding of core components: Name Node, Secondary Name Node, Data Node; HDFS storage system; Metadata; File size and Block size impacts; Replicas; Command line utilities; Sqoop
Spark - Basic understanding of core components: Local Hadoop HDFS and Data Nodes, Node Manager, Resource Manager; RDDs; Executors; Virtual cores; Spark-submit and parameters
Splunk - Ability to search and create dashboards and alerts
Linux - Understanding of Linux commands
Linux agents - Basic understanding of agent implementation in Linux
Log Rotation - Understanding of logrotate.d
Scripting - Understanding and ability to write shell scripts
Python - Basic understanding of code and execution
Screen - Basic understanding of the screen utility
Crontab - Understanding of crontab scheduling
Troubleshooting and Diagnostics
Strong interpersonal and communication skills
Ability to troubleshoot complex technical issues quickly and completely
Deep understanding of IP, TCP, UDP, SSL/TLS protocols
Must be customer- and results-oriented
Experience managing a production environment

Client: Infosys