Job Description:
Spark with Scala + Hive, or Spark with Python + Hive
Responsible for designing, deploying, and maintaining a mission-critical security analytics data environment that processes data quickly at large scale.
Fluency in common query languages, API development, data transformation, and integration of data streams.
Strong experience with large-scale data platforms such as Spark, Hive, Impala, etc.
Fluency in multiple programming languages and tools appropriate for large-scale data processing, such as Python, shell scripting, regular expressions, SQL, Java, or similar.
Must have basic Linux administration skills and multi-OS familiarity (Windows / macOS).
Creativity to go beyond current tools to deliver the best solution to the problem.
Experience producing to and consuming from Apache Kafka topics.