Job Description:
- Hands-on expertise with big data technologies (HBase, Hive, Sqoop)
- Experience with pub/sub messaging (JMS, Kafka, etc.) and stream processing (Storm, Spark Streaming, etc.)
- Understanding and application of security best practices as they relate to big data technologies
- Experience designing and implementing horizontally scalable, highly available systems, with a focus on performance and resiliency
- Experience profiling, debugging, and performance-tuning complex distributed systems
- Experience with UNIX shell scripts and commands
- Experience with data modeling
- Ability to clearly document solution designs
- Agile/Scrum methodology experience
- Experience with ETL/ELT tools
- Experience with BI solutions (Tableau, MicroStrategy, D3, etc.)

Complexity: Works on complex issues where analysis of situations and data requires an in-depth evaluation of variable factors.