Job Description:
Role: Big Data and Hadoop Ecosystem Developer
Location: Dallas, TX

Required Skills
Solr setup, installation, configuration, and management.
Solr collection and index setup and optimization.
Solr features: filters, query parsers, performance tuning, grouping, and faceted search.
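For context on the faceted-search and grouping skills listed above, a typical Solr select request combines a filter query with facet and group parameters. This is a minimal sketch; the collection name and field names (category, author, publisher) are hypothetical:

```
# /solr/products/select request parameters (illustrative only)
q=*:*
fq=category:books        # filter query narrows the result set
facet=true
facet.field=author       # facet counts per author
group=true
group.field=publisher    # collapse results by publisher
```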
Big data development experience on the Hadoop platform, including Hive, Hive LLAP, Sqoop, Flume, and Spark SQL.
Loading tool-generated data into Hive/Impala tables via Kafka.
Evaluating and selecting the most efficient approach for loading data into Hive or Impala.
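A common target for the Kafka-to-Hive loading task above is a partitioned, columnar Hive table that both Hive and Impala can query. This is a minimal sketch, not a prescribed design; the table, column, and partition names are assumptions:

```sql
-- Hypothetical landing table for Kafka-fed events (names are assumptions).
-- Parquet keeps the table queryable from both Hive and Impala;
-- partitioning by date keeps incremental loads and scans cheap.
CREATE TABLE IF NOT EXISTS events (
  event_id STRING,
  payload  STRING,
  event_ts TIMESTAMP
)
PARTITIONED BY (dt STRING)
STORED AS PARQUET;
```

Whether the table is fed by Spark Structured Streaming, StreamSets pipelines, or periodic batch loads is exactly the kind of trade-off this role would evaluate.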
Understanding of Hadoop architecture, design patterns, and platform management tasks.
Performance analysis and debugging of slow-running development and production processes.
Good knowledge of Python, Java, and shell scripting.
Experience with data modeling, complex data structures, data processing, data quality, and the data lifecycle.
Good experience with StreamSets.
Good knowledge of Scala and Apache Spark.