Job Description:
Title: Hadoop Specialist
Minimum Requirements:
* Bachelor's or Master's degree in Computer Science
* Minimum of ten (10) years of experience with complex data warehouse, big data, and data lake environments and data analytics solutions
* Expert-level experience in dimensional data modeling
* Experience architecting traditional and big data solutions
* Knowledge and experience with SQL and NoSQL databases
* Experience with Hadoop, Spark, Kafka, HBase, columnar stores, RDBMS, R, stream processing, and the various AWS managed service offerings
* Proficient in Teradata, with hands-on experience in ETL tools and BI tools, particularly MicroStrategy
* Programming capabilities in common languages (Spark, SQL, Python, Bash, PHP, Perl, Java, etc.)
* Experience implementing enterprise and semantic search, NLP, NLTK, Mahout, ML, text mining and analytics (entity extraction, relationship extraction, dependency parsing, taxonomies), ontology modelling, and exploratory data analysis
* Ability to dig deeper into data, understand its characteristics, evaluate alternative models, and validate hypotheses through theoretical and empirical approaches
* Desirable: experience with search technologies (Solr, Lucene)
* Desirable: experience with TensorFlow and blockchain technologies

Soft skills:
* Ability to mentor, coach and build talent
* Influence and change organizational behavior
* Provide leadership and direction when the situation demands
* Able to perform independently with minimal supervision and make firm decisions amid ambiguity
* Ability to influence and communicate with key stakeholders
* Demonstrate a bias for speed and execution
* Drive change management of new data capabilities and architectural direction
* Ability to multitask in demanding environments
* Creative problem solver with sound judgment and high degree of integrity
* Possess critical thinking skills