Job Description:
10+ years of experience in data warehousing, operational data stores, and large-scale architecture and implementation.
Experience crafting solutions and developing proposals that illustrate the business value of technical solutions.
Excellent communication and presentation skills.
Previous professional consulting experience.
Expertise in Big Data technologies in the Hadoop ecosystem: Hive, HDFS, MapReduce, YARN, Kafka, Pig, Oozie, HBase, Sqoop, Spark, etc.
Hands-on experience with Scala, Python, and R.
A strong understanding of data analytics and visualization.
Experience with newer cloud-based IM solutions and traditional EIM architectures (preferred).
The ability to work with business groups to understand requirements, workflows, and system trade-offs.
The ability to provide strategic and architectural direction to address unique business problems.
System design and modeling skills (e.g., domain-driven design, data modeling, API design).
Knowledge of message-based patterns for orchestration and integration.
Knowledge of concurrency control and reliability in distributed systems.