Job Description:
Technical Experience
Cloud platform technologies such as Microsoft Azure, Amazon Web Services, and Google Cloud
Hadoop distributions such as Cloudera and Hortonworks
Big Data analytics frameworks and query tools such as Spark, Storm, Hive, HBase, Impala, and Hue
Streaming data tools and techniques such as Kafka, AWS Kinesis, and Azure Stream Analytics
ETL (Extract-Transform-Load) tools such as Pentaho, Talend, Informatica, and StreamSets; also experience with ELT (Extract-Load-Transform)
Continuous delivery and deployment using Agile methodologies
Data warehouse and data mart design and implementation
NoSQL environments such as MongoDB, Cassandra
Data modeling of relational and dimensional databases
Metadata management, data lineage, data governance, especially as related to Big Data
Techniques and processes for structured, semi-structured, and unstructured data

Qualifications
10+ years of engineering and/or software development experience, with demonstrable architecture experience in a large organization
5+ years of combined architecture experience across these environments: data warehouse, data mart, business intelligence, and Big Data
5+ years of consulting experience desired
3+ years of hands-on experience with Big Data components/frameworks, including Hadoop/HDFS, Spark, Storm, HBase, Pig, Hive, Scala, Kafka, Python scripts, and Unix shell scripts
3+ years of hands-on experience configuring and implementing solutions on cloud platforms such as Azure, AWS, or Google Cloud
Experience in architecture and implementation of large and highly complex projects
History of working successfully with cross-functional engineering teams
5+ years of experience in one of the following business domains: Manufacturing, Cable/Telecom, Finance, and Supply Chain
Demonstrated ability to communicate highly technical concepts in business terms and articulate the business value of adopting Big Data technologies