Job Description:
Analyze the latest Big Data analytics technologies and their innovative applications in both business intelligence analysis and new service offerings; bring these insights and best practices to Hitachi Consulting's Insights and Analytics practice
Architect and implement complex IoT data analytic solutions
Stand up and expand data-as-a-service collaborations with partners in the US and other international markets
Develop highly scalable and extensible Big Data platforms which enable collection, storage, modeling, and analysis of massive data sets including those from IoT and streaming data
Drive architecture engagement models and be an ambassador for partnership with IT delivery and external vendors
Effectively communicate complex technical concepts to non-technical business and executive leaders
Lead large and varied technical and project teams
Assist with scoping, pricing, architecting, and selling large project engagements

Technical Experience
Cloud platform technologies such as Microsoft Azure, Amazon Web Services, and Google Cloud
On-premises Big Data platforms such as Cloudera and Hortonworks
Big Data analytics frameworks and query tools such as Spark, Storm, Hive, and Impala
Streaming data tools and techniques such as Kafka, AWS Kinesis, and Azure Stream Analytics
ETL (Extract-Transform-Load) tools such as Pentaho, Talend, or Informatica; also experience with ELT
Continuous delivery and deployment using Agile methodologies
Data warehouse and data mart design and implementation
NoSQL environments such as MongoDB, Cassandra
Data modeling of relational and dimensional databases
Metadata management, data lineage, data governance, especially as related to Big Data
Techniques and processes for working with structured, semi-structured, and unstructured data

Minimum Requirements
Over 10 years of engineering and/or software development experience, with demonstrable architecture experience in a large organization
5+ years of combined architecture experience across data warehouse, data mart, business intelligence, and Big Data environments
5+ years of consulting experience desired
Hands-on experience with Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, HDFS, Pig, Hive, Scala, Kafka, Python scripting, and Unix shell scripting
Experience in architecture and implementation of large and highly complex projects
Deep understanding of cloud computing infrastructure and platforms
History of working successfully with cross-functional engineering teams
Experience in business domains such as Manufacturing, Communications, Finance, and Supply Chain
Demonstrated ability to communicate highly technical concepts in business terms and articulate business value of adopting Big Data technologies
Bachelor's degree from a four-year college or university in a technical field such as Computer Science, Mathematics, Physics, Engineering, or Economics; Master's degree or higher preferred