Job Description:
Responsibilities
Perform architecture design and implementation of IoT and Big Data applications for clients
Develop highly scalable and extensible Big Data platforms which enable collection, storage, modeling, and analysis of massive data sets including those from IoT and streaming data
Drive architecture engagement models and act as an ambassador for partnerships with IT delivery teams and external vendors
Effectively communicate complex technical concepts to non-technical business and executive leaders
Assist with scoping, pricing, architecting, and selling large project engagements

Qualifications
Over 10 years of engineering and/or software development experience and demonstrable architecture experience in a large organization.
At least 5 years of architecture experience across a combination of these environments: Data Warehouse, Data Mart, Business Intelligence, and Big Data.
5+ years of consulting experience desired
Demonstrated experience architecting and implementing IoT and analytical systems across a variety of project scenarios
Hands-on experience with Google Cloud Platform technologies: Cloud IoT Core and Pub/Sub (streaming), Cloud Functions, Dataflow (Apache Beam), Dataproc (Hadoop, Spark, Hive), Cloud Machine Learning, Cloud Datastore and Bigtable, BigQuery, Datalab, and Data Studio
Engineering and/or software development experience and demonstrable architecture experience at enterprise scale
Prior technology consulting experience desired
Experience in architecture and implementation of large and highly complex projects
Deep understanding of cloud computing infrastructure and platforms
History of working successfully with cross-functional engineering teams

Optional additional experience:
Experience with other technologies:
Streaming data tools and techniques such as Apache Kafka, Azure Stream Analytics, and Amazon Kinesis
Cloud gateways, message queues, and messaging hubs, and protocols such as MQTT, XMPP, and CoAP
Big Data platforms, e.g., Cloudera, Hortonworks, MapR
Visualization: Microsoft Power BI, Tableau, Pentaho BA Suite
Big Data analytics frameworks and query tools such as HDInsight, Spark, Storm, Hive, Impala
ETL (Extract-Transform-Load) tools such as Pentaho PDI, Talend, Informatica; experience with ELT also valuable
NoSQL environments such as MongoDB, Cassandra
Machine Learning (R, Scala, Python)
Continuous delivery and deployment using Agile methodologies
Data Warehouse and Data Mart design and implementation
Data modeling of relational and dimensional databases
Metadata management, data lineage, data governance, especially as related to Big Data
Structured, unstructured, and semi-structured data techniques and processes