Job Description:

·         Processing data loads into BigQuery from Google Cloud Storage using Google Dataproc.

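As a rough illustration of the GCS-to-BigQuery load above, the helper below composes the equivalent `bq load` CLI command; the project, dataset, table, and bucket names are placeholders, not values from this posting.

```python
def bq_load_command(project: str, dataset: str, table: str,
                    gcs_uri: str, source_format: str = "PARQUET") -> str:
    """Compose the `bq load` CLI command that loads a Cloud Storage
    file into a BigQuery table (all names here are placeholders)."""
    table_id = f"{project}:{dataset}.{table}"
    return f"bq load --source_format={source_format} {table_id} {gcs_uri}"

# Example: load a Parquet export from Cloud Storage into a reporting table.
print(bq_load_command("example-project", "reporting", "sales",
                      "gs://example-bucket/sales/2024-01-01.parquet"))
```

In practice the same load can be issued programmatically via the BigQuery client libraries; the CLI form is shown only because it is self-contained.
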
·         Designing, developing, and delivering data integration, data extraction, and data migration from DataStage to GCP.

·         Scheduling pipelines end to end using the GCP Cloud Composer service.

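Cloud Composer is managed Apache Airflow, so end-to-end scheduling of the kind described above typically means deploying a DAG file. Below is a minimal sketch under assumed names (DAG id, bucket, and destination table are all illustrative), not a definition from this posting.

```python
# Minimal Airflow DAG sketch for a Cloud Composer environment.
# All identifiers (dag_id, bucket, table) are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bq_daily",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",          # run once per day
    catchup=False,
) as dag:
    # Load the day's Parquet drops from a landing bucket into BigQuery.
    load_sales = GCSToBigQueryOperator(
        task_id="load_sales",
        bucket="example-landing-bucket",
        source_objects=["sales/*.parquet"],
        destination_project_dataset_table="example-project.reporting.sales",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )
```

Uploading a file like this to the Composer environment's `dags/` bucket is what "scheduling end to end" usually amounts to in practice.
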
·         Migrating ETL jobs to Google Cloud Platform.

·         Maintaining BigQuery datasets for reporting requirements.

·         Hands-on experience with Google BigQuery, Google Cloud Storage, Google Dataflow, Cloud SQL, Google Cloud Dataproc, Google Pub/Sub, Sqoop, and PySpark.

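Of the tools listed above, Dataproc and PySpark usually meet at job submission: a PySpark script is submitted to an existing cluster with `gcloud`. The helper below composes that command; the cluster, region, and script paths are placeholders.

```python
from typing import Sequence


def dataproc_pyspark_submit(cluster: str, region: str, main_py: str,
                            job_args: Sequence[str] = ()) -> str:
    """Compose the `gcloud dataproc jobs submit pyspark` command that
    runs a PySpark script on an existing Dataproc cluster
    (cluster, region, and script names are placeholders)."""
    cmd = (f"gcloud dataproc jobs submit pyspark {main_py} "
           f"--cluster={cluster} --region={region}")
    if job_args:
        # Arguments after `--` are passed through to the PySpark script.
        cmd += " -- " + " ".join(job_args)
    return cmd


# Example: submit a transform script stored in Cloud Storage.
print(dataproc_pyspark_submit("etl-cluster", "us-central1",
                              "gs://example-bucket/jobs/transform.py",
                              ["--date=2024-01-01"]))
```
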
·         Building and maintaining a data catalog in GCP.

·         Provisioning system-, user-, and data-level security for data in transit and at rest.

·         Expertise in managing and working with large databases, including an understanding of various data structures and common methods of data transformation, data validation, and auditing.

·         Establishing a team of data stewards, data analysts, and data scientists, providing actionable data insights to sales and marketing.

·         Engaging directly with customers’ development teams to understand their specific business and technology challenges in integrating distributed ledgers into new products and services.

·         Conducting one-to-few and one-to-many knowledge-transfer sessions.

·         Engaging with customers and prospects to evaluate products, perform POCs, and effectively communicate the key differentiators to stakeholders.

·         Experience presenting to all job levels in an organization, both technical and non-technical, from C-level to individual contributor.

·         Preparing and delivering customized solution and product demos.

·         Technically coaching and mentoring cross-functional team members.
