Job Description:

Minimum of 3 years of previous consulting or client service delivery experience
Minimum of 2 years of RDBMS experience
Minimum of 3 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL, data lake and data warehouse solutions
Minimum of 3 years of hands-on experience with GCP, Python, Unix shell/Perl scripting, etc.
Experience providing practical direction within the GCP native ecosystem
Hands-on experience implementing data migration and data processing using GCP services such as:
Data Ingestion: Cloud Pub/Sub, Data Transfer Service, Cloud IoT Core
Data Storage: Cloud Spanner, Cloud Storage, Cloud Datastore, Cloud SQL, Cloud Bigtable, Cloud Memorystore
Streaming Data Pipeline: Cloud Dataflow, Cloud Dataproc, Cloud Dataprep, Apache Beam
Data Warehousing & Data Lake: BigQuery, Cloud Storage
Experience working with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy, etc.
Bachelor's or higher degree in Computer Science or a related discipline.
Nice-to-Have Skills/Qualifications:
DevOps on a GCP platform. Multi-cloud experience is a plus.
Professional Skill Requirements:
Proven ability to build, manage and foster a team-oriented environment
Proven ability to work creatively and analytically in a problem-solving environment
Desire to work in an information systems environment
Good communication (written and oral) and interpersonal skills



Client: Implementer Need

             
