Title: Data Engineer with GCP

Location: Dallas, TX (Remote for Now)

Duration: 12-Month Contract

Job Description:

  • 4 to 6 years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Databricks, Cloud Dataproc, Cloud Dataflow, Apache Beam, Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub
  • Experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP
  • Active Google Cloud Professional Data Engineer certification or active Google Professional Cloud Architect certification
  • Data migration experience from on-premises legacy systems (Hadoop, Exadata, Oracle, Teradata, or Netezza) to any cloud platform
  • Experience with data lake and data warehouse ETL design and build
  • Experience designing and building production data pipelines, from data ingestion to consumption, within a hybrid big data architecture using cloud-native GCP services, Java, Python, Scala, SQL, etc.
  • Experience implementing next-generation data and analytics platforms on GCP
  • Experience in data engineering, or in data profiling and data warehousing
  • Proficiency in SQL or Spark
