Job Description:
GCP (Google Cloud Platform) Data Engineer
Location: 100% Remote (CST hours)
The primary skills required are Python data engineering and Google Cloud Platform experience.
Experience: Minimum 10 years required
Rate: $80/hr C2C

Description
As part of the client's Data Engineering team, you will architect and deliver highly scalable, high-performance data integration and transformation platforms. The solutions you work on will span cloud, hybrid, and legacy environments, requiring a broad and deep stack of data engineering skills. You will use core cloud data warehouse tools, Hadoop, Spark, event-streaming platforms, and other data management technologies. You will also engage in requirements and solution concept development, which calls for strong analytical and communication skills.
Responsibilities
  • Function as the solution lead for building data pipelines to support the development and enablement of Information Supply Chains within our client organizations. This could include building (1) data provisioning frameworks, (2) data integration into the data warehouse, data marts, and other analytical repositories, (3) integration of analytical results into operational systems, and (4) development of data lakes and other data archival stores.
  • Leverage data integration tool components to develop efficient solutions for data management, data wrangling, data packaging, and integration. Develop the overall design and determine the division of labor across the various architectural components.
  • Deploy and customize Daman Standard Architecture components
  • Mentor client personnel. Train clients on the Daman Integration Methodology and related supplemental solutions
  • Provide feedback and enhance Daman intellectual property related to data management technology deployments
  • Assist in the development of task plans, including schedule and effort estimation

Qualifications
  • Strong experience with ETL patterns to load various analytical data stores (DW, ODS, data marts, data lakes, etc.)
  • Deep understanding of building ETL frameworks using Python
  • Minimum 5 years of hands-on experience with Python
  • 1+ years of experience working with BigQuery or Dataproc, and demonstrated experience with Google Cloud
  • Experience building high-performance, scalable distributed systems
  • Bachelor’s Degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems or Engineering is required.