Job Description:

As a Data Engineer, you will help develop data warehouse and data flow functions on Google Cloud. The ideal candidate is a proficient warehouse developer versed in data integration development, report development, data analysis, and the GCP big data stack.


  • Build Dataflow jobs in GCP: support the technology tools, systems, capabilities, processes, and financials that enable delivery and drive business results across the enterprise.
  • Build and support the Analytics & Data Lake; ensure communications, both formal and informal, are clear and aligned with supported functions and related stakeholders.
  • Technology/Business Plan Development: support the execution of technology improvements that drive capacity within the enterprise.
  • Build Capability to Drive Growth and Eliminate Waste: deploy tools, processes, and resources to support the enterprise.


Experience operationalizing Google BigQuery, including security

Experience with all functional aspects of Google BigQuery/Bigtable

Experience with Google Dataflow

Experience with Bigtable

Experience with PostgreSQL

Experience setting up secure access (ACLs, IAM roles) to Google Cloud Storage

Experience with writing Cloud Functions

Experience with Cloud Pub/Sub setup and configuration.
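As a small illustration of the SQL aggregation work this role calls for, here is a hedged sketch using Python's built-in `sqlite3` module. SQLite stands in for PostgreSQL/BigQuery here purely so the example is self-contained; the `orders` table and its columns are invented for the demonstration.

```python
import sqlite3

# In-memory SQLite database stands in for PostgreSQL here; the
# `orders` table and its columns are invented for this example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
)

# ANSI SQL aggregation: total spend per customer, largest first.
rows = conn.execute(
    """
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # [('alice', 150.0), ('bob', 75.5)]
conn.close()
```

The same `GROUP BY` / aggregate pattern carries over directly to PostgreSQL and BigQuery SQL, only the connection layer changes.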

Years and Type of Experience

  • 3-5+ years of experience with data ingestion and storage systems for big data environments, using at least one COTS integration tool (such as SnapLogic, webMethods, TIBCO, Talend, or Informatica) and/or custom scripting in Python/Java
  • 2+ years of experience (must have) using Apache Beam, Google Dataflow, or Apache Spark to create end-to-end data pipelines
  • 2+ years of data engineering experience with big data environments, writing map-reduce jobs in Java, Scala, or Python
  • 5+ years of web service and API integration experience using REST APIs, JSON, Node.js, and Python
  • Experience writing complex HIVE SQL and ANSI SQL queries for data aggregation and transformation
  • Experience with continuous build, deployment, and team development environments such as Jenkins, Chef, Puppet, and JIRA
  • Experience troubleshooting and taking responsibility for small features, from design to user delivery
  • Enthusiasm for the field and for professional development outside the day-to-day job
  • Evidence of successfully managing diverse project teams (e.g., outsourced, multi-vendor, or geographically distributed)
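As a toy illustration of the map-reduce pattern called out above, here is a plain-Python word count using only the standard library. A real job would of course run on Hadoop, Spark, or Dataflow; the function and variable names here are invented for the sketch.

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Map step: emit a (word, 1) pair for every word in a line."""
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    """Reduce step: sum the counts per word (the shuffle is implicit here)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["the quick brown fox", "the lazy dog", "The fox"]
pairs = chain.from_iterable(map_phase(line) for line in lines)
word_counts = reduce_phase(pairs)
print(word_counts["the"])  # 3
```

In a distributed framework the mappers run in parallel over input splits and the framework groups pairs by key before the reducers run, but the map and reduce functions themselves look much like these.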


Skills and Abilities

  • 2-3 years' cloud experience connecting to and integrating with at least one platform (Google Cloud, Amazon AWS) and/or various data providers (e.g., Facebook or Twitter API integration), plus big data technologies (Hadoop, MapReduce)
  • Expert-level knowledge of at least two of: relational databases, analytical databases, and NoSQL databases
  • Expert knowledge in SQL development
  • Expertise in building data integration and preparation tools using cloud technologies (such as SnapLogic, Google Dataflow, Cloud Dataprep, or Python)
  • 2-3 years' experience designing and building medium- to large-scale data-centric applications
  • Experience programming in Java, Python, etc.
  • Excellent understanding of software design and programming principles
  • Hands-on experience with Google BigQuery
  • Able to write cost-optimized queries and UDFs
  • Experience with ETL using BigQuery
  • At least some experience with the Linux shell
  • Familiarity with Google Compute Engine and Google Cloud Storage
  • Hands-on experience with at least one of Google Dataflow, Dataproc, or ML Engine
  • Familiarity with at least one visualization tool, such as Tableau, Power BI, or QlikView
  • Intermediate Excel skills


  • BS in Computer Science or Engineering and 5+ years of overall relevant experience in application programming and integration
  • MS in Computer Science or Engineering is nice to have
