Job Description:
Data Engineer
Remote
  • Develop EL/ELT/ETL pipelines that make data from disparate batch and streaming sources available in the BigQuery analytical data store for the Business Intelligence and Analytics teams
  • Work with on-prem data sources (Hadoop, SQL Server), understand the data model and the business rules behind the data, and build data pipelines (with GCP and Informatica) for one or more business verticals, landing the data in GCP BigQuery
  • Build cloud-native services and APIs to support and expose data-driven solutions.
  • Partner closely with our data scientists to ensure the right data is made available in a timely manner to deliver compelling and insightful solutions.
  • Design, build and launch shared data services to be leveraged by the internal and external partner developer community.
  • Build out scalable data pipelines, choosing the right tool for each job; manage, optimize, and monitor data pipelines
  • Provide extensive technical and strategic advice and guidance to key stakeholders on data transformation efforts, and understand how data creates value for the enterprise
  • Ability to work in a team environment, sharing ideas and working collaboratively
  • Strong organizational and analytical skills
  • Excellent interpersonal and written/verbal communication skills
  • Act as a self-starter with the ability to take on complex projects and perform independent analysis

Must have skills:

  • 3+ years of hands-on experience in GCP with expertise in components such as Dataflow, BigQuery, Airflow, GCS
  • 3+ years of experience in Python
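
To illustrate the kind of pipeline work described above, below is a minimal sketch of loading a batch extract from GCS into BigQuery using the Python client library (google-cloud-bigquery). The project, bucket, dataset, and table names are placeholders invented for the example, not part of this posting.

    from google.cloud import bigquery

    # Placeholder project, bucket, dataset, and table names -- for illustration only.
    PROJECT = "example-project"
    TABLE = f"{PROJECT}.analytics.orders"

    client = bigquery.Client(project=PROJECT)

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    # Load a batch extract already landed in GCS into the BigQuery table.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/extracts/orders.csv",
        TABLE,
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes
    print(f"Loaded {client.get_table(TABLE).num_rows} rows into {TABLE}")

In practice a load like this would typically be scheduled and monitored from an Airflow DAG (for example via Cloud Composer) rather than run as a standalone script.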
             
