Job Description:
Data Engineer
Location: Charles Schwab in Arizona
We need 5 GCP/Data Engineers for a TEKsystems Global Services project team at Charles Schwab in Arizona.
Brief details below…
GCP security and Terraform experience are important for these roles. Full job descriptions will be shared today or tomorrow once they are completed.
Target start date is February 8th.
Initial length is 5 months, with a high likelihood of extension.

As a Data Engineer, you will lead the delivery of projects end to end, including requirements gathering and data analysis, perform data engineering activities, and work with users on data validation.
5+ years of experience in data warehousing or data analytics
5+ years of hands-on experience setting up data engineering solutions on Google Cloud
5+ years of strong hands-on programming experience (Python, Spark)
Google Professional Cloud Data Engineer certification is a plus
Requisite Abilities and/or Skills:
Perform requirements gathering with business users and SMEs, and build a plan
Perform ETL and data engineering work leveraging multiple Google Cloud components, including Cloud Dataflow, Cloud Dataproc, and Google BigQuery
Knowledge of data modeling and reporting using Google BigQuery
Knowledge of building data pipelines following GCP best practices
Strong understanding of Kubernetes and Docker containers, and the ability to deploy GCP services
Experience writing code to extract and transform data from multiple data sources
Experience with ETL tools such as Informatica
Experience with scheduling tools such as Airflow and Cloud Composer
Experience with CI/CD automation pipelines facilitating automated deployment and testing
Excellent verbal and written communication, problem-solving, and interpersonal skills.
Experience in data catalog and metadata management.
Experience with JIRA or other project management tools
Deliver comprehensive end-to-end documentation along with code samples
Experience with one or more scripting languages or cloud solutions is a plus
Key Accountabilities and Priorities:
Build and operationalize pipelines, including data acquisition, staging, integration of new data sources, cataloging, cleansing, batch and stream processing, transformation, and consumption
Independently work on assigned projects and foster a collaborative environment for a high-performing team
Gather business requirements, review business priorities, and analyze options and risks
Quickly understand and formulate application-level requirements pertaining to complex workflows or integration with external applications
Work Location:
- TEKsystems Dallas office
Additional Information:
Bachelor’s degree; Master’s is a plus.
8+ years’ experience in Information Technology and/or IT Professional Services.