Job Description:

Sr. Data Engineer – 6-month contract, CA (Remote)


The Snowflake consolidation and foundation work is almost complete; the team is now building data marts and data lakes.

Must have:


Python – 5+ years – used for warehousing pipelines and connectivity

SQL – 5+ years – transforming data from a variety of sources

AWS – 3+ years – core AWS experience (e.g., EC2, S3) is required; hands-on work with Lambda functions and DynamoDB Streams is a plus but not a deal breaker


Nice to haves:

Redshift – 1+ years

Experience building data pipelines – familiarity with the ELT vs. ETL trade-offs is a plus

Apache Airflow

dbt – candidates with at least 1 year of experience are a top priority


Interview Process

  1. Recruiter phone screen – verify experience with concrete examples, plus a technical screen: walking through Python code, giving some Git/Bash commands, and solving a riddle. 
  2. Hiring manager interview – culture and communication fit, plus whiteboard SQL challenges (retrieving and transforming data). 


