Job Description:
Position: Sr. Data Engineer

Location: Austin, TX 78759 (Remote)
Duration: 6+ months contract

Interview: Phone and Video

 

Required Skills:

Top 5 Must Haves:

  • Understanding of ETL and ELT processes
  • Data Warehousing experience, including Snowflake or other data warehousing tools
  • AWS/Cloud experience
  • Programming experience with data sets in Python, Scala, Java or C#
  • Apache Spark experience
  • 6+ years of experience


 
Job Overview
As a Senior Data Engineer, you will be a technical leader in a collaborative team environment that encourages you to perform at your best while contributing to the engineering efforts of one of our data scrum teams. You will be challenged to engineer right-sized solutions for complex business problems, focused on building patterns to support data pipelines, data warehousing, and data transformations. You will apply your knowledge of modern software design, best practices, design patterns, and frameworks, with an understanding of application performance and maintainability. You will aspire to use new technologies and challenge yourself to develop innovative solutions. You will work alongside developers and technical leads on a team where collaborative programming and mentoring are regularly practiced.
 
Technologies we use 

  • .NET Core, C#, Python
  • Oracle, Snowflake
  • Apache Spark
  • AWS
    • Compute: EC2, Lambda
    • Containers: ECS, EKS, Fargate, ECR
    • Data: S3, RDS, Aurora, DynamoDB, ElastiCache, DMS
    • Analytics: EMR, Elasticsearch
    • Networking: VPC, Route 53, API Gateway, Direct Connect
    • DevOps: CodeBuild, CodePipeline, CloudWatch
    • Messaging: SQS, SNS
  • Terraform
  • Tableau

 
Your Role:

  • Display a high level of critical thinking in bringing success to the organization
  • Influence technical solutions while coaching newer or less experienced members on your Scrum team
  • Construct and manage services that extract data from disparate databases, transform the data, and load it into a Snowflake data warehouse
  • Collaborate with team members on best practices, code reviews, internal tools and process improvements
  • Evangelize new ideas within your team as well as across teams
  • Develop high-performing, scalable, and secure solutions
  • Plan and deliver core technology upgrades
  • Diagnose, design, and implement solutions to key technology or application problems

 
 
Qualifications:
Required 

  • 5+ years of professional software development experience
  • BA/BS degree in Computer Science or related field (or equivalent work experience in lieu of education)
  • Understanding of ETL and ELT processes
  • Data Warehousing experience, including Snowflake or other data warehousing tools
  • AWS/Cloud experience
  • Programming experience with data sets in Python, Scala, Java or C#
  • Understanding of SQL, relational databases, columnar data warehouses, and data modeling
  • Apache Spark experience
  • Experience with automated infrastructure tooling
  • A history of taking applications from conception/design to implementation/support
  • Experience designing and implementing applications with highly optimized and scalable architectures

 
 Preferred 

  • Experience with CDC (Change Data Capture) and the Parquet file format
  • Experience with data replication tools (Attunity, Delta Lake, or Hudi)
  • AWS Data Migration Services (DMS)
  • Amazon EMR
  • Experience developing on Oracle databases
  • Experience monitoring and tuning EMR/Spark/YARN
  • Experience working with/implementing CI/CD pipelines
  • Experience with Terraform
  • Familiarity with BI tools such as Tableau
  • Strong understanding of industry development and deployment processes and agile development methodologies
  • Advanced degree in Computer Engineering/Science or related field 
             
