Job Description:

W2 candidates only

Job Title/Role: Senior Data Engineer

Location: Austin, TX (onsite from Day 1; local candidates only)

Experience Level Required: 8+ years in a data engineering role

Mandatory Required Skills:

  • Strong experience with SQL, Python, and other data engineering technologies
  • Experience with cloud computing platforms such as AWS, Azure, or GCP
  • Experience with big data technologies such as Hadoop, Spark, and Kafka
  • Experience with data warehousing and data mart technologies such as Snowflake, Redshift, and BigQuery
  • Experience with data security and governance best practices
  • Excellent problem-solving and analytical skills
  • Strong communication and collaboration skills

Preferred/Desired Skills:

  • Experience with machine learning and deep learning
  • Experience with data visualization tools such as Tableau and Power BI
  • Experience with data streaming technologies such as Kafka and Flink

Job Description:

We are seeking a Senior Data Engineer to join our team in Austin, TX, and help build and maintain our data infrastructure and pipelines. As a Senior Data Engineer, you will play a key role in designing, developing, and implementing scalable, reliable data solutions that meet our business needs. You will collaborate closely with data scientists and analysts to understand their data requirements and deliver effective solutions.

Key Responsibilities:

  • Design, develop, and implement scalable and reliable data pipelines and architectures.
  • Collaborate with data scientists and analysts to understand their data needs and develop solutions accordingly.
  • Build and maintain data warehouses and data marts.
  • Optimize data pipelines and architectures for performance and efficiency.
  • Implement data security and governance measures.
  • Monitor and troubleshoot data pipelines and architectures.
  • Stay up to date on the latest data engineering technologies and trends.

             
