Roles & Responsibilities:
- Design, develop, and maintain scalable and robust ETL/ELT processes and data pipelines using various tools and technologies.
- Build and optimize data warehouses, data lakes, and other data storage solutions to support analytical and operational needs.
- Implement data quality checks and monitoring to ensure the accuracy, completeness, and consistency of data.
- Work with large datasets, performing data modeling, schema design, and performance tuning.
- Create data models that are easy for BI tools to consume, and build dashboards on top of them.
Qualifications needed:
- Proficiency in Python, SQL, and data engineering concepts.
Technologies used:
- Internal tools
Platform/Tools used:
- Google Cloud Platform, BigQuery.
We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.