Job Description:

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent experience
  • Build and optimize data pipelines and ETL processes
  • Develop and maintain data lakes and warehouses (Azure, AWS, GCP, or Snowflake)
  • Integrate structured and unstructured data from multiple sources
  • Support AI/ML model training, deployment, and monitoring
  • Ensure data quality, integrity, and governance standards
  • Collaborate with data scientists on feature engineering and model readiness

We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.

