Job Description:

Candidate Profile:

  • 7–12 years of experience in a Data Engineering role working with Databricks and cloud technologies.
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Strong proficiency in PySpark, Python, SQL.
  • Strong experience in data modeling, ETL/ELT pipeline development, and automation.
  • Hands-on experience with performance tuning of data pipelines and workflows.
  • Proficient with Azure cloud components such as Azure Data Factory, Azure Databricks, and Azure Data Lake.
  • Experience with Delta Lake and data warehousing.
  • Experience with Delta Live Tables, Auto Loader, and Unity Catalog.
  • Preferred: knowledge of the insurance industry and its data requirements.
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
  • Excellent communication and problem-solving skills, with the ability to work effectively with diverse teams and under tight deadlines.

We are an equal opportunity employer. All aspects of employment, including the decision to hire, promote, discipline, or discharge, will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, citizenship/immigration status, veteran status, or any other status protected under federal, state, or local law.

