Job Description:
Primary Skills: PySpark, Databricks, Databricks Lakehouse
• 12+ years of overall experience in data engineering, cloud architecture, or analytics platforms.
• Strong expertise in the AWS big-data stack and Databricks Lakehouse.
• Hands-on experience building enterprise-grade data platforms and multi-zone Lakehouse architectures.
• Deep knowledge of:
  o Spark optimization
  o Delta Lake internals
  o Data modeling for Customer 360 / MDM
  o Data quality & observability
• Strong understanding of zero-trust security, compliance, and enterprise governance.
• Strong scripting/programming skills in Python, PySpark, and SQL.