Job Requirements:
• 12+ years of experience in data engineering, with at least 3-5 years focused on MarTech, CDP, and data warehousing.
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
• Hands-on experience with Snowflake cloud data platform, including data ingestion, transformation, and orchestration.
• Strong background in building and maintaining data warehouse solutions on Snowflake.
• Proficiency in SQL, Python, or other programming languages for data processing and automation
• Experience with ETL/ELT tools, data pipeline development, and Apache Airflow workflow management
• Proficiency in real-time data processing (Spark Streaming, Flink, Kafka Streams).
• Experience with cloud data warehouses (Snowflake) and data lakes (Delta Lake, Iceberg)
• Familiarity with NoSQL (MongoDB, Cassandra) and key-value stores (Redis, DynamoDB) is highly desirable.
• Experience with batch & streaming pipelines (Kafka, Kinesis, Pub/Sub).
• Experience with Azure cloud services, including Azure Event Hubs, and their integration with Snowflake
• Understanding of marketing technologies, customer data platforms, and data integration challenges
• Knowledge of data quality, data governance, and security practices in data engineering
• Strong problem-solving skills and the ability to optimize data processes for performance and scalability
• Good communication and teamwork skills to collaborate with data architects, analysts, and marketing teams
• Relevant certifications (e.g., Snowflake, Azure Cloud and Big Data) are a plus