Job Description:
- Experience in Data Engineering and Business Intelligence.
- Proficient in IoT and streaming tools such as MQTT, Kafka, and Spark.
- Proficient with AWS services such as S3 and Redshift.
- Experience with Presto and columnar formats such as Parquet/ORC.
- Proficient with Apache Spark and its DataFrame API.
- Experienced in containerization, including Docker and Kubernetes.
- Expert in tools such as Apache Spark, Apache Airflow, and Presto.
- Expert in designing and implementing reliable, scalable, and performant distributed systems and data pipelines.
- Extensive programming and software engineering experience, especially in Java and Python.