Job Description:
  • Bachelor’s degree in CS or a related technical field.
  • 8+ years of experience in data modeling, data development, and data warehousing.
  • Experience working with Big Data technologies (Hadoop, Hive, Spark, Kafka, Kinesis).
  • Experience with large-scale batch and streaming data processing systems (Hadoop, Spark, Kinesis, Flink).
  • Experience programming in Python, Java, or Scala.
  • Experience with data orchestration tools (Airflow, Oozie, Step Functions).
  • Solid understanding of database technologies including NoSQL and SQL.
  • Track record of delivering reliable data pipelines with solid test infrastructure, CI/CD, data quality checks, monitoring, and alerting.
