Job Description:

-        Understanding of REST APIs, big data processing, and rules engines used to orchestrate calls to REST APIs and other data sources such as Kafka, Snowflake, and AWS S3

-        Strong implementation background and experience with AWS services (preferably Lambda, Step Functions, Firehose, Kinesis, EMR, Glue ETL, DynamoDB, Neptune, Route 53, Application Load Balancers, ECS, SQS)

-        Knowledge of corporate governance around cloud service usage and security measures

-        Experience in streaming technologies such as Kafka

-        Experience automating testing and deployment pipelines using DevOps tools such as Jenkins

-        Strong data engineering background with the ability to implement Spark-based, AWS Lambda-based, or Glue-based ETL jobs

-        Experience or familiarity with batch job orchestration tools such as Apache Airflow

-        Experience with big data warehouse platforms such as Snowflake

-        Experience with production support, including being on call to troubleshoot and fix application issues

-        AWS certification preferred

             
