Job Description (9+ years of experience required)

· Work in the digital delivery organization, partnering with vendors to deliver data integration solutions using tools such as Databricks, custom Python development, Azure containers, and other Azure toolkits. Must have strong, continuously evolving technical mastery of RESTful API development in Python.
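
The RESTful API work called out above can be sketched as follows. This is a minimal illustration using only the Python standard library; a production service would more likely use a framework such as FastAPI or Flask, and the `/orders` resource and its data are hypothetical.

```python
# Minimal sketch of a RESTful JSON endpoint (GET /orders/<id>),
# standard library only; resource name and data are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

ORDERS = {"1001": {"id": "1001", "status": "shipped"}}  # hypothetical data

class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route GET /orders/<id> to a JSON payload, or return 404.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "orders" and parts[1] in ORDERS:
            body = json.dumps(ORDERS[parts[1]]).encode()
            self.send_response(200)
        else:
            body = json.dumps({"error": "not found"}).encode()
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def make_server(port=0):
    # Port 0 asks the OS for any free port; useful for local testing.
    return HTTPServer(("127.0.0.1", port), OrderHandler)
```

In practice the same resource-per-URL, JSON-in/JSON-out shape carries over directly to a framework-based implementation.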

· 70% - Building data pipelines using Spark jobs for ETL development and data integration with internal and external systems; enhancing existing data sources and integrations; developing custom APIs for inbound/outbound data integration; performing data analysis to generate aggregations.
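
The extract-transform-aggregate shape of those pipelines can be sketched as below. A Databricks/Spark job would express the same steps with the DataFrame API (`spark.read...`, `df.groupBy(...).agg(...)`); plain Python is used here only to show the shape, and the record fields are hypothetical.

```python
# Extract-transform-aggregate sketch; field names are illustrative.
from collections import defaultdict

def extract(rows):
    # "Extract": a real pipeline would read from a source system
    # (JDBC, ADLS, an API) rather than an in-memory list.
    return list(rows)

def transform(rows):
    # "Transform": normalize fields and drop malformed records.
    cleaned = []
    for row in rows:
        if row.get("amount") is None:
            continue
        cleaned.append({"region": row["region"].strip().upper(),
                        "amount": float(row["amount"])})
    return cleaned

def aggregate(rows):
    # "Aggregate": total amount per region, analogous to
    # df.groupBy("region").sum("amount") in Spark.
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)
```

The same three stages map one-to-one onto a Spark job's read, withColumn/filter, and groupBy/agg steps.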

· 10% - Code deployments, DevOps, and code reviews

Specific Knowledge & Skills needed for this position

· Expert knowledge of big data technologies, including but not limited to Python and Databricks

· Strong analytical and problem-solving skills

· Working knowledge of cloud platforms, particularly Azure

· Proficiency in API security frameworks, token management, and user access control, including OAuth, JWT, etc.
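
The JWT portion of that skill can be illustrated as below: minting and verifying an HMAC-SHA256-signed token using only the standard library. Production code would normally use a vetted library such as PyJWT; the secret and claims here are illustrative.

```python
# Sketch of JWT minting/verification (HS256), standard library only.
import base64, hashlib, hmac, json, time

def _b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding for each segment.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def mint_token(claims: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_token(token: str, secret: bytes) -> dict:
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    padded = payload + "=" * (-len(payload) % 4)
    claims = json.loads(base64.urlsafe_b64decode(padded))
    if claims.get("exp", 0) < time.time():
        raise ValueError("token expired")
    return claims
```

In an OAuth flow, the authorization server mints such a token and the API verifies the signature and expiry on every request before applying access control.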

· Solid understanding of relational and NoSQL database principles
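
As a small illustration of the relational principles above (normalized tables, a foreign key, and a join), here is a sketch using the standard-library sqlite3 module; the table names and data are illustrative. A NoSQL document store would typically denormalize the same data into a single document per customer instead of joining at query time.

```python
# Normalized schema with a foreign key, reassembled via a join.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        total REAL NOT NULL
    );
""")
conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(10, 1, 99.5), (11, 1, 0.5)])

# The join recombines normalized rows at query time.
row = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id
""").fetchone()
```

The trade-off sketched here (normalize and join vs. denormalize per document) is the core relational-vs-NoSQL design decision.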