Job Description:

Job Summary:

The Azure Databricks DevOps resource will create and maintain a Continuous Integration/Continuous Delivery (CI/CD) pipeline for the Azure Databricks environment in Microsoft Azure. Initially, the resource will be part of the development team, then transition to the Operations team after go-live to support the Databricks DevOps pipeline and operations.


Essential Job Functions:

  • Create Continuous Integration/Continuous Delivery pipelines for Azure Databricks using Azure DevOps
  • Configure build agents and set up the pipeline
  • Develop code and unit tests in Azure Databricks
  • Develop, build, and release
  • Run automated tests
  • Operate: programmatically schedule data engineering, analytics, and machine learning workflows
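As a rough illustration of the kind of pipeline described above, the following azure-pipelines.yml sketch builds, tests, and deploys Databricks notebooks via Azure DevOps. All names here (notebook paths, the tests/ directory, the variable group supplying DATABRICKS_HOST and DATABRICKS_TOKEN) are hypothetical, not taken from this posting:

```yaml
# Illustrative azure-pipelines.yml; paths and variable names are assumptions.
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest        # Microsoft-hosted build agent

stages:
  - stage: Build
    jobs:
      - job: TestNotebooks
        steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: '3.10'
          - script: |
              pip install pytest databricks-cli
              pytest tests/     # run unit tests for notebook code
            displayName: Run unit tests

  - stage: Release
    dependsOn: Build
    jobs:
      - job: Deploy
        steps:
          - script: |
              # Copy notebooks into the target workspace; DATABRICKS_HOST and
              # DATABRICKS_TOKEN are assumed to come from a pipeline variable group.
              databricks workspace import_dir notebooks /Shared/etl --overwrite
            displayName: Deploy notebooks to Databricks
```

A real pipeline would typically add environment-specific release stages and approvals, but the build/test/release shape matches the functions listed above.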


Required Skills and Experience:

  • 1 year of Azure Databricks DevOps experience
  • 1 year of Azure Databricks development experience
  • 2 years of Azure experience
  • 2 years of Spark experience
  • 2-3 years of Python or R programming experience
  • Experience with Spark query tuning and performance optimizations.
  • Experience in data pipeline development in Spark using DataFrames, Structured Streaming, and Delta Lake
  • Strong data visualization skills and experience with related Python libraries
  • Experience working with different file formats (Parquet, Avro, JSON, CSV) across distributed storage

Good to Have:

  • Experience in eliciting, architecting and documenting business logic, ETL architecture, exception handling and fault tolerances in data pipelines.
  • Experience working with messaging systems (Azure Event Hubs, Kafka, or similar)
  • Experience working with version control systems (GitHub or similar) and DevOps methods
  • Knowledge of Big Data ML toolkits such as Azure ML, Spark ML, or H2O is helpful
  • Experience designing and developing with NoSQL and relational databases (e.g., MongoDB, Azure Cosmos DB)

Other Responsibilities:

  • Document and maintain project artifacts.
  • Suggest best practices and implementation strategies.
  • Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
  • Other duties as assigned.

Minimum Qualifications and Job Requirements:
Education: Must have a bachelor’s degree in Computer Science or a related IT discipline.



  • Must have at least 5 years of IT development and Operations experience.
  • Must have 3+ years of relevant professional experience working with Azure, Big Data, or ETL.
  • Must have experience integrating web services.
  • Knowledge of standard software development methodologies such as Agile and Waterfall
  • Strong communication skills.

Must be willing to flex work hours to support application launches and manage production outages as necessary.


Specific Knowledge, Skills, and Abilities:

  • Ability to multitask with numerous projects and responsibilities
  • Experience working with JIRA and wikis
  • Must have experience working in a fast-paced dynamic environment.
  • Must have strong analytical and problem-solving skills.
  • Must have excellent verbal and written communication skills.
  • Must be able and willing to participate and contribute as an individual as needed.

Must have ability to work the time necessary to complete projects and/or meet deadlines.

