Job Description:
 
 

Databricks Engineer

We are seeking an experienced Databricks Engineer/Administrator with strong expertise in big data and cloud technologies to design, implement, and maintain enterprise-level data solutions. This role involves leading complex projects, providing technical guidance, and ensuring alignment with organizational IT strategies.

 

Key Responsibilities

  • Application Development & Integration (20%)
    • Analyze, design, program, and implement complex applications and projects.
    • Provide expertise on integration of applications across the business.
    • Design, code, test, and debug advanced application programs.
  • Consulting & Technical Leadership (15%)
    • Act as an internal consultant, mentor, and change agent.
    • Collaborate with customers, business analysts, and team members to define requirements.
    • Ensure solutions comply with architectural standards and business strategies.
  • Design & Architecture (15%)
    • Provide design recommendations aligned with long-term IT strategy.
    • Participate in component and data architecture design.
    • Make recommendations for new code development or reuse of existing code.
  • Enterprise Solutions Development (15%)
    • Develop enterprise-level applications and custom integration solutions.
    • Evaluate interrelationships between programs and systems for impact analysis.
  • Standards & Best Practices (15%)
    • Develop programming and development standards.
    • Identify and introduce new data sources and techniques.
    • Ensure timely delivery of application software within budget.
  • Project Leadership & Support (15%)
    • Lead and coordinate complex projects or phases of large projects.
    • Provide guidance to junior programmers/analysts.
    • Troubleshoot and resolve system issues.
  • Testing & Documentation (5%)
    • Test programs, verify logic, debug, and prepare documentation.
 

Required Skills & Abilities

  • Advanced understanding of application development, QA, and integration methodologies.
  • Proficiency in programming languages, platform capabilities, and system analysis.
  • Strong analytical, decision-making, and problem-solving skills.
  • Excellent verbal and written communication skills.
  • Ability to work under pressure and in team environments.
  • Familiarity with project management concepts.
  • Attention to detail and ability to maintain effective working relationships.
 

Technical Requirements

Required Technologies:

  • Big Data technologies
  • Cloud-based technologies (AWS): Step Functions (State Machines), CDK, Glue, CloudWatch, Lambda, CloudFormation, S3, Glacier, DataSync, Lake Formation, AppFlow, RDS PostgreSQL, Aurora, Athena, Amazon MSK
  • Open-source data technologies: Apache Iceberg, Apache Spark
  • Programming languages: TypeScript, Python

Preferred (Nice-to-Have):

  • Amazon Redshift
  • Databricks (Delta Lake, Unity Catalog)
  • Data Engineering and processing using Databricks
  • AI/ML tools: Amazon Bedrock, Amazon SageMaker, RStudio / Posit Workbench, R Shiny / Posit Connect
  • Additional tools: Kafka, Hive, Hue, Oozie, Sqoop, Git/GitHub Actions, IntelliJ, Scala
 

Education & Experience

  • Education: Bachelor's degree in Computer Science, Information Technology, or a related field,
    OR equivalent experience (four years of job-related work experience, or two years of experience plus an associate degree)
  • Experience: Minimum 8 years of experience in application development, systems testing, or related roles.
 

Work Environment

  • Customer-focused, project-oriented programming environment.
  • Fast-paced, multi-platform environment requiring occasional 24x7 support.