Databricks Developer

Job Type: Full-time

Location: Sydney

Delivery Centric is seeking an experienced Databricks Developer to join our data engineering team. The role suits someone with a strong background in designing, developing, and optimizing data solutions on the Databricks platform, with hands-on expertise in Lakehouse architecture, Delta tables, Apache Spark, and modern data engineering practices. You will work closely with cross-functional teams to gather requirements, build scalable pipelines, and ensure data solutions are secure, high-performing, and enterprise-ready.

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and solutions on the Databricks platform.
  • Implement and optimize Lakehouse architecture using Delta tables with schema evolution, ACID transactions, and version history.
  • Integrate Databricks with ETL and orchestration tools to support end-to-end data workflows.
  • Develop advanced data processing solutions using Spark SQL, DataFrame API, and Python/Scala.
  • Utilize Databricks native features including Workflows, Delta Live Tables, Unity Catalog, and job orchestration.
  • Configure and tune Databricks workspaces and clusters for performance and cost efficiency.
  • Implement security best practices across data access, metadata management, and logging.
  • Build and support Spark Structured Streaming pipelines using Auto Loader, integrating with multiple streaming platforms.
  • Contribute to data modeling activities across conceptual, logical, and physical layers.
  • Collaborate with architects, analysts, and platform teams to ensure high-quality solution delivery.

Qualifications

  • 5+ years of IT experience, with 3+ years of hands-on Databricks development.
  • Strong proficiency in Delta tables, Lakehouse architecture, schema evolution, and ACID principles.
  • Expertise in Apache Spark, DataFrame API, Spark SQL, and coding in Python or Scala.
  • Hands-on experience with Databricks features such as Unity Catalog, Workflows, and Delta Live Tables.
  • Solid understanding of workspace and cluster configuration, performance tuning, and cost optimization.
  • Experience with Structured Streaming and Auto Loader integrations.
  • Knowledge of data modeling best practices.
  • Nice to have: exposure to AI/GenAI, and Databricks or cloud certifications (Associate/Professional).

At Delivery Centric, you will work on innovative, large-scale data transformation initiatives that shape the future of digital enterprises. We provide a collaborative environment where you can expand your expertise in Databricks, cloud technologies, and advanced data engineering while contributing to impactful business outcomes.