90% Off Udemy Coupon - CourseSpeak

CI/CD with Databricks Asset Bundles (DAB)

Build production-grade deployment pipelines with Databricks Asset Bundles. Package your Project as Code!

$9.99 (90% OFF)
Get Course Now

About This Course

Are you ready to put DevOps and CI/CD to work in your Databricks deployments?

In this course, you'll become an expert in Databricks Asset Bundles, the official "workspace-as-code" framework that brings true DevOps to your analytics platform. You'll learn to bundle notebooks, jobs, pipelines, cluster specs, infrastructure, and workspace configurations into a single, versioned package, and then automate its validation, testing, and multi-stage deployment through CI/CD pipelines. No more one-off clicks or hidden drift: just repeatable, reliable releases.

High-Level Curriculum Overview

1. Introduction & Core Concepts
  • Get oriented with Databricks Asset Bundles and CI/CD concepts. Review the course goals, the "infinite delivery loop," and where to find code samples for each hands-on module.

2. Environment & Setup
  • Provision your Azure Databricks workspaces, configure VS Code, install the Databricks CLI, and prepare Databricks Connect for IDE-driven development.

3. Asset Bundles Fundamentals
  • Learn the core databricks bundle commands (init, validate, deploy, run, and destroy) and how to define, version, and manage your analytics project in databricks.yml.

4. Local Development & Unit Testing
  • Integrate PyTest for unit and integration tests, run tests via CI or Databricks Connect, and generate coverage reports to enforce quality gates.
  • Understand how to switch between local PySpark for rapid unit testing and Databricks Connect for executing and debugging code on real clusters, ensuring parity between your IDE and the cloud.

5. Hands-On Projects
  • Apply your knowledge in three practical hands-on projects:
  • Notebook ETL pipelines (Bronze → Silver → Gold)
  • Python script tasks and .whl-packaged jobs
  • Delta Live Tables streaming pipelines

6. Git Integration & CI/CD Pipelines
  • Onboard your project to Git, adopt branch-based workflows, and author GitHub Actions or Azure Pipelines to automate builds, tests, staging deployments (with approval), and production rollouts.

By the end of this course, you'll have an automated end-to-end CI/CD process for your entire Databricks environment.
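To give a feel for the databricks.yml file the fundamentals module centers on, here is a minimal sketch of a bundle definition with dev and prod targets. The bundle name, workspace URLs, job name, and notebook path are placeholders for illustration, not taken from the course:

```yaml
# databricks.yml - minimal illustrative bundle definition (all names are placeholders)
bundle:
  name: my_etl_project

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net  # placeholder dev workspace

  prod:
    mode: production
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net  # placeholder prod workspace

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./src/etl_notebook.py
```

From a definition like this, the lifecycle is driven entirely by the CLI commands the course covers: `databricks bundle validate` checks the configuration, `databricks bundle deploy -t dev` pushes everything to the chosen target, `databricks bundle run nightly_etl` triggers the job, and `databricks bundle destroy` tears the deployment down.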

What you'll learn:

  • Package notebooks, jobs, and configurations as versioned code with Databricks Asset Bundles
  • Create automated CI/CD pipelines that deploy reliably from development to production
  • Build and distribute custom Python packages for use in your Databricks environment
  • Implement unit testing and validation for Databricks code
  • Set up GitHub Actions workflows for automated builds, tests, and deployments
  • Apply DevOps best practices to Databricks
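As a taste of the GitHub Actions workflows covered above, here is a hedged sketch of a pipeline that tests, validates, and deploys a bundle on every push to main. The workflow name, branch, directory layout, and secret names are assumptions for illustration:

```yaml
# .github/workflows/bundle-ci.yml - illustrative sketch; names and secrets are placeholders
name: bundle-ci

on:
  push:
    branches: [main]

jobs:
  test-validate-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Official GitHub Action that installs the Databricks CLI
      - uses: databricks/setup-cli@main

      - name: Run unit tests
        run: |
          pip install pytest
          pytest tests/

      - name: Validate bundle
        run: databricks bundle validate
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}

      - name: Deploy to staging target
        run: databricks bundle deploy -t staging
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
```

A production rollout would typically be a second job gated behind a GitHub environment with required reviewers, mirroring the staging-with-approval flow the curriculum describes.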