Killer-Skills

asset-bundles

v1.0.0
About this Skill

asset-bundles is a Databricks Asset Bundle (DAB) writer that facilitates multi-environment deployment through a structured project layout and configuration files such as databricks.yml and resource definitions. It is ideal for Databricks-focused AI agents that require multi-environment deployment capabilities for Spark Declarative Pipelines.

Features

Creates Databricks Asset Bundles (DABs) for multi-environment deployment
Supports Spark Declarative Pipeline configurations via SDP_guidance.md
Includes SQL Alert schemas for critical alerts with API differences
Utilizes a structured project layout with databricks.yml and resource definitions
Enables deployment across dev, staging, and prod environments
References alerts_guidance.md for SQL Alert configurations

juanlamadrid20
Updated: 3/7/2026

Installation
> npx killer-skills add juanlamadrid20/dbrx-multi-agent-retail-intelligence/SDP_guidance.md

Agent Capability Analysis

The asset-bundles MCP Server by juanlamadrid20 is an open-source community integration for Claude and other AI agents, enabling task automation and capability expansion.

Ideal Agent Persona

Ideal for Databricks-focused AI Agents requiring seamless multi-environment deployment capabilities for Spark Declarative Pipelines

Core Value

Empowers agents to create and manage Databricks Asset Bundles (DABs) for streamlined deployment across dev, staging, and prod environments, leveraging Spark Declarative Pipeline configurations and SQL Alert schemas

Capabilities Granted for asset-bundles MCP Server

Deploying Databricks applications across multiple environments
Managing Spark Declarative Pipelines for consistent workflow execution
Creating and configuring SQL Alerts for critical notifications

Prerequisites & Limits

  • Requires Databricks environment setup
  • Specific to Databricks and Spark Declarative Pipelines
  • Needs careful configuration of databricks.yml and resource definitions
Project files

  • SKILL.md (9.1 KB)
  • .cursorrules (1.2 KB)
  • package.json (240 B)

SKILL.md

Databricks Asset Bundles (DABs) Writer

Overview

Create DABs for multi-environment deployment (dev/staging/prod).

Reference Files

  • SDP_guidance.md — Spark Declarative Pipeline configuration
  • alerts_guidance.md — SQL Alert configuration

Bundle Structure

project/
├── databricks.yml           # Main config + targets
├── resources/*.yml          # Resource definitions
└── src/                     # Code/dashboard files

Main Configuration (databricks.yml)

```yaml
bundle:
  name: project-name

include:
  - resources/*.yml

variables:
  catalog:
    default: "default_catalog"
  schema:
    default: "default_schema"
  warehouse_id:
    lookup:
      warehouse: "Shared SQL Warehouse"

targets:
  dev:
    default: true
    mode: development
    workspace:
      profile: dev-profile
    variables:
      catalog: "dev_catalog"
      schema: "dev_schema"

  prod:
    mode: production
    workspace:
      profile: prod-profile
    variables:
      catalog: "prod_catalog"
      schema: "prod_schema"
```

Dashboard Resources

```yaml
resources:
  dashboards:
    dashboard_name:
      display_name: "[${bundle.target}] Dashboard Title"
      file_path: ../src/dashboards/dashboard.lvdash.json  # Relative to resources/
      warehouse_id: ${var.warehouse_id}
      permissions:
        - level: CAN_RUN
          group_name: "users"
```

Permission levels: CAN_READ, CAN_RUN, CAN_EDIT, CAN_MANAGE

Pipelines

See SDP_guidance.md for pipeline configuration
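SDP_guidance.md is authoritative for pipeline settings; as a rough sketch of the resource shape, a minimal pipeline definition often looks like the following (the pipeline key, variables, and notebook path are illustrative placeholders):

```yaml
resources:
  pipelines:
    my_pipeline:
      name: "[${bundle.target}] My Pipeline"
      catalog: ${var.catalog}   # Unity Catalog catalog for pipeline output
      target: ${var.schema}     # Schema the pipeline writes to
      serverless: true
      libraries:
        - notebook:
            path: ../src/pipelines/transformations.py  # Relative to resources/
```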

SQL Alerts

See alerts_guidance.md - Alert schema differs significantly from other resources

Jobs Resources

```yaml
resources:
  jobs:
    job_name:
      name: "[${bundle.target}] Job Name"
      tasks:
        - task_key: "main_task"
          notebook_task:
            notebook_path: ../src/notebooks/main.py  # Relative to resources/
          new_cluster:
            spark_version: "13.3.x-scala2.12"
            node_type_id: "i3.xlarge"
            num_workers: 2
      schedule:
        quartz_cron_expression: "0 0 9 * * ?"
        timezone_id: "America/Los_Angeles"
      permissions:
        - level: CAN_VIEW
          group_name: "users"
```

Permission levels: CAN_VIEW, CAN_MANAGE_RUN, CAN_MANAGE

⚠️ Cannot modify "admins" group permissions on jobs - verify custom groups exist before use

Path Resolution

⚠️ Critical: Paths depend on file location:

| File Location | Path Format | Example |
|---|---|---|
| resources/*.yml | `../src/...` | `../src/dashboards/file.json` |
| databricks.yml targets | `./src/...` | `./src/dashboards/file.json` |

Why: resources/ files are one level deep, so use ../ to reach bundle root. databricks.yml is at root, so use ./
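For example, the same dashboard file would be referenced in the two locations as follows (paths are illustrative):

```yaml
# In resources/dashboards.yml (one level below bundle root):
file_path: ../src/dashboards/sales.lvdash.json

# In databricks.yml (at bundle root):
file_path: ./src/dashboards/sales.lvdash.json
```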

Volume Resources

```yaml
resources:
  volumes:
    my_volume:
      catalog_name: ${var.catalog}
      schema_name: ${var.schema}
      name: "volume_name"
      volume_type: "MANAGED"
```

⚠️ Volumes use `grants`, not `permissions` (a different format from other resources)
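A minimal sketch of the grants format, assuming the standard DABs principal/privileges shape (verify the exact privilege names against `databricks bundle schema`):

```yaml
resources:
  volumes:
    my_volume:
      catalog_name: ${var.catalog}
      schema_name: ${var.schema}
      name: "volume_name"
      volume_type: "MANAGED"
      grants:
        - principal: "users"   # Group receiving access
          privileges:
            - READ_VOLUME
            - WRITE_VOLUME
```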

Apps Resources

Apps resource support added in Databricks CLI 0.239.0 (January 2025)

Apps in DABs have a minimal configuration - environment variables are defined in app.yaml in the source directory, NOT in databricks.yml.

Generate from Existing App (Recommended)

```bash
# Generate bundle config from existing CLI-deployed app
databricks bundle generate app --existing-app-name my-app --key my_app --profile DEFAULT

# This creates:
# - resources/my_app.app.yml (minimal resource definition)
# - src/app/ (downloaded source files including app.yaml)
```

Manual Configuration

resources/my_app.app.yml:

```yaml
resources:
  apps:
    my_app:
      name: my-app-${bundle.target}  # Environment-specific naming
      description: "My application"
      source_code_path: ../src/app   # Relative to resources/ dir
```

src/app/app.yaml: (Environment variables go here)

```yaml
command:
  - "python"
  - "dash_app.py"

env:
  - name: USE_MOCK_BACKEND
    value: "false"
  - name: DATABRICKS_WAREHOUSE_ID
    value: "your-warehouse-id"
  - name: DATABRICKS_CATALOG
    value: "main"
  - name: DATABRICKS_SCHEMA
    value: "my_schema"
```

databricks.yml:

```yaml
bundle:
  name: my-bundle

include:
  - resources/*.yml

variables:
  warehouse_id:
    default: "default-warehouse-id"

targets:
  dev:
    default: true
    mode: development
    workspace:
      profile: dev-profile
    variables:
      warehouse_id: "dev-warehouse-id"
```

Key Differences from Other Resources

| Aspect | Apps | Other Resources |
|---|---|---|
| Environment vars | In app.yaml (source dir) | In databricks.yml or resource file |
| Configuration | Minimal (name, description, path) | Extensive (tasks, clusters, etc.) |
| Source path | Points to app directory | Points to specific files |

⚠️ Important: When source code is in project root (not src/app), use source_code_path: .. in the resource file
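For instance, if app.yaml and the app source sit at the project root rather than under src/app, the resource file would point one level up (keys are illustrative):

```yaml
resources:
  apps:
    my_app:
      name: my-app-${bundle.target}
      source_code_path: ..  # Project root, one level up from resources/
```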

Other Resources

DABs support schemas, models, experiments, clusters, warehouses, and more. Use `databricks bundle schema` to inspect the full configuration schema.

Reference: DABs Resource Types

Common Commands

Validation

```bash
databricks bundle validate          # Validate default target
databricks bundle validate -t prod  # Validate specific target
```

Deployment

```bash
databricks bundle deploy                 # Deploy to default target
databricks bundle deploy -t prod         # Deploy to specific target
databricks bundle deploy --auto-approve  # Skip confirmation prompts
databricks bundle deploy --force         # Force overwrite remote changes
```

Running Resources

```bash
databricks bundle run resource_name          # Run a pipeline or job
databricks bundle run pipeline_name -t prod  # Run in specific environment

# Apps require bundle run to start after deployment
databricks bundle run app_resource_key -t dev  # Start/deploy the app
```

Monitoring & Logs

View application logs (for Apps resources):

```bash
# View logs for deployed apps
databricks apps logs <app-name> --profile <profile-name>

# Examples:
databricks apps logs my-dash-app-dev -p DEFAULT
databricks apps logs my-streamlit-app-prod -p DEFAULT
```

What logs show:

  • [SYSTEM] - Deployment progress, file updates, dependency installation
  • [APP] - Application output (print statements, errors)
  • Backend connection status
  • Deployment IDs and timestamps
  • Stack traces for errors

Key log patterns to look for:

  • Deployment successful - Confirms deployment completed
  • App started successfully - App is running
  • Initialized real backend - Backend connected to Unity Catalog
  • Error: - Look for error messages and stack traces
  • 📝 Requirements installed - Dependencies loaded correctly

Cleanup

```bash
databricks bundle destroy -t dev
databricks bundle destroy -t prod --auto-approve
```

Common Issues

| Issue | Solution |
|---|---|
| App deployment fails | Check logs: `databricks apps logs <app-name>` for error details |
| App not connecting to Unity Catalog | Check logs for backend connection errors; verify warehouse ID and permissions |
| Wrong permission level | Dashboards: CAN_READ/RUN/EDIT/MANAGE; Jobs: CAN_VIEW/MANAGE_RUN/MANAGE |
| Path resolution fails | Use `../src/` in resources/*.yml, `./src/` in databricks.yml |
| Catalog doesn't exist | Create the catalog first or update the variable |
| "admins" group error on jobs | Cannot modify admins permissions on jobs |
| Volume permissions | Use `grants`, not `permissions`, for volumes |
| Hardcoded catalog in dashboard | Create environment-specific files or parameterize the JSON |
| App not starting after deploy | Apps require `databricks bundle run <resource_key>` to start |
| App env vars not working | Environment variables go in app.yaml (source dir), not databricks.yml |
| Wrong app source path | Use `../` from the resources/ dir if source is in project root |
| Debugging any app issue | First step: `databricks apps logs <app-name>` to see what went wrong |

Key Principles

  1. Path resolution: ../src/ in resources/*.yml, ./src/ in databricks.yml
  2. Variables: Parameterize catalog, schema, warehouse
  3. Mode: development for dev/staging, production for prod
  4. Groups: Use "users" for all workspace users
  5. Job permissions: Verify custom groups exist; can't modify "admins"
