9 Comments
Johnny Winter

Deployed my first pipeline using DABs this week. It works a treat! One thing perhaps worth adding: the Databricks VS Code extension makes creating and testing DABs even easier (for those like me who are command-line phobic).

Chanukya

Working with PyDABs is also fun; much better compared to databricks.yml.

Daniel Beach

Saying or writing anything with yml on the end gives me a headache.

Gabe

Curious, what is the main or most popular method of CI/CD on Databricks, if not DABs?

Daniel Beach

For me it's bespoke simplicity. Simply bundle a zip file of code and send it to S3. Jobs reference the zip file in a one-liner. All CI/CD driven.

Simplicity wins the day.
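A minimal sketch of what that pattern could look like (not the commenter's actual script; the bucket name, job ID, and environment variables are hypothetical, and it assumes a Databricks job already exists whose task points at the uploaded zip):

```python
"""Minimal CI deploy sketch: zip the code, push to S3, kick the job."""
import os
import shutil

import boto3
import requests

BUCKET = "my-deploy-bucket"        # hypothetical bucket
KEY = "releases/app.zip"           # the one-liner the job points at
JOB_ID = 123                       # hypothetical pre-existing job
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]

# 1. Bundle the source tree into a single zip artifact.
archive = shutil.make_archive("app", "zip", root_dir="src")

# 2. Ship it to S3; the job config never changes, only the artifact.
boto3.client("s3").upload_file(archive, BUCKET, KEY)

# 3. Optionally trigger the job via the Jobs API to smoke-test the release.
resp = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": JOB_ID},
    timeout=30,
)
resp.raise_for_status()
print("Triggered run:", resp.json()["run_id"])
```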

Knut Arne Smeland

I'd say probably Terraform, which is the underlying engine of DABs as well. I think if you're on a team already confident with tf, it's harder to make the case that you should switch to DABs.

Daniel Beach

Oh, Terraform. It's a love-hate relationship.

Chanukya

Jobs/Workflows. One can write a Python package (a whl, for example), set up CI to integrate it with your workspace and CD to deploy it to a volume, and the workflow is set to run it using the Jobs API. It's the same for a JAR, a notebook, a dbt model, etc.

This was a mature solution long before DABs.
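A sketch of the whl-to-volume pattern via the Jobs API 2.1 (the package name, volume path, and cluster ID are hypothetical; it assumes CI has built the wheel and CD has copied it into a Unity Catalog volume):

```python
"""Wire a job task to a wheel that CD dropped into a volume."""
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]
TOKEN = os.environ["DATABRICKS_TOKEN"]

job_spec = {
    "name": "my-pipeline",
    "tasks": [
        {
            "task_key": "run_pipeline",
            "existing_cluster_id": "0000-000000-abcdefgh",  # hypothetical
            "python_wheel_task": {
                "package_name": "my_pipeline",  # hypothetical package
                "entry_point": "main",
            },
            # The wheel CD deployed to a Unity Catalog volume:
            "libraries": [
                {"whl": "/Volumes/main/default/artifacts/"
                        "my_pipeline-1.0-py3-none-any.whl"}
            ],
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
    timeout=30,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```

Swapping the wheel in the volume then updates the pipeline without touching the job definition, which is what makes the approach CI/CD friendly.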

Daniel Beach

This is actually the approach I take today. Compile a zip, send it to S3, submit jobs pointing to it. All done as part of CI/CD.

It's much simpler in my opinion.
