
Announcing the General Availability of Databricks Notebooks on SQL Warehouses

Jason Messer
Jackie Zhang
Miranda Luna

Today, we are excited to announce the general availability of Databricks Notebooks on SQL warehouses. Databricks SQL warehouses are SQL-optimized compute that provides up to 12x better price/performance than standard interactive clusters. With notebooks, you can also write and schedule Git-backed, multi-statement, and parameterized SQL.

Databricks SQL Momentum

Over the last few years, we've seen tremendous growth and adoption of Databricks SQL - our data warehouse purpose-built for the Lakehouse. DBSQL is helping leading companies like Akamai, T-Mobile, and CRED drive innovation by powering modern analytics use cases around the globe - at any scale. Notebooks on SQL warehouses improve this experience, giving data practitioners the flexibility of the full Databricks Notebook.

"Love the ability to use both all-purpose compute to work with any supported language as well as SQL Serverless Warehouses for SQL only workloads, all within the same dev experience." - Josue A. Bogran, Solutions Architect Manager at Kythera Labs

Build SQL projects with Databricks Notebooks

Use Widgets and Parameters

Quickly explore query results with auto-populating parameters. Simply type `:parameter_name` in a query, then enter a value for the parameter in the input box that appears. Learn more about widgets for notebooks on SQL warehouses.
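For example, a parameterized query might look like the following (the table and column names here are hypothetical):

```sql
-- Typing :order_date creates an input widget above the cell;
-- the value you enter is substituted at run time.
SELECT order_id, customer_id, total_amount
FROM sales.orders
WHERE order_date = :order_date;
```

Changing the widget value and re-running the cell refreshes the result without editing the query text.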

Multiple Result Statements 

When you run multiple SQL statements in a single cell, the notebook renders a separate result for each statement, making it easy to compare and troubleshoot outputs side by side.
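A sketch of what this looks like in practice - two statements in one cell, each producing its own result table (table names are hypothetical):

```sql
-- Result 1: revenue by region
SELECT region, SUM(revenue) AS total_revenue
FROM sales.orders
GROUP BY region;

-- Result 2: distinct customers by region, for comparison
SELECT region, COUNT(DISTINCT customer_id) AS customers
FROM sales.orders
GROUP BY region;
```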

Native Integration with Workflows 

With the notebook's native Workflows integration, you can easily take exploratory analysis to production by scheduling it as a recurring job with fully managed orchestration. Additionally, you can run tasks conditionally with if/else logic, ensuring that only relevant tasks are executed based on data conditions or the state of upstream tasks.
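As a rough sketch, an if/else pattern in a job definition uses a condition task gating a downstream task on an upstream task value; the job, task, and notebook names below are hypothetical:

```yaml
resources:
  jobs:
    daily_report:
      tasks:
        - task_key: check_freshness
          notebook_task:
            notebook_path: /Workspace/analytics/check_freshness
        # If/else branch: compares a value set by the upstream task
        - task_key: data_is_fresh
          depends_on:
            - task_key: check_freshness
          condition_task:
            op: EQUAL_TO
            left: "{{tasks.check_freshness.values.is_fresh}}"
            right: "true"
        # Runs only when the condition evaluates to true
        - task_key: publish_report
          depends_on:
            - task_key: data_is_fresh
              outcome: "true"
          notebook_task:
            notebook_path: /Workspace/analytics/publish_report
```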

Native Git-Backed Development  

Notebooks have native integration with Databricks Git folders, allowing you to develop code in notebooks or other files and follow data science and engineering code development best practices using Git for version control, collaboration, and CI/CD.

Interactive Document-Oriented Development  

Take advantage of iterative, document-style development to uncover insights and tell stories. The notebook lets you add comments, build visualizations, and chain together SQL statements using temporary views.
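Chaining statements with a temporary view might look like this (the table and column names are hypothetical):

```sql
-- Step 1: materialize an intermediate result as a temporary view
CREATE OR REPLACE TEMPORARY VIEW top_customers AS
SELECT customer_id, SUM(total_amount) AS lifetime_value
FROM sales.orders
GROUP BY customer_id
ORDER BY lifetime_value DESC
LIMIT 100;

-- Step 2: build on the view in a later statement or cell
SELECT c.customer_name, t.lifetime_value
FROM top_customers AS t
JOIN sales.customers AS c
  ON c.customer_id = t.customer_id;
```

Because the view is temporary, it lives only for the session, keeping exploratory work from cluttering the catalog.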

"Using Notebooks on SQL warehouses has significantly enhanced our data analysis process. The ability to execute multiple SQL statements using a single cell has improved the way we analyze, compare, and troubleshoot information. This, in turn, has allowed us to provide our Marketing Team with faster and more effective solutions." - Favio Varillas Ramos, Data Engineer at Yape Peru

Get Started with Notebooks on SQL Warehouses

Notebooks on SQL warehouses are available in the product today. Simply create a notebook and select a Pro or Serverless SQL warehouse from the compute dropdown. If you have any feedback or questions, feel free to contact us at [email protected]. Check out the documentation page for more detailed resources on getting started with Notebooks on SQL warehouses.
