DASL Prerequisites

The Databricks / Antimatter Security Lakehouse integrates with your Databricks environment. All of the data is stored in your workspace, and all of the processing happens in your workspace. As a consequence, you need several Databricks features enabled in the workspace you are going to use with DASL:

Unity Catalog

Unity Catalog must be enabled in your DASL workspace. All of the bronze, silver, and gold tables that DASL creates go into a UC catalog that you configure when you run the Installation Notebook.
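As a rough sketch of the resulting layout (the catalog name `dasl` and the schema names here are placeholders; the actual names are whatever you configure in the Installation Notebook), the catalog and medallion-layer schemas that DASL populates would look like:

```sql
-- Placeholder names for illustration: substitute the catalog
-- configured during the Installation Notebook.
CREATE CATALOG IF NOT EXISTS dasl;

-- Medallion-layer schemas that hold the DASL tables.
CREATE SCHEMA IF NOT EXISTS dasl.bronze;
CREATE SCHEMA IF NOT EXISTS dasl.silver;
CREATE SCHEMA IF NOT EXISTS dasl.gold;
```

The Installation Notebook handles this setup itself; the snippet only illustrates where DASL's tables live within Unity Catalog.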

Serverless SQL Warehouse

The DASL user interface queries a SQL warehouse to populate the data it displays, and some of DASL's reporting features also use the Serverless SQL Warehouse for background tasks. The Installation Notebook automatically creates a small SQL Warehouse dedicated to DASL, so that you have separate cost attribution and control.

Serverless Job Compute

The jobs that DASL creates run on serverless job compute for lower latency and more predictable performance as your data volume grows. A future release of DASL may add support for Classic Compute, but it is not supported in the Private Preview.
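For illustration, a Databricks job definition that runs on serverless job compute simply omits any cluster specification (assuming serverless jobs are enabled in the workspace). The job name and notebook path below are hypothetical; DASL creates its jobs for you:

```json
{
  "name": "dasl-ingest-example",
  "tasks": [
    {
      "task_key": "ingest",
      "notebook_task": {
        "notebook_path": "/Workspace/DASL/ingest"
      }
    }
  ]
}
```

Because the task carries no `new_cluster` or `existing_cluster_id` field, it is scheduled on serverless job compute rather than Classic Compute.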

Installation by Account Admin

Most of the steps during DASL installation must be performed by an Account Admin. After installation is complete, standard workspace users can use DASL, as long as they have appropriate permissions on the Unity Catalog schemas.
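As a sketch of what those permissions might look like (the catalog name `dasl` and the group `security-analysts` are hypothetical placeholders), an admin could grant a group read access to the gold-layer tables with standard Unity Catalog privileges:

```sql
-- Hypothetical catalog and group names for illustration.
GRANT USE CATALOG ON CATALOG dasl TO `security-analysts`;
GRANT USE SCHEMA ON SCHEMA dasl.gold TO `security-analysts`;
GRANT SELECT ON SCHEMA dasl.gold TO `security-analysts`;
```

Granting `SELECT` at the schema level covers tables DASL adds later, so the grants do not need to be repeated per table.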