How to Configure Azure KeyVault in Azure Databricks

Below are the details you will need to keep handy while configuring Azure Key Vault in Azure Databricks –

Azure Databricks Host: the URL in the snapshot below is your Azure Databricks host URL

Azure KeyVault DNS Name and Azure KeyVault Resource ID

Then append #secrets/createScope to your Azure Databricks host URL and open it in a new browser tab/window. It would look like this –

The value after ?o= but before #secrets in the above URL is your workspace ID.
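Put together, the full URL follows this pattern (the host and workspace ID here are placeholders, not real values):

```
https://<your-region>.azuredatabricks.net/?o=<your-workspace-id>#secrets/createScope
```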

Note that #secrets/createScope is case-sensitive, so you need to append it to the URL exactly as shown here. In the form, specify the scope name and enter the Key Vault DNS name and Resource ID identified in the previous steps.

Manage Principal indicates who can manage this scope – either only the creator, or all users in the workspace.

Once you create this, you get the following screen, which indicates that the scope has been created.

If you have the Databricks CLI installed on your machine, you can verify the scope created in the previous step as follows –
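A minimal check with the Databricks CLI might look like this (the scope name key-vault-secrets is a placeholder – use whatever name you entered in the createScope form):

```shell
# List all secret scopes in the workspace; the new Key Vault-backed scope should appear here
databricks secrets list-scopes

# List the secret keys available in a specific scope (placeholder scope name)
databricks secrets list --scope key-vault-secrets
```

Both commands assume the CLI is already authenticated against your workspace, as described in the CLI setup post linked below.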

If the CLI is not installed on your machine, you may visit another post to see how to install and configure the Databricks CLI.

Note that scopes with the “Creator” manage principal can be created only under the Premium tier in Azure Databricks. Otherwise, you see the following error –

If you are using the Standard tier, you need to use the “All Users” option, which allows all users in the workspace to manage your scope.

Your Azure KeyVault is now associated with a scope in Azure Databricks, and you can use that scope to access secrets in your notebook code. In the next post, How to Mount Azure Blob Storage Container to Azure Databricks File System, you can see how to use secrets stored in Azure KeyVault to mount Azure Blob Storage.
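Inside a Databricks notebook, reading a secret from the new scope is a one-liner via dbutils. This is a sketch only: the scope name key-vault-secrets and the secret key storage-account-key are placeholder assumptions, and dbutils is provided by the Databricks runtime, so the snippet runs only inside a notebook, not locally.

```python
# Read a secret from the Key Vault-backed scope created above.
# "key-vault-secrets" and "storage-account-key" are placeholder names -
# substitute your own scope name and the secret's key in Key Vault.
storage_key = dbutils.secrets.get(scope="key-vault-secrets",
                                  key="storage-account-key")

# Secret values are redacted in notebook output, so printing the
# variable shows [REDACTED] rather than the actual secret.
print(storage_key)
```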

