How to Mount Azure Blob Storage Container to Azure Databricks File System

In the previous post, How to Configure Azure KeyVault in Azure Databricks, we saw how an Azure Databricks workspace can be linked to an Azure KeyVault. In this post, we will see how an Azure Blob Storage container can be mounted without hard-coding a storage key or shared access signature in the notebook. Instead, we will read the storage access key/SAS from KeyVault.

A secret with the name “storage-key” has been created in Azure KeyVault, holding the Shared Access Signature for our storage account.

This “storage-key” secret will be used to mount the storage in the Azure Databricks file system. In the examples below, “dbs” is the container in our Azure Blob Storage account.

If you remember, in the previous post we created a secret scope in Databricks named “MyProjectScope”; we use that scope here to connect to our Azure KeyVault.
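A minimal sketch of the mount call follows. The container “dbs”, scope “MyProjectScope”, and secret “storage-key” are from above; the storage account name mystorageaccount is a placeholder for your own account:

```python
# Mount the "dbs" container using the SAS stored as the "storage-key"
# secret in KeyVault, read through the "MyProjectScope" secret scope.
# "mystorageaccount" is a placeholder - substitute your storage account name.
dbutils.fs.mount(
  source = "wasbs://dbs@mystorageaccount.blob.core.windows.net",
  mount_point = "/mnt/blobstorage/dbs",
  extra_configs = {
    "fs.azure.sas.dbs.mystorageaccount.blob.core.windows.net":
      dbutils.secrets.get(scope = "MyProjectScope", key = "storage-key")
  })
```

Note that `dbutils` is only available inside a Databricks notebook, so this snippet runs in that environment rather than as a standalone script.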

Once you execute the above code in the notebook, it mounts the storage container at the path specified by the mount_point parameter, i.e. /mnt/blobstorage/dbs. You can verify this by listing the files in the folder –
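A quick verification sketch, assuming the mount above succeeded (again, this runs inside a Databricks notebook):

```python
# List the contents of the mounted container to confirm the mount worked.
display(dbutils.fs.ls("/mnt/blobstorage/dbs"))
```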

The generalized syntax for this is as follows –

  dbutils.fs.mount(
    source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
    mount_point = "/mnt/<mount-name>",
    extra_configs = {"<conf-key>":dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")})


  • <mount-name> is a DBFS path representing where the Blob storage container or a folder inside the container (specified in source) will be mounted in DBFS.
  • <conf-key> can be either fs.azure.account.key.<storage-account-name>.blob.core.windows.net (when authenticating with the storage account key) or fs.azure.sas.<container-name>.<storage-account-name>.blob.core.windows.net (when authenticating with a Shared Access Signature).
  • dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>") retrieves the key that has been stored as a secret in the secret scope.

Note that <conf-key> differs depending on whether you authenticate to the storage account with the account key or with a shared access signature.
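To make the two forms concrete, here is a small sketch that builds both configuration keys for hypothetical names (mystorageaccount is a placeholder; “dbs” is the container from our example):

```python
# Hypothetical names for illustration.
storage_account = "mystorageaccount"
container = "dbs"

# Form used when authenticating with the storage account key.
account_key_conf = f"fs.azure.account.key.{storage_account}.blob.core.windows.net"

# Form used when authenticating with a Shared Access Signature (SAS).
sas_conf = f"fs.azure.sas.{container}.{storage_account}.blob.core.windows.net"

print(account_key_conf)  # fs.azure.account.key.mystorageaccount.blob.core.windows.net
print(sas_conf)          # fs.azure.sas.dbs.mystorageaccount.blob.core.windows.net
```

Whichever form you use as <conf-key>, its value is the matching secret (account key or SAS) retrieved from the secret scope.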
