Spark and Databricks Runtime Version

You can check the version of Spark running on the cluster your notebook is attached to as follows –
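A minimal sketch of that cell. On Databricks a `SparkSession` named `spark` is pre-defined in every notebook; the fallback value here is a hypothetical sample so the snippet also runs outside a notebook:

```python
# `spark` is the SparkSession that Databricks pre-creates in every notebook.
# `spark.version` is the underlying Apache Spark version as a plain string.
# Outside a notebook, fall back to a sample value (hypothetical) for illustration.
spark_version = spark.version if "spark" in globals() else "3.5.0"
print(spark_version)
```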

… and to check the Databricks Runtime version, run the following command –
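One way to read it, assuming the standard Databricks driver environment: the runtime version is exposed through the `DATABRICKS_RUNTIME_VERSION` environment variable, which is unset outside Databricks, hence the fallback:

```python
import os

# On a Databricks cluster the driver process exposes the runtime version
# in the DATABRICKS_RUNTIME_VERSION environment variable (e.g. "14.3");
# outside Databricks the variable is unset, so provide a fallback.
runtime_version = os.environ.get("DATABRICKS_RUNTIME_VERSION", "not on Databricks")
print(runtime_version)
```

Alternatively, the full runtime string (including the Scala version suffix) is carried in the Spark configuration under the cluster usage tag `spark.databricks.clusterUsageTags.sparkVersion`.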

The same Databricks Runtime version can be retrieved using Scala as –
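A sketch for a Scala notebook cell, assuming the `spark` session Databricks pre-defines and the cluster usage tags it sets:

```scala
// In a Databricks Scala notebook `spark` (a SparkSession) is pre-defined.
// The cluster usage tag below carries the full runtime version string,
// e.g. "14.3.x-scala2.12".
val runtimeVersion = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
println(runtimeVersion)
```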

You might be wondering why anyone would need to get the version programmatically in a notebook. Many API calls require you to specify a Databricks Runtime version string (for example, when creating a cluster), and that is where retrieving it programmatically helps.