Databricks Credentials
Set up Databricks Credentials
To enable Osmos AI Data Engineers to build, run, and validate Spark workflows in your Databricks environment, you’ll need to provide securely configured credentials. This allows Osmos AI Data Engineers to access your Databricks instance safely while respecting the catalog and workspace access defined by your Databricks admin. Follow these steps to connect your Databricks workspace to Osmos.
Before You Begin
You must be an Account Admin in Databricks.
Access to the Databricks Account Console is required.
A Service Principal (M2M) credential must be created in Databricks; if you prefer to create it programmatically, see the sketch after this list.
Access to Osmos is required.
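
If you would rather create the Service Principal and its OAuth secret from code instead of the Account Console UI, the following is a minimal sketch using the Databricks SDK for Python (databricks-sdk). The account host shown is the AWS account console URL, and the account ID, admin credentials, and display name are placeholders; method names and signatures can vary slightly between SDK versions, so treat this as a starting point rather than a definitive recipe.

    from databricks.sdk import AccountClient

    # Authenticate to the Databricks Account Console as an account admin.
    # Host, account ID, and admin OAuth credentials below are placeholders.
    account = AccountClient(
        host="https://accounts.cloud.databricks.com",
        account_id="<your-databricks-account-id>",
        client_id="<account-admin-client-id>",
        client_secret="<account-admin-client-secret>",
    )

    # Create the service principal that Osmos will use.
    sp = account.service_principals.create(display_name="osmos-service-agent")

    # Generate an OAuth (M2M) secret for it.
    secret = account.service_principal_secrets.create(service_principal_id=int(sp.id))

    # The application ID is the Client ID you will enter in Osmos;
    # the secret value is the Client Secret. Store both securely.
    print("Client ID:    ", sp.application_id)
    print("Client Secret:", secret.secret)

Whether you create the Service Principal in the UI or from code, note down its Client ID (application ID) and Client Secret; you will enter both in Osmos in Step 2 below.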
Step 1: Log in to Osmos and select Credentials

Step 2: Set up your Databricks Credential for Osmos
Osmos currently supports Databricks Service Principal (M2M) OAuth for authentication.
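
Under the hood, Service Principal (M2M) authentication is a standard OAuth 2.0 client-credentials flow: the Client ID and Client Secret are exchanged for a short-lived access token at the workspace token endpoint. As a rough illustration only (the workspace URL and credential values are placeholders), the exchange looks like this in Python:

    import requests

    WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
    CLIENT_ID = "<service-principal-client-id>"                      # placeholder
    CLIENT_SECRET = "<service-principal-client-secret>"              # placeholder

    # OAuth 2.0 client-credentials exchange against the workspace token endpoint.
    response = requests.post(
        f"{WORKSPACE_URL}/oidc/v1/token",
        auth=(CLIENT_ID, CLIENT_SECRET),
        data={"grant_type": "client_credentials", "scope": "all-apis"},
        timeout=30,
    )
    response.raise_for_status()
    access_token = response.json()["access_token"]
    print("Received a workspace access token of length", len(access_token))

You do not need to run this yourself; it only illustrates what the Client ID and Client Secret you enter below are used for.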

1. Enter the Osmos Credential Name
This is the credential name you will see in Osmos, e.g., osmos-service-agent. It is recommended to use the same name as the Service Principal in Databricks.
2. Enter the Client Secret
The Client Secret is copied from the Databricks Account Console > Service Principals.
3. Enter the Client ID
The Client ID is also copied from the Databricks Account Console > Service Principals. If you want to confirm that the values work, see the sketch after these steps.
4. Select Save Credential
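
Before or after saving the credential, you can optionally sanity-check the Client ID and Client Secret outside of Osmos. The sketch below assumes the Databricks SDK for Python (databricks-sdk); the workspace URL and credential values are placeholders.

    from databricks.sdk import WorkspaceClient

    # Authenticate to the workspace with the same Client ID / Client Secret
    # you entered in Osmos (values below are placeholders).
    w = WorkspaceClient(
        host="https://<your-workspace>.cloud.databricks.com",
        client_id="<client-id-entered-in-osmos>",
        client_secret="<client-secret-entered-in-osmos>",
    )

    # Confirm the service principal can authenticate.
    me = w.current_user.me()
    print("Authenticated as:", me.user_name)

    # Optionally list the catalogs the service principal can see, to confirm
    # the access your Databricks admin granted is what you expect.
    for catalog in w.catalogs.list():
        print("Visible catalog:", catalog.name)

If authentication fails here, check that the Client Secret has not expired and that your Databricks admin has granted the service principal access to the workspace.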

Once configured, Osmos AI Data Engineers can securely access and run workflows in your Databricks environment while honoring your admin-defined access controls.