Databricks Credentials

Set up Databricks Credentials

This allows Osmos AI Data Engineers to access your Databricks instance safely while respecting the catalog and workspace access defined by your Databricks admin.

To enable Osmos AI Data Engineers to build, run, and validate Spark workflows in your Databricks environment, you’ll need to provide securely configured credentials. Follow these steps to connect your Databricks workspace to Osmos.

Before You Begin

  • You must be an Account Admin in Databricks.

  • You need access to the Databricks Account Console.

  • You have created a Service Principal (M2M) credential in Databricks.

  • You have access to Osmos.

Step 1: Log in to Osmos and select Credentials

Step 2: Set up your Databricks Credential for Osmos

Osmos currently supports Databricks Service Principal (M2M) authentication.

1. Enter the Osmos Credential Name

  1. This is the credential name you will see in Osmos, e.g., osmos-service-agent.

  2. We recommend using the same name as the Service Principal in Databricks.

2. Enter the Client Secret

  1. The Client Secret is copied from the Databricks Account Console > Service Principals.

3. Enter the Client ID

  1. The Client ID is also copied from the Databricks Account Console > Service Principals.

4. Select Save Credential

Once configured, Osmos AI Data Engineers can securely access and run workflows in your Databricks environment while honoring your admin-defined access controls.
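If you want to sanity-check the Client ID and Client Secret pair outside of Osmos, you can request an OAuth access token directly from your workspace's token endpoint (Databricks M2M OAuth uses the client-credentials grant against `<workspace-url>/oidc/v1/token`). The sketch below builds that request; the workspace URL and credential values are placeholders you must replace with your own.

```python
import base64

# Placeholders -- substitute your workspace URL and the Client ID /
# Client Secret copied from the Databricks Account Console.
WORKSPACE_URL = "https://dbc-example.cloud.databricks.com"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"


def build_token_request(workspace_url, client_id, client_secret):
    """Build the M2M OAuth token request Databricks expects:
    POST <workspace-url>/oidc/v1/token with HTTP Basic auth and a
    form body of grant_type=client_credentials&scope=all-apis."""
    url = f"{workspace_url}/oidc/v1/token"
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {basic}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = "grant_type=client_credentials&scope=all-apis"
    return url, headers, body


url, headers, body = build_token_request(WORKSPACE_URL, CLIENT_ID, CLIENT_SECRET)
print(url)
# To actually test the credential, send the request (e.g., with `requests`):
#   resp = requests.post(url, headers=headers, data=body)
#   resp.json()["access_token"]  # present only if the credential is valid
```

A valid Service Principal credential returns a JSON response containing an `access_token`; a 401 response usually means the Client ID or Client Secret was copied incorrectly.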

Currently only M2M (Service Principal) authentication is supported; interactive user OAuth (U2M) support is coming soon.
