Troubleshooting
Workflow Deployment Error
If any errors occur during the workflow deployment process, the Ingestion Pipeline entity will still be created, but no workflow will be present in the Ingestion container.
- You can then Edit the Ingestion Pipeline and Deploy it again.
- From the Connection tab, you can also Edit the Service if needed.
Connector Debug Troubleshooting
This section provides instructions to help resolve common issues encountered during connector setup and metadata ingestion in OpenMetadata. Below are some of the most frequently observed troubleshooting scenarios.
How to Enable Debug Logging for Any Ingestion
To enable debug logging for any ingestion workflow in OpenMetadata:
- Navigate to Services: Go to Settings > Services > Service Type (e.g., Database) in the OpenMetadata UI.
- Select a Service: Choose the specific service for which you want to enable debug logging.
- Access the Ingestion Tab: Go to the Ingestion tab, click the three-dot menu on the right-hand side of the ingestion type, and select Edit.
- Enable Debug Logging: In the configuration dialog, enable the Debug Log option and click Next.
- Schedule and Submit: Configure the schedule if needed and click Submit to apply the changes.
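If you run the same ingestion from the metadata CLI instead of the UI, the equivalent switch is `loggerLevel` in the workflow YAML. A minimal sketch (the server values below are placeholders, not real endpoints):

```yaml
# Relevant fragment of an ingestion workflow YAML.
# loggerLevel: DEBUG is the CLI counterpart of the UI's Debug Log toggle.
workflowConfig:
  loggerLevel: DEBUG  # default is INFO
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"  # placeholder
    authProvider: openmetadata
```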
Permission Issues
If you encounter permission-related errors during connector setup or metadata ingestion, ensure that all the prerequisites and access configurations specified for each connector are properly implemented. Refer to the connector-specific documentation to verify the required permissions.
Databricks Connection Details
Authentication Methods
The Databricks connector supports three authentication methods. Choose the one that best fits your security requirements:
1. Personal Access Token (PAT)
2. Databricks OAuth (Service Principal)
3. Azure AD Setup
Getting Connection Details
Here are the steps to get the `hostPort`, `httpPath`, and authentication credentials:
First, log in to Azure Databricks and select SQL Warehouses from the sidebar (in the SQL section).
Then click on a SQL warehouse in the SQL Warehouses list.
Inside that warehouse page, go to the Connection details section. Here, Server hostname and Port together form your `hostPort`, and HTTP path is your `httpPath`.
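Putting those two fields together is trivial but easy to get wrong; a small helper (hypothetical, not part of OpenMetadata) shows how Server hostname and Port combine into `hostPort`, and what shape `httpPath` usually has:

```python
def build_host_port(server_hostname: str, port: int = 443) -> str:
    """Join the Server hostname and Port from the Connection details
    screen into the single hostPort value (hostname:port)."""
    return f"{server_hostname}:{port}"

def looks_like_http_path(http_path: str) -> bool:
    """Rough shape check: SQL warehouse HTTP paths start with /sql/."""
    return http_path.startswith("/sql/")

# Placeholder values standing in for a real Connection details screen:
host_port = build_host_port("adb-1234567890123456.7.azuredatabricks.net")
```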
On the Connection details page, click Create a personal access token.
On the token page, you can create a new token.
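The `hostPort`, `httpPath`, and token collected above are the same three values any Databricks SQL client needs. As a sanity check outside OpenMetadata, they map onto the `databricks-sql-connector` package roughly like this (a sketch; the actual `connect` call is commented out because it needs a reachable warehouse, and all values below are placeholders):

```python
def to_connect_args(host_port: str, http_path: str, token: str) -> dict:
    """Split the combined hostPort back into the pieces
    databricks-sql-connector expects (assumes host:port form;
    the port itself is implied, HTTPS on 443)."""
    hostname, _, _port = host_port.rpartition(":")
    return {
        "server_hostname": hostname,
        "http_path": http_path,
        "access_token": token,
    }

args = to_connect_args(
    "adb-1234567890123456.7.azuredatabricks.net:443",
    "/sql/1.0/warehouses/abcdef1234567890",  # placeholder warehouse id
    "dapiXXXXXXXXXXXXXXXX",                  # placeholder token
)
# from databricks import sql
# with sql.connect(**args) as conn:          # requires a live warehouse
#     with conn.cursor() as cur:
#         cur.execute("SELECT 1")
```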
Getting Service Principal Credentials
For Databricks OAuth (All Platforms)
- Navigate to your Databricks Account Console
- Go to Settings → Identity and access → Service Principals → Add Service Principal
- Note down the Application ID (this is your `clientId`)
- Click Generate Secret and save the secret (this is your `clientSecret`)
For Azure AD Setup (Azure Databricks Only)
- In the Azure Portal, navigate to Microsoft Entra ID → App registrations → New registration
- After registration, note:
  - Application (client) ID (this is your `azureClientId`)
  - Directory (tenant) ID (this is your `azureTenantId`)
- Go to Certificates & secrets → New client secret
- Create and save the secret value (this is your `azureClientSecret`)
- Navigate to your Azure Databricks Account Console
- Go to Settings → Identity and access → Service Principals → Add Service Principal
- Select the Microsoft Entra ID managed option and enter your `azureClientId`
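Putting the three methods side by side, each one requires a different set of the credential fields named above. A hypothetical sketch using this page's field names (not a definitive OpenMetadata schema) makes the grouping explicit:

```python
def databricks_auth_config(method: str, **creds) -> dict:
    """Return the credential fields each auth method requires,
    using the field names introduced above (sketch, not the real schema)."""
    required = {
        "pat": ["token"],
        "oauth": ["clientId", "clientSecret"],
        "azure_ad": ["azureClientId", "azureClientSecret", "azureTenantId"],
    }[method]
    missing = [k for k in required if k not in creds]
    if missing:
        raise ValueError(f"{method} auth is missing: {missing}")
    return {k: creds[k] for k in required}

# Example: Azure AD setup needs all three azure* values.
cfg = databricks_auth_config(
    "azure_ad",
    azureClientId="app-id",          # placeholder
    azureClientSecret="secret-value",  # placeholder
    azureTenantId="tenant-id",         # placeholder
)
```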
Common Issues
Authentication Failures
- PAT Issues: Ensure the token hasn't expired (maximum 90-day lifetime)
- Service Principal: Verify the service principal has the necessary permissions
- Azure AD: Check that the Azure Databricks workspace is configured for Azure AD authentication
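For the PAT expiry point, if you know when a token was created you can compute how many days it has left against the 90-day maximum noted above (a pure-Python sketch):

```python
from datetime import date, timedelta
from typing import Optional

def pat_days_remaining(created: date, lifetime_days: int = 90,
                       today: Optional[date] = None) -> int:
    """Days until a personal access token created on `created` expires.
    Negative means it has already expired."""
    today = today or date.today()
    return (created + timedelta(days=lifetime_days) - today).days

# A token created on Jan 1 has 10 days left on Mar 21 (90-day lifetime):
days_left = pat_days_remaining(date(2024, 1, 1), today=date(2024, 3, 21))
```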
Permission Errors
- Ensure the proper GRANT statements have been executed for the identity used by your authentication method