Data Insights
Platform adoption is an important element for teams implementing OpenMetadata. With the data insights feature, organizations can drive the adoption of OpenMetadata by monitoring its usage and setting up company-wide KPIs.
Data Insight Reports
OpenMetadata offers a suite of reports providing platform analytics around specific areas.
Data Assets
The Data Assets reports display important metrics around your data assets in OpenMetadata.
Total Data Assets
This chart represents the total number of data assets present in OpenMetadata. It offers a view of your data assets broken down by asset type (e.g. Database, Table, ML Model, etc.).

Total Data Assets Chart
Percentage of Data Assets with Description
This chart represents the percentage of data assets present in OpenMetadata with a description. For the Table asset type, this condition is true only if both the table and column descriptions are filled. It allows you to quickly view the description coverage for your data assets in OpenMetadata.

Percentage of Assets with Description
Percentage of Data Assets with Owners
This chart represents the percentage of data assets present in OpenMetadata with an owner assigned. Data assets that do not support assigning an owner will not be counted in this percentage. It allows you to quickly view the ownership coverage for your data assets in OpenMetadata.

Percentage of Assets with Owner Assigned
Total Data Assets by Tier
This chart represents the breakdown of data assets by Tier. Data assets with no tier assigned are not included. It allows you to quickly view how your data assets are distributed across tiers.

Data Asset by Tier
App Analytics
The App Analytics report provides important metrics around the usage of OpenMetadata.
Most Viewed Data Assets
This chart shows the top 10 most viewed data assets in your platform. It offers a quick way to understand which data assets draw the most interest in your organization.

Most Viewed Assets
Page views by data assets
This chart shows the total number of page views by asset type. This allows you to understand which asset family drives the most interest in your organization.

Page Views by Assets
Daily active users on the platform
This chart shows the number of daily active users on your platform. Active users are users with at least one session. This report allows you to understand platform usage and see how your organization leverages OpenMetadata.

Daily Active Users
Most Active Users
This chart shows the top 10 most active users. These are your organization's power users. They can be turned into evangelists to promote OpenMetadata inside your company.

Most Active Users
Setting up Data Insight Workflow
Step 1
Navigate to Settings > OpenMetadata > Data Insights.

Data Insights Pipeline Page
On the Data Insights page, click on Add Data Insight Ingestion.
Step 2
Pick a name for your ingestion workflow or leave it as is.

Data Insight Ingestion Name
Add any Elasticsearch configuration relevant to your setup. Note that if you are deploying OpenMetadata without a custom Elasticsearch deployment, you can skip this configuration step.

Data Insight Ingestion ES Config
Choose a scheduled execution time for your workflow. The schedule time is displayed in UTC. We recommend running this workflow overnight, or when activity on the platform is at its lowest, to ensure accurate data.

Data Insight Ingestion Schedule
Step 3
Navigate to the Insights page. You should see your data insights reports. Note that if you have just deployed OpenMetadata, App Analytics data might not be present. App Analytics data is fetched from the previous day (UTC).
Data Insight KPIs
While data insight reports give an analytical view of the OpenMetadata platform, KPIs are here to drive platform adoption.

Data Insight KPI
KPI Categories
Completed Description
Available as an absolute or relative (percentage) value, this KPI measures the description coverage of your data assets in OpenMetadata.
Completed Ownership
Available as an absolute or relative (percentage) value, this KPI measures the ownership coverage of your data assets in OpenMetadata.
Adding KPIs
On the Insights page, click on Add KPI. This will open the KPI configuration page where the following required configuration elements need to be set:
- Name: the name of your KPI
- Select a chart: links the KPI to one of the charts present in the data insight reports
- Select a metric type: choose between PERCENTAGE or NUMBER. The former is a relative value while the latter is an absolute value
- Start date / End date: determines the start and end date of your KPI. It sets an objective for your organization

KPI Configuration
Run Data Insights using the Airflow SDK
1. Define the YAML Config
This is a sample config for Data Insights:
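Below is a minimal sketch of such a configuration. The Elasticsearch host and port and the bot JWT token are placeholders to adapt to your deployment, and field names may vary slightly across OpenMetadata versions.

```yaml
source:
  type: dataInsight
  serviceName: OpenMetadata
  sourceConfig:
    config:
      type: MetadataToElasticSearch
processor:
  type: data-insight-processor
  config: {}
sink:
  type: elasticsearch
  config:
    es_host: localhost          # Elasticsearch host backing your OpenMetadata search
    es_port: 9200
    recreate_indexes: false
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"   # placeholder for the ingestion bot token
```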
Source Configuration - Source Config: to send the metadata to OpenMetadata, it needs to be specified as type: MetadataToElasticSearch.
Processor Configuration: to send the metadata to OpenMetadata, it needs to be specified as type: data-insight-processor.
Workflow Configuration
The main property here is the openMetadataServerConfig, where you can define the host and security provider of your OpenMetadata installation.
For a simple, local installation using our docker containers, this looks like:
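Assuming the default openmetadata auth provider and an ingestion bot JWT token, a sketch of this block could be:

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"   # placeholder for the ingestion bot token
```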
We support different security providers. You can find their definitions here. You can find the different implementations of the ingestion below.
2. Prepare the Data Insights DAG
Create a Python file in your Airflow DAGs directory with the following contents:
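The following is an illustrative sketch rather than a definitive DAG: it assumes the openmetadata-ingestion package is installed in your Airflow environment, and the module path of the DataInsightWorkflow class as well as the schedule are assumptions that may differ by version.

```python
import yaml
from datetime import timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago

# Module path may vary with your openmetadata-ingestion version.
from metadata.workflow.data_insight import DataInsightWorkflow

default_args = {
    "owner": "user_name",
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60),
}

# Paste the YAML configuration defined above.
config = """
<your YAML configuration>
"""


def data_insight_workflow():
    # Load the YAML config, run the workflow, and fail the task on errors.
    workflow_config = yaml.safe_load(config)
    workflow = DataInsightWorkflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.stop()


with DAG(
    "data_insight_workflow",
    default_args=default_args,
    description="OpenMetadata Data Insights workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval="0 1 * * *",  # illustrative: run daily overnight (UTC)
    catchup=False,
) as dag:
    PythonOperator(
        task_id="run_data_insights",
        python_callable=data_insight_workflow,
    )
```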
Run Data Insights using the metadata CLI
1. Define the YAML Config
This is a sample config for Data Insights:
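As in the Airflow SDK section above, the following is an illustrative sketch; hosts, ports, and the bot JWT token are placeholders to adapt to your deployment.

```yaml
source:
  type: dataInsight
  serviceName: OpenMetadata
  sourceConfig:
    config:
      type: MetadataToElasticSearch
processor:
  type: data-insight-processor
  config: {}
sink:
  type: elasticsearch
  config:
    es_host: localhost          # Elasticsearch host backing your OpenMetadata search
    es_port: 9200
    recreate_indexes: false
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"   # placeholder for the ingestion bot token
```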
Source Configuration - Source Config: to send the metadata to OpenMetadata, it needs to be specified as type: MetadataToElasticSearch.
Processor Configuration: to send the metadata to OpenMetadata, it needs to be specified as type: data-insight-processor.
Workflow Configuration
The main property here is the openMetadataServerConfig, where you can define the host and security provider of your OpenMetadata installation.
For a simple, local installation using our docker containers, this looks like:
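As above, assuming the default openmetadata auth provider, a sketch of this block could be:

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"   # placeholder for the ingestion bot token
```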
We support different security providers. You can find their definitions here. You can find the different implementations of the ingestion below.
2. Run with the CLI
First, we will need to save the YAML file. Afterward, and with all requirements installed, we can run:
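Assuming the configuration was saved as data_insight.yaml (a file name chosen here for illustration), the command would look like the sketch below; if your installed version of the metadata CLI does not expose an insight subcommand, check metadata --help for the equivalent entry point.

```bash
metadata insight -c data_insight.yaml
```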
Run Elasticsearch Reindex using the Airflow SDK
1. Define the YAML Config
This is a sample config for Elasticsearch Reindex:
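The following is an illustrative sketch; the source and sink type names reflect common OpenMetadata examples, and the hosts, ports, and token are placeholders to adapt to your deployment.

```yaml
source:
  type: metadata_elasticsearch
  serviceName: openMetadata
  serviceConnection:
    config:
      type: MetadataES
  sourceConfig:
    config: {}
sink:
  type: elasticsearch
  config:
    es_host: localhost
    es_port: 9200
    recreate_indexes: true   # drop and rebuild the search indexes
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"   # placeholder for the ingestion bot token
```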
2. Prepare the Ingestion DAG
Create a Python file in your Airflow DAGs directory with the following contents:
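A sketch of the DAG, mirroring the Data Insights DAG above; the module path of the generic Workflow class is an assumption and may differ by openmetadata-ingestion version.

```python
import yaml
from datetime import timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago

# Module path is version dependent; newer releases may expose a dedicated workflow class.
from metadata.ingestion.api.workflow import Workflow

default_args = {
    "owner": "user_name",
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
}

# Paste the Elasticsearch Reindex YAML configuration defined above.
config = """
<your YAML configuration>
"""


def elasticsearch_reindex_workflow():
    # Load the YAML config, run the reindex, and fail the task on errors.
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.stop()


with DAG(
    "elasticsearch_reindex_workflow",
    default_args=default_args,
    description="OpenMetadata Elasticsearch reindex workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval="0 2 * * *",  # illustrative daily schedule
    catchup=False,
) as dag:
    PythonOperator(
        task_id="run_elasticsearch_reindex",
        python_callable=elasticsearch_reindex_workflow,
    )
```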