Run Domo Pipeline using the Airflow SDK
In this section, we provide guides and references to use the Domo Pipeline connector.
Configure and schedule Domo Pipeline metadata and profiler workflows using the Airflow SDK:
Requirements
OpenMetadata 0.12 or later
To deploy OpenMetadata, check the Deployment guides.
To run the ingestion via the UI, you'll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment.
Note: For metadata ingestion, make sure to add at least the data scopes to the clientId provided. For questions related to scopes, click here.
To run the Domo Pipeline ingestion, you will need to install:
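A minimal sketch of the install command, assuming the connector dependencies ship under the domo extra of the openmetadata-ingestion package (check the extra name for your release):

```bash
pip3 install "openmetadata-ingestion[domo]"
```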
All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to Domo Pipeline.
In order to create and run a Metadata Ingestion workflow, we will follow the steps to create a YAML configuration able to connect to the source, process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following JSON Schema.
1. Define the YAML Config
This is a sample config for Domo Pipeline:
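Below is a sketch of what that YAML can look like. The type names (domopipeline, DomoPipeline, PipelineMetadata) follow the JSON Schema mentioned above, while the service name, credentials, and hosts are placeholders to replace with your own values:

```yaml
source:
  type: domopipeline
  serviceName: domopipeline_source        # placeholder service name
  serviceConnection:
    config:
      type: DomoPipeline
      clientId: client-id                 # placeholder credentials
      secretToken: secret-token
      accessToken: access-token
      apiHost: api.domo.com
      sandboxDomain: https://<your-domain>.domo.com
  sourceConfig:
    config:
      type: PipelineMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: openmetadata
```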
Source Configuration - Service Connection
Client ID: Client ID to connect to DOMO Pipeline.
Secret Token: Secret Token to connect to DOMO Pipeline.
Access Token: Access Token to connect to DOMO Pipeline.
API Host: API Host to connect to the DOMO Pipeline instance.
SandBox Domain: Connect to the SandBox Domain.
Source Configuration - Source Config
The sourceConfig is defined here:
dbServiceNames: Database Service Name for the creation of lineage, if the source supports it.
includeTags: Set the 'Include Tags' toggle to control whether to include tags as part of metadata ingestion.
markDeletedPipelines: Set the Mark Deleted Pipelines toggle to flag pipelines as soft-deleted if they are not present anymore in the source system.
pipelineFilterPattern and chartFilterPattern: Note that both the pipelineFilterPattern and chartFilterPattern support regex as include or exclude, as shown in the example below.
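As an illustration, here is a hedged sketch of a filter pattern under sourceConfig; the regexes are placeholders:

```yaml
sourceConfig:
  config:
    type: PipelineMetadata
    pipelineFilterPattern:
      includes:
        - ".*sales.*"   # ingest pipelines matching this regex
      excludes:
        - ".*test.*"    # skip pipelines matching this regex
```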
To send the metadata to OpenMetadata, it needs to be specified as type: metadata-rest.
The main property here is the openMetadataServerConfig, where you can define the host and security provider of your OpenMetadata installation.
For a simple, local installation using our docker containers, this looks like:
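For instance, a minimal sketch assuming the default local hostPort:

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
```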
Workflow Configs for Security Provider
We support different security providers. You can find their definitions here.
OpenMetadata JWT Auth
- JWT tokens will allow your clients to authenticate against the OpenMetadata server. To enable JWT tokens, you can find more details here; a configuration sketch follows this list.
- You can refer to the JWT Troubleshooting section for any issues in your JWT configuration. If you need information on configuring the ingestion with other security providers in your bots, you can follow this doc.
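A sketch of the JWT variant, where the jwtToken value is a placeholder for your bot's token:

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"
```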
2. Prepare the Ingestion DAG
Create a Python file in your Airflow DAGs directory with the following contents:
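The sketch below assumes the 0.12-era import path metadata.ingestion.api.workflow.Workflow; the owner, email, DAG name, and schedule are placeholders, and the YAML prepared above should be pasted into the config string:

```python
import yaml
from datetime import timedelta

from airflow import DAG
from airflow.utils.dates import days_ago

try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    from airflow.operators.python_operator import PythonOperator

from metadata.ingestion.api.workflow import Workflow

# Default arguments for all tasks in the DAG
default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60),
}

# The YAML configuration prepared above, embedded as a string
config = """
<your YAML configuration>
"""


def metadata_ingestion_workflow():
    # Load the YAML config, build and run the workflow,
    # then report its status and shut it down
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()


with DAG(
    "domopipeline_ingestion",
    default_args=default_args,
    description="An example DAG which runs a Domo Pipeline ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval="*/5 * * * *",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```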
Import necessary modules
The Workflow class that is being imported is part of the metadata ingestion framework, which defines the process of getting data from different sources and ingesting it into a central metadata repository.
Here we are also importing the basic requirements to parse YAMLs, handle dates, and build our DAG.
Default arguments for all tasks in the Airflow DAG.
- The default arguments dictionary contains default arguments for tasks in the DAG, including the owner's name, email address, number of retries, retry delay, and execution timeout.
- config: Specifies the config for the metadata ingestion as we prepared above.
- metadata_ingestion_workflow(): This code defines a function metadata_ingestion_workflow() that loads a YAML configuration, creates a Workflow object, executes the workflow, checks its status, prints the status to the console, and stops the workflow.
- DAG: creates a DAG using the Airflow framework; tune the DAG configuration to whatever fits your requirements.
- For more details on Airflow DAG creation, visit here.
Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will be able to extract metadata from different sources.