Run Domo Pipeline using the Airflow SDK

In this section, we provide guides and references to use the Domo-Pipeline connector.

Configure and schedule Domo-Pipeline metadata and profiler workflows from the OpenMetadata UI.

To run the Ingestion via the UI, you'll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment.

Note

For metadata ingestion, make sure to add at least the data scopes to the Client ID provided. For questions related to scopes, click here.

To run the Domo Pipeline ingestion, you will need to install:

pip3 install "openmetadata-ingestion[domo]"

All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to Domo Pipeline.

In order to create and run a Metadata Ingestion workflow, we will follow the steps to create a YAML configuration able to connect to the source, process the Entities if needed, and reach the OpenMetadata server.

The workflow is modeled around the following JSON Schema.

This is a sample config for Domo-Pipeline:

source:
  type: domopipeline
  serviceName: domo-pipeline_source
  serviceConnection:
    config:
      type: DomoPipeline
      clientID: clientid
      secretToken: secret-token
      accessToken: access-token
      apiHost: api.domo.com
      sandboxDomain: https://<api_domo>.domo.com
  sourceConfig:
    config:
      type: PipelineMetadata
      # pipelineFilterPattern:
      #   includes:
      #     - pipeline1
      #     - pipeline2
      #   excludes:
      #     - pipeline3
      #     - pipeline4
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  # loggerLevel: DEBUG  # DEBUG, INFO, WARN or ERROR
  openMetadataServerConfig:
    hostPort: http://localhost:8585/api
    authProvider: <OpenMetadata auth provider>
    securityConfig:
      jwtToken:
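
If you want to test the workflow outside Airflow first, the same YAML can be run with the metadata CLI that ships with the openmetadata-ingestion package installed above (the file path here is just an example):

metadata ingest -c /path/to/domopipeline.yaml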
   

Source Configuration - Service Connection

  • Client ID: Client ID to connect to Domo Pipeline.
  • Secret Token: Secret Token to connect to Domo Pipeline.
  • Access Token: Access Token to connect to Domo Pipeline.
  • API Host: API Host to connect to the Domo Pipeline instance, e.g., api.domo.com.
  • Sandbox Domain: Sandbox Domain to connect to, e.g., https://<api_domo>.domo.com.

Source Configuration - Source Config

The sourceConfig is defined here:

  • dbServiceNames: Database Service Names used for lineage creation, if the source supports it (see the example after this list).
  • pipelineFilterPattern and chartFilterPattern: Both support regex as include or exclude patterns. E.g.,
pipelineFilterPattern:
  includes:
    - users
    - type_test
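
For example, a sourceConfig that uses a database service for lineage and excludes pipelines by regex could look like this; the service and pipeline names below are hypothetical:

sourceConfig:
  config:
    type: PipelineMetadata
    dbServiceNames:
      - my_postgres_service
    pipelineFilterPattern:
      excludes:
        - .*_staging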

Sink Configuration

To send the metadata to OpenMetadata, it needs to be specified as type: metadata-rest.
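
For reference, this is the same sink block shown in the sample configuration above:

sink:
  type: metadata-rest
  config: {}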

Workflow Configuration

The main property here is the openMetadataServerConfig, where you can define the host and security provider of your OpenMetadata installation.

For a simple, local installation using our docker containers, this looks like:

workflowConfig:
  openMetadataServerConfig:
    hostPort: 'http://localhost:8585/api'
    authProvider: openmetadata
    securityConfig:
      jwtToken: '{bot_jwt_token}'

We support different security providers. You can find their definitions here. You can find the different implementations of the ingestion below.

Configure SSO in the Ingestion Workflows

Create a Python file in your Airflow DAGs directory with the following contents:

import yaml
from datetime import timedelta

from airflow import DAG
from airflow.utils.dates import days_ago

# Airflow 2.x moved PythonOperator; fall back to the 1.x import if needed.
try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    from airflow.operators.python_operator import PythonOperator

from metadata.ingestion.api.workflow import Workflow

default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60)
}

config = """
<your YAML configuration>
"""

def metadata_ingestion_workflow():
    # Parse the inline YAML above, build the ingestion Workflow, and run it.
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()

with DAG(
    "sample_data",
    default_args=default_args,
    description="An example DAG which runs an OpenMetadata ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval='*/5 * * * *',
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )

Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will be able to extract metadata from different sources.
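
If you prefer to keep the configuration in a separate file instead of an inline string, a minimal variation of the callable could read the YAML from disk; the path below is hypothetical and the rest of the DAG stays the same:

import yaml
from pathlib import Path

from metadata.ingestion.api.workflow import Workflow

def metadata_ingestion_workflow():
    # Read the workflow YAML from a file instead of the inline `config` string.
    # The path is hypothetical; point it at wherever you store the configuration.
    config_path = Path("/opt/airflow/dags/domopipeline.yaml")
    workflow_config = yaml.safe_load(config_path.read_text())
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()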

Still have questions?

You can take a look at our Q&A or reach out to us on Slack.
