Run Glue Pipeline using the Airflow SDK
In this section, we provide guides and references to use the Glue connector.
Configure and schedule Glue metadata workflows using the Airflow SDK:
Requirements
OpenMetadata 0.12 or later. To deploy OpenMetadata, check the Deployment guides.
The Glue connector ingests metadata through the AWS Boto3 client. We will ingest Workflows, their jobs, and their run status.
The user must have the following permissions for the ingestion to run successfully; the sketch after this list shows how each maps to a Boto3 call:
glue:ListWorkflows
glue:GetWorkflow
glue:GetJobRuns
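The snippet below is an illustrative sketch only, not the connector's actual code; it shows the kind of Boto3 calls each permission enables, with the region as a placeholder.

```python
import boto3

# Illustrative sketch of the Boto3 calls behind each permission
# (not the connector's actual implementation); the region is a placeholder.
glue = boto3.client("glue", region_name="us-east-2")

for workflow_name in glue.list_workflows()["Workflows"]:  # glue:ListWorkflows
    # Fetch the workflow definition, including its graph of jobs and triggers
    workflow = glue.get_workflow(Name=workflow_name, IncludeGraph=True)["Workflow"]  # glue:GetWorkflow
    for node in workflow.get("Graph", {}).get("Nodes", []):
        if node.get("Type") == "JOB":
            # Fetch the run status history for each job in the workflow
            job_runs = glue.get_job_runs(JobName=node["Name"])["JobRuns"]  # glue:GetJobRuns
```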
Python Requirements
To run the Glue ingestion, you will need to install the ingestion framework with the Glue plugin, typically via `pip3 install "openmetadata-ingestion[glue]"`.
Metadata Ingestion
All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to Glue.
In order to create and run a Metadata Ingestion workflow, we will follow the steps to create a YAML configuration that can connect to the source, process the Entities if needed, and reach the OpenMetadata server.
The workflow is modeled around the following JSON Schema.
1. Define the YAML Config
This is a sample config for Glue:
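The sketch below follows the standard OpenMetadata YAML layout for a pipeline service; the service name, credentials, and JWT token are placeholders. Refer to the Glue JSON Schema mentioned above for the authoritative list of fields.

```yaml
source:
  type: glue
  serviceName: local_glue
  serviceConnection:
    config:
      type: Glue
      awsConfig:
        awsAccessKeyId: KEY          # placeholder
        awsSecretAccessKey: SECRET   # placeholder
        awsRegion: us-east-2         # placeholder
  sourceConfig:
    config:
      type: PipelineMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"    # placeholder
```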
Source Configuration - Service Connection
awsAccessKeyId: Enter your secure access key ID for your Glue connection. The specified key ID should be authorized to read all databases you want to include in the metadata ingestion workflow.
awsSecretAccessKey: Enter the Secret Access Key (the passcode key pair to the key ID from above).
awsRegion: Enter the region of the Amazon cluster that your data and account are associated with.
awsSessionToken: The AWS session token is an optional parameter. If you want, enter the details of your temporary session token.
endPointURL: Your Glue connector will automatically determine the AWS Glue endpoint URL based on the region. You may override this behavior by entering a value for the endpoint URL.
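Taken together, these options live under the awsConfig block of the service connection. A minimal sketch with placeholder values:

```yaml
serviceConnection:
  config:
    type: Glue
    awsConfig:
      awsAccessKeyId: KEY
      awsSecretAccessKey: SECRET
      awsRegion: us-east-2
      # Optional: only required when using temporary credentials
      awsSessionToken: TOKEN
      # Optional: override the region-derived Glue endpoint
      endPointURL: https://glue.us-east-2.amazonaws.com/
```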
Source Configuration - Source Config
The sourceConfig is defined here:
dbServiceNames: Database Service Name for the creation of lineage, if the source supports it.
includeTags: Set the 'Include Tags' toggle to control whether to include tags as part of metadata ingestion.
markDeletedPipelines: Set the Mark Deleted Pipelines toggle to flag pipelines as soft-deleted if they are not present anymore in the source system.
pipelineFilterPattern and chartFilterPattern: Note that the pipelineFilterPattern and chartFilterPattern both support regex as include or exclude.
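As an example, a sourceConfig block using these options might look like the following sketch; the database service name and regex values are placeholders.

```yaml
sourceConfig:
  config:
    type: PipelineMetadata
    dbServiceNames: ["local_mysql"]   # placeholder database service for lineage
    includeTags: true
    markDeletedPipelines: true
    pipelineFilterPattern:
      includes:
        - ".*etl.*"                   # placeholder regex
      excludes:
        - ".*tmp.*"                   # placeholder regex
```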
Sink Configuration
To send the metadata to OpenMetadata, it needs to be specified as type: metadata-rest.
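In the YAML, the sink section typically looks like this:

```yaml
sink:
  type: metadata-rest
  config: {}
```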
Workflow Configuration
The main property here is the openMetadataServerConfig, where you can define the host and security provider of your OpenMetadata installation.
For a simple, local installation using our docker containers, this looks like:
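The snippet below is a sketch; the jwtToken value is a placeholder for your ingestion bot's token.

```yaml
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"   # placeholder
```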
Workflow Configs for Security Provider
We support different security providers. You can find their definitions here.
OpenMetadata JWT Auth
- JWT tokens will allow your clients to authenticate against the OpenMetadata server. To enable JWT tokens, you can find more details here.
- You can refer to the JWT Troubleshooting section for any issues in your JWT configuration. If you need information on configuring the ingestion with other security providers for your bots, you can follow this doc.
2. Prepare the Ingestion DAG
Create a Python file in your Airflow DAGs directory with the following contents:
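The sketch below follows the standard OpenMetadata ingestion recipe for Airflow; the DAG id, owner, schedule, and the embedded configuration are placeholders to adapt, and it assumes the Workflow import path from the 0.12/0.13 ingestion package.

```python
import yaml
from datetime import timedelta

from airflow import DAG
from airflow.utils.dates import days_ago

try:
    from airflow.operators.python import PythonOperator
except ModuleNotFoundError:
    from airflow.operators.python_operator import PythonOperator

from metadata.ingestion.api.workflow import Workflow

# Default arguments applied to every task in the DAG (placeholders to adapt)
default_args = {
    "owner": "user_name",
    "email": ["username@org.com"],
    "email_on_failure": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60),
}

# Paste the YAML configuration prepared in step 1 here
config = """
<your YAML configuration>
"""


def metadata_ingestion_workflow():
    # Load the YAML config, build the Workflow, run it, report status, and stop
    workflow_config = yaml.safe_load(config)
    workflow = Workflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()


with DAG(
    "glue_pipeline_ingestion",           # placeholder DAG id
    default_args=default_args,
    description="An example DAG which runs an OpenMetadata ingestion workflow",
    start_date=days_ago(1),
    is_paused_upon_creation=False,
    schedule_interval="*/5 * * * *",     # placeholder schedule
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```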
Import necessary modules
The Workflow class that is being imported is part of the metadata ingestion framework, and it defines the process of getting data from different sources and ingesting it into a central metadata repository.
Here we are also importing all the basic requirements to parse YAMLs, handle dates, and build our DAG.
Default arguments for all tasks in the Airflow DAG.
- The default_args dictionary contains default arguments for all tasks in the DAG, including the owner's name, email address, number of retries, retry delay, and execution timeout.
- config: Specifies the config for the metadata ingestion as we prepared above.
- metadata_ingestion_workflow(): This code defines a function metadata_ingestion_workflow() that loads a YAML configuration, creates a Workflow object, executes the workflow, checks its status, prints the status to the console, and stops the workflow.
- DAG: creates a DAG using the Airflow framework, and tunes the DAG configuration to whatever fits your requirements.
- For more details on creating Airflow DAGs, visit here.
Note that from connector to connector, this recipe will always be the same. By updating the YAML configuration, you will be able to extract metadata from different sources.