This page is about running the Ingestion Framework externally!
There are two main ways of running the ingestion:
- Internally, by managing the workflows from OpenMetadata.
- Externally, by using any other tool capable of running Python code.
If you are looking for how to manage the ingestion process from OpenMetadata, you can follow this doc.
Run the ingestion from AWS MWAA
When running ingestion workflows from MWAA we have three approaches:
- Install the openmetadata-ingestion package as a requirement in the Airflow environment. We will then run the process using a PythonOperator.
- Configure an ECS cluster and run the ingestion as an ECSOperator.
- Install a plugin and run the ingestion with the PythonVirtualenvOperator.
We will now discuss the pros and cons of each approach and how to configure them.
Ingestion Workflows as a Python Operator
PROs
- It is the simplest approach
- We don’t need to spin up any further infrastructure
CONs
- We need to install the openmetadata-ingestion package in the MWAA environment
- The installation can clash with existing libraries
- Upgrading the OpenMetadata version requires repeating the installation process
To install the package, we need to update the `requirements.txt` file from the MWAA environment to add the following line:
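For example, where the extras in brackets and the version are placeholders for your connectors and server version:

```
openmetadata-ingestion[<plugin>]==x.y.z
```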
Where `x.y.z` is the version of the OpenMetadata ingestion package. Note that the version needs to match the server version: if we are using the server at 1.3.1, then the ingestion package needs to also be 1.3.1.

The plugin parameter is a list of the sources that we want to ingest. An example would look like `openmetadata-ingestion[mysql,snowflake,s3]==1.3.1`.
A DAG deployed using a Python Operator would then look like the following.
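Here is a minimal sketch of such a DAG for a metadata workflow; the DAG name, schedule, owner, and the `config` placeholder are illustrative and should be replaced with your own values:

```python
import yaml
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

# Workflow class for a plain metadata ingestion (see the class list below for other types)
from metadata.workflow.metadata import MetadataWorkflow

default_args = {
    "owner": "user_name",
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    "execution_timeout": timedelta(minutes=60),
}

# Your ingestion YAML configuration, as documented for each connector
config = """
<your YAML configuration>
"""


def metadata_ingestion_workflow():
    # Parse the YAML and run the workflow end to end
    workflow_config = yaml.safe_load(config)
    workflow = MetadataWorkflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()


with DAG(
    "openmetadata_ingestion",
    default_args=default_args,
    description="Sample OpenMetadata ingestion DAG",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(
        task_id="ingest_using_recipe",
        python_callable=metadata_ingestion_workflow,
    )
```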
Where you can update the YAML configuration and workflow classes accordingly. Further examples on how to run the ingestion can be found on the documentation (e.g., Snowflake).
Ingestion Workflow classes
We have different classes for different types of workflows. The logic is always the same, but you will need to change your import path. The rest of the method calls will remain the same.
For example, for the `Metadata` workflow we'll use:
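```python
from metadata.workflow.metadata import MetadataWorkflow
```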
The classes for each workflow type are:
- `Metadata`: `from metadata.workflow.metadata import MetadataWorkflow`
- `Lineage`: `from metadata.workflow.metadata import MetadataWorkflow` (same as Metadata)
- `Usage`: `from metadata.workflow.usage import UsageWorkflow`
- `dbt`: `from metadata.workflow.metadata import MetadataWorkflow`
- `Profiler`: `from metadata.workflow.profiler import ProfilerWorkflow`
- `Data Quality`: `from metadata.workflow.data_quality import TestSuiteWorkflow`
- `Data Insights`: `from metadata.workflow.data_insight import DataInsightWorkflow`
- `Elasticsearch Reindex`: `from metadata.workflow.metadata import MetadataWorkflow` (same as Metadata)
Ingestion Workflows as an ECS Operator
PROs
- Completely isolated environment
- Easy to update each version
CONs
- We need to set up an ECS cluster and the required policies in MWAA to connect to ECS and handle Log Groups.
We will now describe the steps, following the official AWS documentation.
1. Create an ECS Cluster & Task Definition
- The cluster needs a task to run in FARGATE mode.
- The required image is `docker.getcollate.io/openmetadata/ingestion-base:x.y.z`.
  - The same logic as above applies: the `x.y.z` version needs to match the server version. For example, `docker.getcollate.io/openmetadata/ingestion-base:1.3.1`.
We have tested this process with a Task Memory of 512MB and Task CPU (unit) of 256. This can be tuned depending on the amount of metadata that needs to be ingested.
When creating the Task Definition, take note of the log groups assigned, as we will need them to prepare the MWAA Executor Role policies.
For example, if in the JSON from the Task Definition we see:
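A sketch of the relevant fragment, assuming the standard awslogs driver; only the log group name matters for the policies:

```json
"logConfiguration": {
    "logDriver": "awslogs",
    "options": {
        "awslogs-group": "/ecs/openmetadata",
        "awslogs-region": "<region>",
        "awslogs-stream-prefix": "ecs"
    }
}
```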
We'll need to use the `/ecs/openmetadata` log group below when configuring the policies.
2. Task Definition ARN & Networking
- From the AWS Console, copy your task definition ARN. It will look something like this: `arn:aws:ecs:<region>:<account>:task-definition/<name>:<revision>`.
- Get the network details on where the task should execute. We will be using a JSON like:
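A sketch of that JSON, with placeholder subnet and security group IDs, following the `awsvpcConfiguration` structure the ECS operator expects:

```json
{
    "awsvpcConfiguration": {
        "subnets": ["subnet-xxxxxxxxxxxxxxxxx", "subnet-yyyyyyyyyyyyyyyyy"],
        "securityGroups": ["sg-zzzzzzzzzzzzzzzzz"],
        "assignPublicIp": "ENABLED"
    }
}
```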
If you want to extract MWAA metadata, add the VPC, subnets and security groups used when setting up MWAA. We need to be in the same network environment as MWAA to reach the underlying database.
3. Update MWAA Executor Role policies
- Identify your MWAA executor role. This can be obtained from the details view of your MWAA environment.
- Add the following two policies to the role, the first with ECS permissions:
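A sketch of what the ECS policy may look like, following the AWS tutorial for running ECS tasks from MWAA; you can scope the resources down to your cluster and task definition if needed:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowRunTask",
            "Effect": "Allow",
            "Action": [
                "ecs:RunTask",
                "ecs:DescribeTasks"
            ],
            "Resource": "*"
        },
        {
            "Sid": "AllowPassRoleToEcsTasks",
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "*",
            "Condition": {
                "StringLike": {
                    "iam:PassedToService": "ecs-tasks.amazonaws.com"
                }
            }
        }
    ]
}
```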
And for the Log Group permissions
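A sketch of the Log Group policy, with placeholders for the region, account and log group names you noted above:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowLogGroupAccess",
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:CreateLogGroup",
                "logs:PutLogEvents",
                "logs:GetLogEvents",
                "logs:GetLogRecord",
                "logs:GetLogGroupFields",
                "logs:GetQueryResults"
            ],
            "Resource": [
                "arn:aws:logs:<region>:<account-id>:log-group:/ecs/openmetadata:*",
                "arn:aws:logs:<region>:<account-id>:log-group:airflow-<mwaa-environment-name>-*"
            ]
        }
    ]
}
```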
Note how you need to replace the `region`, `account-id`, and log group names for your Airflow environment and ECS.
4. Prepare the DAG
A DAG created using the ECS Operator will then look like this:
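Here is a sketch of such a DAG, assuming Airflow < 2.5 (so `ECSOperator`) and assuming the ingestion-base image picks up the YAML from a `config` environment variable and the workflow type from `pipelineType` (check the official example for the exact contract). Cluster, task definition, container name, and network values are placeholders:

```python
from datetime import datetime, timedelta

from airflow import DAG
# For Airflow > 2.5, import EcsRunTaskOperator instead (see below)
from airflow.providers.amazon.aws.operators.ecs import ECSOperator

# Your ingestion YAML configuration, as documented for each connector
config = """
<your YAML configuration>
"""

default_args = {
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    "ecs_ingestion_example",
    default_args=default_args,
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ecs_operator_task = ECSOperator(
        task_id="ecs_operator_task",
        cluster="<your-cluster-name>",
        task_definition="<your-task-definition-name>",
        launch_type="FARGATE",
        overrides={
            "containerOverrides": [
                {
                    "name": "<your-container-name>",
                    "command": ["python", "main.py"],
                    "environment": [
                        {"name": "config", "value": config},
                        {"name": "pipelineType", "value": "metadata"},
                    ],
                }
            ]
        },
        network_configuration={
            "awsvpcConfiguration": {
                "subnets": ["subnet-xxxxxxxxxxxxxxxxx"],
                "securityGroups": ["sg-zzzzzzzzzzzzzzzzz"],
                "assignPublicIp": "ENABLED",
            }
        },
        # Log group and stream prefix from the Task Definition noted above
        awslogs_group="/ecs/openmetadata",
        awslogs_stream_prefix="ecs",
    )
```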
Note that depending on the kind of workflow you will be deploying, the YAML configuration will need to be updated following the official OpenMetadata docs, and the value of the `pipelineType` configuration will need to hold one of the following values:
- `metadata`
- `usage`
- `lineage`
- `profiler`
- `TestSuite`
These values are based on the `PipelineType` JSON Schema definitions.
Moreover, one of the imports will depend on the MWAA Airflow version you are using:
- If using Airflow < 2.5: `from airflow.providers.amazon.aws.operators.ecs import ECSOperator`
- If using Airflow > 2.5: `from airflow.providers.amazon.aws.operators.ecs import EcsRunTaskOperator`

Make sure to update the `ecs_operator_task` task call accordingly.
Ingestion Workflows as a Python Virtualenv Operator
PROs
- Installation does not clash with existing libraries
- Simpler than ECS
CONs
- We need to install an additional plugin in MWAA
- DAGs take longer to run, since the virtualenv needs to be set up from scratch for each run.
We need to update the `requirements.txt` file from the MWAA environment to add the following line:
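The line in question is presumably just the `virtualenv` package, which the `PythonVirtualenvOperator` relies on to build its environments:

```
virtualenv
```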
Then, we need to set up a custom plugin in MWAA. Create a file named `virtual_python_plugin.py`. Note that you may need to update the Python version (e.g., python3.7 -> python3.10) depending on what your MWAA environment is running.
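A sketch of what that plugin may look like, based on the AWS sample referenced below; it monkey-patches how Airflow builds the virtualenv command so that it uses the `virtualenv` module installed under the MWAA `.local` path (adjust `python3.7` to your environment's Python version):

```python
from typing import List

import airflow.utils.python_virtualenv
from airflow.plugins_manager import AirflowPlugin


def _generate_virtualenv_cmd(
    tmp_dir: str, python_bin: str, system_site_packages: bool
) -> List[str]:
    # Point Airflow at the virtualenv module installed via requirements.txt
    cmd = [
        "python3",
        "/usr/local/airflow/.local/lib/python3.7/site-packages/virtualenv",
        tmp_dir,
    ]
    if system_site_packages:
        cmd.append("--system-site-packages")
    if python_bin is not None:
        cmd.append(f"--python={python_bin}")
    return cmd


# Replace Airflow's default virtualenv command generation
airflow.utils.python_virtualenv._generate_virtualenv_cmd = _generate_virtualenv_cmd


class VirtualPythonPlugin(AirflowPlugin):
    name = "virtual_python_plugin"
```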
This is modified from the AWS sample.
Next, create the plugins.zip file and upload it according to AWS docs. You will also need to disable lazy plugin loading in MWAA.
A DAG deployed using the PythonVirtualenvOperator would then look like:
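Here is a sketch of such a DAG; the requirements list mirrors the extras example from the paragraph below and should be adjusted, together with the schedule and DAG name, to your setup. Note how every import the callable needs lives inside it:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonVirtualenvOperator

default_args = {
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
}


def metadata_ingestion_workflow():
    # Entirely self-contained: imports and configuration live inside the function,
    # since it runs on its own inside the virtualenv.
    import yaml
    from metadata.workflow.metadata import MetadataWorkflow

    config = """
<your YAML configuration>
"""

    workflow_config = yaml.safe_load(config)
    workflow = MetadataWorkflow.create(workflow_config)
    workflow.execute()
    workflow.raise_from_status()
    workflow.print_status()
    workflow.stop()


with DAG(
    "virtualenv_ingestion_example",
    default_args=default_args,
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonVirtualenvOperator(
        task_id="ingest_using_recipe",
        requirements=[
            # Extras and version must match your server; add the Airflow
            # providers you need, pinned from the matching constraints file.
            "openmetadata-ingestion[mysql,snowflake,s3]==0.12.2.2",
        ],
        system_site_packages=False,
        python_callable=metadata_ingestion_workflow,
    )
```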
Where you can update the YAML configuration and workflow classes accordingly. Further examples on how to run the ingestion can be found on the documentation (e.g., Snowflake).
You will also need to determine the OpenMetadata ingestion extras and Airflow providers you need. Note that the OpenMetadata version needs to match the server version. If we are using the server at 0.12.2, then the ingestion package needs to also be 0.12.2. An example of the extras would look like `openmetadata-ingestion[mysql,snowflake,s3]==0.12.2.2`. For Airflow providers, you will want to pull the provider versions from the matching constraints file. Since this example installs Airflow Providers v2.4.3 on Python 3.7, we use that constraints file.
Also note that the ingestion workflow function must be entirely self-contained as it will run by itself in the virtualenv. Any imports it needs, including the configuration, must exist within the function itself.
Ingestion Workflow classes
We have different classes for different types of workflows. The logic is always the same, but you will need to change your import path. The rest of the method calls will remain the same.
For example, for the `Metadata` workflow we'll use:
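```python
from metadata.workflow.metadata import MetadataWorkflow
```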
The classes for each workflow type are:
- `Metadata`: `from metadata.workflow.metadata import MetadataWorkflow`
- `Lineage`: `from metadata.workflow.metadata import MetadataWorkflow` (same as Metadata)
- `Usage`: `from metadata.workflow.usage import UsageWorkflow`
- `dbt`: `from metadata.workflow.metadata import MetadataWorkflow`
- `Profiler`: `from metadata.workflow.profiler import ProfilerWorkflow`
- `Data Quality`: `from metadata.workflow.data_quality import TestSuiteWorkflow`
- `Data Insights`: `from metadata.workflow.data_insight import DataInsightWorkflow`
- `Elasticsearch Reindex`: `from metadata.workflow.metadata import MetadataWorkflow` (same as Metadata)