How to Run the Connector Externally
To run the ingestion via the UI you’ll need to use the OpenMetadata Ingestion Container, which comes shipped with custom Airflow plugins to handle the workflow deployment. If, instead, you want to manage your workflows externally on your preferred orchestrator, check the following docs to run the Ingestion Framework anywhere.

External Schedulers
Get more information about running the Ingestion Framework Externally
Requirements
Delta Lake requires Python 3.9 or 3.10 to run. We do not yet support the Delta connector on Python 3.11.

Python Requirements
To run the Delta Lake ingestion, you will need to install:
- If extracting from a metastore
- If extracting directly from the storage
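The install commands themselves did not survive extraction. As a sketch, the Ingestion Framework ships separate extras for the two modes; the extras names `deltalake-spark` and `deltalake-storage` are assumptions and may differ in your release:

```shell
# If extracting from a metastore (assumed extra name: deltalake-spark)
pip3 install "openmetadata-ingestion[deltalake-spark]"

# If extracting directly from the storage (assumed extra name: deltalake-storage)
pip3 install "openmetadata-ingestion[deltalake-storage]"
```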
Metadata Ingestion
All connectors are defined as JSON Schemas. Here you can find the structure to create a connection to Delta Lake. In order to create and run a Metadata Ingestion workflow, we will follow these steps: create a YAML configuration able to connect to the source, process the Entities if needed, and reach the OpenMetadata server. The workflow is modeled around the following JSON Schema.

1. Define the YAML Config
Source Configuration - From Metastore
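A minimal sketch of a metastore-based configuration, assuming a Hive metastore reachable over Thrift; all values are placeholders, and the exact field names should be checked against the JSON Schema linked above for your release:

```yaml
source:
  type: deltalake
  serviceName: "<service name>"
  serviceConnection:
    config:
      type: DeltaLake
      metastoreConnection:
        # Thrift host and port of the Hive metastore
        metastoreHostPort: "localhost:9083"
      appName: MyApp
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"
```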
Source Configuration - From Storage - S3
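Likewise, a sketch for reading Delta tables directly from an S3 bucket. The `configSource` / `securityConfig` field names are assumptions based on the storage-based connection schema and may vary by release; all values are placeholders:

```yaml
source:
  type: deltalake
  serviceName: "<service name>"
  serviceConnection:
    config:
      type: DeltaLake
      configSource:
        connection:
          securityConfig:
            awsAccessKeyId: "<aws access key id>"
            awsSecretAccessKey: "<aws secret access key>"
            awsRegion: "<aws region>"
        bucketName: "<bucket name>"
        prefix: "<prefix to the Delta tables>"
  sourceConfig:
    config:
      type: DatabaseMetadata
sink:
  type: metadata-rest
  config: {}
workflowConfig:
  openMetadataServerConfig:
    hostPort: "http://localhost:8585/api"
    authProvider: openmetadata
    securityConfig:
      jwtToken: "{bot_jwt_token}"
```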
2. Run with the CLI
First, we will need to save the YAML file. Afterward, and with all requirements installed, we can run:
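A sketch of the invocation, assuming the `metadata` CLI installed by the Ingestion Framework and the YAML saved locally (the filename is a placeholder):

```shell
# Run the ingestion workflow using the saved YAML configuration
metadata ingest -c ./deltalake.yaml
```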
dbt Integration
Learn more about how to ingest dbt models