Integrations
MetaKraftwerk supports various data integration platforms and technologies, each offering different capabilities for ETL and ELT processes. This section provides an overview of supported integrations and their specific characteristics.
ETL vs. ELT
Patterns exist for both ETL and ELT approaches, and the benefit is the same in either case: they establish a standard for processes and pipelines within a data platform.
ETL (Extract, Transform, Load) Patterns: Patterns for an ETL technology (such as Informatica or Azure Data Factory) define processes within these ETL platforms; each pattern includes at least one pipeline with at least one data source and one data target. A pattern can also consist of multiple pipelines and/or data objects.
ELT (Extract, Load, Transform) Patterns: Patterns for an ELT technology (such as Snowflake, Databricks, or Azure Synapse Analytics) focus on loading raw data first and then applying transformations within the target data warehouse or data lake (SQL, notebooks, jobs). Patterns for ELT often include DDL/SQL templates, orchestration definitions, and notebook or job templates rather than tool-internal mappings.
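
To make this concrete, the sketch below shows what an ELT pattern's artifact bundle could look like. The structure, file names, and placeholder names are illustrative assumptions for this page, not a schema prescribed by MetaKraftwerk.

```python
# Illustrative sketch of an ELT pattern's artifact bundle. All names
# (pattern id, file names, placeholders) are hypothetical examples.
elt_pattern = {
    "pattern": "raw_to_staging",
    "artifacts": {
        "ddl": "create_staging_table.sql.tpl",
        "load": "copy_into_staging.sql.tpl",
        "transform": "staging_transform_notebook.py",
    },
    # Every artifact in the bundle must use these placeholders consistently.
    "placeholders": ["SRC_SCHEMA", "TARGET_SCHEMA", "TARGET_TABLE", "CONNECTION"],
}

# Example content of the DDL template, using the ${...} placeholder style.
DDL_TEMPLATE = """\
CREATE TABLE IF NOT EXISTS ${TARGET_SCHEMA}.${TARGET_TABLE} (
    id        INTEGER,
    payload   VARCHAR,
    loaded_at TIMESTAMP
);
"""
```
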
General rules when designing patterns for either approach:
- Keep placeholders consistent (e.g. ${SRC_SCHEMA}, ${TARGET_TABLE}, ${CONNECTION}); a substitution sketch follows this list.
- Use the export format expected by the development platform (XML, JSON, notebook, SQL, script).
- Provide example instance metadata and at least one test instantiation that is runnable in a development environment.
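
One way to satisfy these rules is to render the placeholders from instance metadata with Python's standard string.Template, whose ${...} syntax matches the convention above. The metadata values below are invented for illustration.

```python
from string import Template

# Example instance metadata; the values are made up.
instance_metadata = {
    "SRC_SCHEMA": "raw_sales",
    "TARGET_SCHEMA": "stg_sales",
    "TARGET_TABLE": "orders",
    "CONNECTION": "dev_snowflake",
}

sql_template = Template(
    "INSERT INTO ${TARGET_SCHEMA}.${TARGET_TABLE} "
    "SELECT * FROM ${SRC_SCHEMA}.${TARGET_TABLE};"
)

# substitute() raises KeyError for any missing placeholder, which doubles
# as a cheap consistency check before running a test instantiation.
print(sql_template.substitute(instance_metadata))
```
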
Supported / typical integrations
Below is a list of platforms commonly used with MetaKraftwerk. The list corresponds to the selection you see in the UI:

Informatica Intelligent Data Management Cloud (IDMC)
Cloud ETL/IDMC packages and task definitions. Patterns for IDMC typically contain cloud-appropriate connection placeholders and runtime parameters.

Databricks (DB)
Databricks is an ELT/execution platform (not a pure warehouse). Patterns for Databricks often include notebooks or jobs (Python/Scala/SQL) and orchestration templates. Provide a clear mapping between instance metadata and notebook parameters.
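
A minimal sketch of such a mapping, assuming the pattern's placeholder names are passed in as notebook widgets or job parameters; the parameter and table names are hypothetical, and dbutils/spark are only available inside a Databricks notebook.

```python
# Hypothetical Databricks notebook cell. `dbutils` and `spark` are provided
# by the notebook runtime and are not importable in plain Python.

# Declare widgets with defaults so the notebook also runs interactively.
dbutils.widgets.text("SRC_SCHEMA", "raw_sales")
dbutils.widgets.text("TARGET_SCHEMA", "stg_sales")
dbutils.widgets.text("TARGET_TABLE", "orders")

src_schema = dbutils.widgets.get("SRC_SCHEMA")
target_schema = dbutils.widgets.get("TARGET_SCHEMA")
target_table = dbutils.widgets.get("TARGET_TABLE")

# Materialise the target from the source; the actual transformation logic
# would come from the pattern's notebook template.
spark.sql(
    f"CREATE OR REPLACE TABLE {target_schema}.{target_table} "
    f"AS SELECT * FROM {src_schema}.{target_table}"
)
```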

Snowflake (SF)
Cloud data warehouse / ELT target. Patterns usually deliver DDL templates, copy/load scripts, stored procedures, and transformation SQL. Parameterise schema names, file locations, stages, and role/warehouse settings.
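
One way an instantiated Snowflake load script might be executed, sketched here with the snowflake-connector-python package; all connection values and object names are placeholders, not working settings.

```python
import snowflake.connector

# Connection values are illustrative; in practice they would come from the
# instantiated connection/role/warehouse parameters of the pattern.
conn = snowflake.connector.connect(
    account="my_account",
    user="deploy_user",
    password="***",
    role="LOADER_ROLE",
    warehouse="LOAD_WH",
    database="DEV_DB",
    schema="STG_SALES",
)

# A rendered load script; stage and table names are hypothetical.
rendered_copy = (
    "COPY INTO orders "
    "FROM @raw_stage/orders/ "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)

cur = conn.cursor()
try:
    cur.execute(rendered_copy)
finally:
    cur.close()
    conn.close()
```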

Azure Data Factory (ADF)
Orchestration / pipeline service. Patterns here include pipeline templates (ADF/ARM JSON), Spark notebooks, linked-service placeholders, and activity parameterisation.
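
A compact sketch of how a pattern might keep ${...} placeholders in the pipeline JSON and render them at instantiation time. The activity and dataset names are invented, and the JSON is a heavily simplified approximation of an ADF pipeline definition.

```python
import json
from string import Template

# Hypothetical, simplified ADF pipeline template kept as JSON text with
# ${...} placeholders filled in at instantiation time.
PIPELINE_TEMPLATE = Template("""\
{
  "name": "pl_load_${TARGET_TABLE}",
  "properties": {
    "activities": [
      {
        "name": "Copy_${TARGET_TABLE}",
        "type": "Copy",
        "inputs":  [{"referenceName": "ds_src_${TARGET_TABLE}", "type": "DatasetReference"}],
        "outputs": [{"referenceName": "ds_stg_${TARGET_TABLE}", "type": "DatasetReference"}]
      }
    ]
  }
}
""")

rendered = PIPELINE_TEMPLATE.substitute(TARGET_TABLE="orders")
pipeline = json.loads(rendered)  # sanity check: the rendered artifact is valid JSON
print(json.dumps(pipeline, indent=2))
```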

Azure Synapse Analytics (ASA)
Synapse pipelines, SQL scripts, and Spark notebooks, used both as an ELT and an orchestration platform. Patterns for ASA should indicate whether the compute is SQL pools or Spark and parameterise accordingly.

Microsoft Fabric (MFB / Fabric)
Fabric patterns can include pipelines, notebooks, and Lakehouse artifacts. Patterns should document the target workspace, lake paths, and compute configuration.

Informatica PowerCenter (PC)
Traditional on-prem ETL platform. Patterns typically include PowerCenter folder exports (mappings/workflows). Use placeholders for connection names, folder names, and target object names.

Informatica Data Engineering Integration (DEI) (Developer / BDM)
Developer-based ETL artifacts (mapping/pipeline definitions). The export/import format is XML.
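
Because the export/import format is XML, a simple pre-import check is to scan the rendered export for unresolved ${...} placeholders; a minimal sketch follows (the file name is a made-up example).

```python
import re
from pathlib import Path

# Hypothetical exported mapping; the path is an example.
export_text = Path("dei_mapping_export.xml").read_text(encoding="utf-8")

# Any surviving ${...} placeholder indicates an incomplete parameter map.
unresolved = sorted(set(re.findall(r"\$\{([A-Z_]+)\}", export_text)))
if unresolved:
    raise SystemExit(f"Unresolved placeholders in export: {unresolved}")
```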

Talend Open Studio (TOS)
Talend jobs and templates; patterns contain job templates plus parameter maps for environment-specific connections.

Oracle Data Integrator (ODI)
ETL patterns for the Oracle Data Integrator (ODI) platform.