Is it possible to configure two DLT pipelines to target the same schema, given that each will write to different tables? Could this cause issues with the consistency of the metadata that the DLT pipelines write?
If you're publishing into a schema in the Hive Metastore, the tables are just registered there as external tables (i.e. `create table ... location '...'`); no actual data is stored in the schema itself. Checkpoints and data are stored under the storage location you configured for each pipeline. So it's possible to have multiple pipelines writing into the same Hive Metastore schema.
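As a minimal sketch, the two pipelines' source notebooks could each define only their own tables while both pipelines are configured (in their pipeline settings, not in the code) to publish to the same schema. All table names and paths below are placeholders:

```python
# Pipeline A notebook -- defines only the tables owned by pipeline A.
# `spark` is provided by the DLT runtime; the target schema is set in
# the pipeline settings, not here. Names and paths are illustrative.
import dlt

@dlt.table(name="orders_bronze", comment="Raw orders loaded by pipeline A")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/orders")          # illustrative source path
    )

# Pipeline B notebook (a separate notebook attached to a separate pipeline)
# defines a different table but publishes to the same target schema.
@dlt.table(name="customers_bronze", comment="Raw customers loaded by pipeline B")
def customers_bronze():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/customers")       # illustrative source path
    )
```

Because each pipeline owns a disjoint set of tables, they don't compete for the same table metadata; each pipeline only registers its own tables in the shared schema.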
If you publish to a schema in Unity Catalog, two pipelines can also publish to the same schema, but this isn't explicitly stated in the documentation, so in theory it may change in the future.
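For illustration, the settings of two Unity Catalog pipelines would differ only in name and source notebook while sharing the same `catalog` and `target` schema. The values below are placeholders, shown as Python dicts mirroring the pipeline settings JSON:

```python
# Illustrative settings for two DLT pipelines publishing to the same UC schema.
# Field names follow the DLT pipeline settings layout; all values are placeholders.
pipeline_a_settings = {
    "name": "orders-pipeline",
    "catalog": "main",              # same catalog...
    "target": "shared_schema",      # ...and same target schema
    "libraries": [{"notebook": {"path": "/Repos/project/pipeline_a"}}],
}

pipeline_b_settings = {
    "name": "customers-pipeline",
    "catalog": "main",
    "target": "shared_schema",      # shared with pipeline A, but different tables
    "libraries": [{"notebook": {"path": "/Repos/project/pipeline_b"}}],
}
```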