Schema-Flexible Configuration with JDBC (Apache Hive 6.0) External System

We've successfully configured an external system with Connection type: JDBC and DBMS: Apache Hive 6.0.

I’m looking for guidance on how to set up a Schema-Flexible configuration. Specifically:

  • Is it possible to use Schema-Flexible JDBC connections to support workflows that export data to multiple schemas?
  • For example, consider two databases:
    • Database A with tables T1 and T2
    • Database B with tables T3 and T4
  • My workflow needs to export data into both A.T1 and B.T4. Can this be achieved using a single Schema-Flexible JDBC external connection? (A rough sketch of the intent follows this list.)
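To make the intent concrete, here is a minimal plain-JDBC sketch of what the workflow would ultimately need to do against HiveServer2: write to two database-qualified tables over one connection. The host, port, credentials, and the column layout of T1 and T4 are placeholders, and this is independent of whatever SQL the Schema-Flexible connection generates internally.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class MultiSchemaExport {
    public static void main(String[] args) throws Exception {
        // Hypothetical HiveServer2 URL and credentials -- replace with real connection details.
        String url = "jdbc:hive2://hive-host:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement()) {

            // Export a row into table T1 of database A by qualifying the table with its database name.
            // Column values (an int and a string) are assumed for illustration only.
            stmt.executeUpdate("INSERT INTO TABLE A.T1 VALUES (1, 'row for A.T1')");

            // Reuse the same connection/session to export into table T4 of database B.
            stmt.executeUpdate("INSERT INTO TABLE B.T4 VALUES (2, 'row for B.T4')");
        }
    }
}
```

If a single Schema-Flexible connection can issue statements like these, with the database qualifier resolved per export target, that would cover our use case.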

Any insights or examples would be greatly appreciated!