Until now (textX 3.0), we have the `ImportURI` scope provider base class, which allows using the special attribute `importURI` to load imported models; e.g., see this example.
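For illustration, here is a minimal sketch (assuming a made-up DSL and file names; this is not the linked example) of how an `importURI` attribute in the grammar, combined with an ImportURI-based scope provider, lets one model file reference elements defined in another file:

```python
# Minimal sketch: grammar and file names are made up for illustration.
from textx import metamodel_from_str
from textx.scoping.providers import FQNImportURI

grammar = r'''
Model: imports*=Import entities*=Entity uses*=Use;
Import: 'import' importURI=STRING;      // the special attribute
Entity: 'entity' name=ID;
Use: 'use' entity=[Entity|FQN];
FQN: ID('.'ID)*;
'''

mm = metamodel_from_str(grammar)
# Register an ImportURI-based provider so that cross-file references
# are resolved in the imported models as well.
mm.register_scope_providers({"*.*": FQNImportURI()})

# main.mydsl could then contain:  import "lib.mydsl"  ...  use SomeEntity
# model = mm.model_from_file("main.mydsl")
```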
Specializations of `ImportURI` (scope providers that allow importing models):
- http://textx.github.io/textX/3.0/scoping/#scope-providers-defined-in-module-textxscopingproviders
- http://textx.github.io/textX/3.0/rrel/#rrel-and-multi-files-model
How does the base class `ImportURI` work?
- When loading a model, prior to reference resolution, all imported models are loaded. This happens in `model.py` by checking if the scope provider is a `ModelLoader` instance: here and here.
- `ImportURI` has a specific way to detect imports: it scans the model for attributes named `importURI` to load models (this could be implemented in a different way, e.g., by using other attributes, but we do not do this in any unit test).
- `ImportURI` has a specific way to load models: they are stored in special `_tx` attributes of the model (or the metamodel). This "special way" of loading models is used by our multi-model scope providers and tool functions (see, e.g., http://textx.github.io/textX/3.0/scoping/#included-model-retrieval), but could be implemented differently.
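To make this flow concrete, here is an illustrative paraphrase of the steps above (this is not the actual textX source; the provider container passed in and the `load_models` call are assumptions based on the description):

```python
# Illustrative paraphrase of the loading flow -- NOT the actual textX code.
from textx.scoping import ModelLoader

def load_imported_models(model, scope_providers):
    # Before reference resolution, every registered scope provider that is
    # a ModelLoader gets the chance to load the models it needs.
    for provider in scope_providers.values():
        if isinstance(provider, ModelLoader):
            # ImportURI-derived providers scan the model for attributes
            # named "importURI", load the referenced files, and keep them
            # in special _tx attributes of the model/metamodel.
            provider.load_models(model)
```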
I would like to start a discussion on whether we should narrow the possibilities to load models into the model (here and here):
- Instead of calling each `ModelLoader` scope provider separately (which redundantly tries to load and reload all imported models if more than one such scope provider is registered), we could find all `importURI` attributes just once (fixed logic) and load all imported models, like we do now in the `ImportURI` scope provider; see the sketch after this list.
- All scope providers would then have to be analyzed to decide whether they should look up cross-model references by default or always (e.g., `+m` in RREL strings or the `FQN` scope provider).
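Purely as a thought experiment, a hypothetical sketch of that fixed logic could look like this (the `loaded_models` dict stands in for a model repository; relative-path resolution and other details are deliberately omitted):

```python
# Hypothetical sketch of the proposed fixed logic: one pass over the model
# that collects every "importURI" attribute and loads each referenced file
# exactly once, independent of how many ModelLoader providers are registered.

def load_imports_once(model, metamodel, loaded_models):
    """Single pass: find all importURI attributes and load each URI once."""
    def walk(obj):
        if hasattr(obj, "importURI") and obj.importURI not in loaded_models:
            loaded_models[obj.importURI] = metamodel.model_from_file(
                obj.importURI)
        for name, value in vars(obj).items():
            if name == "parent" or name.startswith("_tx"):
                continue  # skip backlinks and internal textX attributes
            for child in (value if isinstance(value, list) else [value]):
                if child is not None and hasattr(child, "__dict__"):
                    walk(child)
    walk(model)
```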
This is a BIC (backward-incompatible change) in any case! We should think about it carefully at some point... But I think, if done correctly, this BIC should have no (or nearly no) impact on "normal users".
I am looking forward to discussing this topic with the users of textX! We could also move it to the discussions section, but since I expect some actions to follow from this discussion, I started it as an issue... @igordejanovic feel free to move this issue...