feat(mm): support customizable cache key for load_model_from_path()
#7987
Summary
When loading/caching a model, we use its file path as the cache key.
Some models are initialized with parameters, so the path is not a unique identifier for the model. For these models, the cache key needs to include some additional uniquely identifying data.
This PR adds a `cache_key_extra: str | None = None` arg to the internal model manager `load_model_from_path()` method. It also adds the arg to the public invocation context methods that wrap the internal method. If provided, the string is appended to the model's path as its cache key.
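To make the shape of the change concrete, here is a minimal sketch, assuming a simplified signature (the real method on the model manager takes more parameters, and the exact way the extra string is joined to the path is an assumption):

```python
from pathlib import Path
from typing import Any, Callable

# Toy stand-in for the model manager's cache; the real cache also handles
# size limits, device placement, locking, etc.
_cache: dict[str, Any] = {}


def load_model_from_path(
    model_path: Path,
    loader: Callable[[Path], Any],
    cache_key_extra: str | None = None,
) -> Any:
    """Simplified sketch of the internal loader with the new argument."""
    # Default behaviour: the file path alone identifies the cached model.
    cache_key = model_path.as_posix()

    # New behaviour: callers whose models depend on init parameters append a
    # discriminating string so each parameterization gets its own cache entry.
    # (The ":" separator is an assumption made for this sketch.)
    if cache_key_extra is not None:
        cache_key = f"{cache_key}:{cache_key_extra}"

    if cache_key not in _cache:
        _cache[cache_key] = loader(model_path)
    return _cache[cache_key]
```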
Alternatives to this approach include:
1. Accept a `Hashable` instead of a plain string. Use the hashable as the cache key when provided, or create a new object that includes the model path and the extra hashable and use that as the key (see the sketch below). This approach may require deeper changes, because there may be something that relies on the cache keys being strings.

I'm sure there are other strategies that would work.
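As a rough illustration of the hashable-key idea (hypothetical; not what this PR implements):

```python
from pathlib import Path
from typing import Hashable


# Hypothetical: the cache key becomes (path, extra) instead of a plain string.
# Anything hashable works as a dict key, so the cache's internal dict would not
# need to change, but any code that assumes str keys (logging, serialization,
# key prefixes, etc.) would have to be audited.
def make_cache_key(model_path: Path, extra: Hashable | None = None) -> Hashable:
    if extra is None:
        return model_path.as_posix()
    return (model_path.as_posix(), extra)
```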
Out of the 4 approaches (this PR + the alternatives), I like Alternative 1 the most. However, I haven't reviewed the cache logic thoroughly yet and am not confident that changing the key to a hashable won't break something. It'll probably be fine, perhaps requiring minor changes, but I haven't investigated yet.
Related Issues / Discussions
https://discord.com/channels/1020123559063990373/1361450510812450907
QA Instructions
There are no functional changes to existing code paths/nodes. This is purely in support of community nodes that want to load models and take advantage of the model manager's caching.
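For reference, a hedged sketch of how a community node might use the new argument; the wrapper name on `context.models` and the loader signature are assumptions, not necessarily the actual API added by this PR:

```python
from pathlib import Path


class MyModel:
    """Stand-in for a community model whose behaviour depends on an init parameter."""

    def __init__(self, path: Path, scale: int) -> None:
        self.path = path
        self.scale = scale


def load_both_variants(context, model_path: Path):
    # Same file, two parameterizations. Without distinct cache_key_extra values,
    # the second call would be served the first call's cached model.
    a = context.models.load_model_from_path(
        model_path, loader=lambda p: MyModel(p, scale=1), cache_key_extra="scale=1"
    )
    b = context.models.load_model_from_path(
        model_path, loader=lambda p: MyModel(p, scale=2), cache_key_extra="scale=2"
    )
    return a, b
```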
Merge Plan
n/a
Checklist
What's New copy (if doing a release after this PR)