Mirror of https://github.com/invoke-ai/InvokeAI, synced 2024-08-30 20:32:17 +00:00

Commit b76d2cd716
We had a one-behind issue when recalling metadata items that include a model. For example, when recalling LoRAs, we check against the current main model to decide whether the requested LoRA is compatible and may be recalled. When recalling all params, we are often also recalling the main model, but the compatibility logic didn't compare against this new main model. The logic is updated to check against the new main model, if one is being set. Closes #5512
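A minimal sketch of the fix's shape, in TypeScript since the recall logic lives in the web frontend (all type and function names below are hypothetical and do not reflect InvokeAI's actual code): when the recalled metadata also carries a main model, LoRA compatibility is checked against that incoming model, falling back to the currently selected model only when no new main model is being recalled.

```typescript
// Hypothetical types standing in for the model/LoRA metadata shapes.
type BaseModelType = 'sd-1' | 'sd-2' | 'sdxl';

interface MainModel {
  name: string;
  base: BaseModelType;
}

interface LoRA {
  name: string;
  base: BaseModelType;
}

// A LoRA is only compatible when its base architecture matches the main model's.
const isLoRACompatible = (lora: LoRA, mainModel: MainModel): boolean =>
  lora.base === mainModel.base;

// Recall all parameters from metadata. The key point of the fix: when the
// metadata also sets a main model, validate LoRAs against that *new* model,
// not the one currently selected in the UI (which caused the one-behind issue).
const recallAllParams = (
  metadata: { mainModel?: MainModel; loras: LoRA[] },
  currentMainModel: MainModel,
): { mainModel: MainModel; loras: LoRA[] } => {
  const effectiveMainModel = metadata.mainModel ?? currentMainModel;
  const recallableLoras = metadata.loras.filter((lora) =>
    isLoRACompatible(lora, effectiveMainModel),
  );
  return { mainModel: effectiveMainModel, loras: recallableLoras };
};
```

Previously, the equivalent of `currentMainModel` was used for the compatibility check even when `metadata.mainModel` was about to replace it, so LoRAs valid for the recalled model could be rejected (or stale ones accepted) based on the outgoing model.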
Directory listing:

- CLI
- install
- merge
- training
- web
- __init__.py
- legacy_launch_invokeai.py