InvokeAI/invokeai
psychedelicious 7ffaa17551 fix(ui): use prompt bug when prompt has colon
This bug is related to the format in which we stored prompts for some time: an array of weighted subprompts.

This caused some strife when recalling a prompt if the prompt had colons in it, due to our recently introduced handling of negative prompts.

Currently there is no need to store a prompt as anything other than a string, so we revert to doing that.

Compatibility with structured prompts is maintained via a helper hook (see the sketch below).
2023-02-22 20:33:58 +11:00
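A minimal sketch of the compatibility idea described in the commit message, not the actual InvokeAI code: the names `WeightedSubPrompt`, `StoredPrompt`, and `getPromptString` are hypothetical. A recalled prompt may arrive either as a plain string (the current format) or as the legacy array of weighted subprompts, and either form is collapsed to a single string.

```typescript
// Hypothetical types: the legacy format stored prompts as weighted subprompts.
type WeightedSubPrompt = { prompt: string; weight: number };
type StoredPrompt = string | WeightedSubPrompt[];

// Collapse either representation into a single prompt string so that older
// image metadata can still be recalled into the prompt box.
const getPromptString = (stored: StoredPrompt): string => {
  if (typeof stored === 'string') {
    return stored;
  }
  // Legacy format: re-serialize each weighted subprompt, writing an explicit
  // ":weight" suffix only when the weight differs from the default of 1.
  return stored
    .map(({ prompt, weight }) => (weight === 1 ? prompt : `${prompt}:${weight}`))
    .join(' ');
};
```

In a React UI, a small hook (the "helper hook" mentioned above) would typically wrap a conversion like this, memoizing the result before dispatching the recalled prompt into state.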
Name          Last commit                                      Date
assets        Various fixes                                    2023-01-30 18:42:17 -05:00
backend       fix(ui): use prompt bug when prompt has colon    2023-02-22 20:33:58 +11:00
configs       tweak initial model descriptions                 2023-02-05 23:23:09 -05:00
frontend      fix(ui): use prompt bug when prompt has colon    2023-02-22 20:33:58 +11:00
__init__.py   Various fixes                                    2023-01-30 18:42:17 -05:00
README        Various fixes                                    2023-01-30 18:42:17 -05:00

After version 2.3 is released, the ldm/invoke modules will be migrated to this location
so that we have a proper invokeai distribution. Currently this directory is used only for
data files.