psychedelicious
a901a37433
feat(ui): improve no loaded loras UI
2023-07-06 22:26:54 +10:00
psychedelicious
e09c07a97d
fix(ui): fix board auto-add
2023-07-06 22:25:05 +10:00
psychedelicious
87feae959d
feat(ui): improve no loaded embeddings UI
2023-07-06 22:24:50 +10:00
psychedelicious
fbd6b25b4d
feat(ui): improve ux on TI autocomplete
...
- cursor reinserts at the end of the trigger
- `enter` closes the select
- popover styling
2023-07-06 14:56:37 +10:00
psychedelicious
2415dc1235
feat(ui): refactor embedding ui; now is autocomplete
2023-07-06 13:40:13 +10:00
Lincoln Stein
71dad6d404
Merge branch 'main' into ti-ui
2023-07-05 16:57:31 -04:00
Lincoln Stein
c21bd806f0
default LoRA weight to 0.75
2023-07-05 16:54:23 -04:00
Lincoln Stein
685a47cc7d
fix crash during lora application
2023-07-05 16:40:47 -04:00
Lincoln Stein
52498cc0b9
Put tokenizer and text encoder in same clip-vit-large-patch14 ( #3662 )
...
This PR fixes the migrate script so that it uses the same directory for
both the tokenizer and text encoder CLIP models. This will fix a crash
that occurred during checkpoint->diffusers conversions.
This PR also removes the check for an existing models directory in the
target root directory when `invokeai-migrate3` is run.
2023-07-05 16:29:33 -04:00
Lincoln Stein
cb947bcbf0
Merge branch 'main' into lstein/fix-migrate3-textencoder
2023-07-05 16:23:00 -04:00
Lincoln Stein
bbfb5bb1d4
Remove hardcoded cuda device in model manager init ( #3624 )
...
There was a line in model_manager.py in which the GPU device was
hardcoded to "cuda". This has now been removed.
2023-07-05 16:22:45 -04:00
Lincoln Stein
9f9ce08e44
Merge branch 'main' into lstein/remove-hardcoded-cuda-device
2023-07-05 13:38:33 -04:00
Lincoln Stein
17c5568661
build: remove web ui dist from gitignore ( #3650 )
...
The web UI should manage its own .gitignore.
I think this would explain why certain files were not making it into the
PyPI release.
2023-07-05 13:36:16 -04:00
Lincoln Stein
94740e440d
Merge branch 'main' into build/gitignore
2023-07-05 13:35:54 -04:00
Lincoln Stein
cf173b522b
allow clip-vit-large-patch14 text encoder to coexist with tokenizer in same directory
2023-07-05 13:14:41 -04:00
Mary Hipp Rogers
ea81ce9489
close modal when user clicks cancel ( #3656 )
...
* close modal when user clicks cancel
* close modal when delete image context cleared
---------
Co-authored-by: Mary Hipp <maryhipp@Marys-MacBook-Air.local>
2023-07-05 17:12:27 +00:00
blessedcoolant
8283b80b58
Fix ckpt scanning on conversion ( #3653 )
2023-07-06 05:09:13 +12:00
blessedcoolant
9e2d63ef97
Merge branch 'main' into fix/ckpt_convert_scan
2023-07-06 05:01:34 +12:00
blessedcoolant
dd946790ec
Fix loading diffusers ti ( #3661 )
2023-07-06 05:01:11 +12:00
Sergey Borisov
0ac9dca926
Fix loading diffusers ti
2023-07-05 19:46:00 +03:00
psychedelicious
acd3b1a512
build: remove web ui dist from gitignore
...
The web UI should manage its own .gitignore
2023-07-06 00:39:36 +10:00
blessedcoolant
e4d92da3a9
fix: Make space for icons in prompt box
2023-07-06 01:48:50 +12:00
blessedcoolant
9204b72383
feat: Make Embedding Picker a mini toggle
2023-07-06 01:45:00 +12:00
blessedcoolant
a556bf45bb
Merge branch 'main' into ti-ui
2023-07-05 23:42:48 +12:00
blessedcoolant
818616a0c5
fix(ui): fix prompt resize & style resizer ( #3652 )
2023-07-05 23:42:23 +12:00
blessedcoolant
8c9266359d
feat: Add Embedding Select To Linear UI
2023-07-05 23:41:15 +12:00
blessedcoolant
3b324a7d0a
Merge branch 'main' into fix/ui/fix-prompt-resize
2023-07-05 23:40:47 +12:00
blessedcoolant
c8cb43ff2d
Fix clip path in migrate script ( #3651 )
...
Update the path for the CLIP model to match the path used in ckpt
conversion and invokeai-configure.
2023-07-05 23:38:45 +12:00
Sergey Borisov
ee042ab76d
Fix ckpt scanning on conversion
2023-07-05 14:18:30 +03:00
psychedelicious
596c791844
fix(ui): fix prompt resize & style resizer
2023-07-05 21:02:31 +10:00
blessedcoolant
780e77d2ae
Merge branch 'main' into fix/clip_path
2023-07-05 22:45:52 +12:00
Sergey Borisov
e3fc1b3816
Fix clip path in migrate script
2023-07-05 13:43:09 +03:00
Lincoln Stein
9ad9e91a06
Detect invalid model names when migrating 2.3->3.0 ( #3623 )
...
A user discovered that 2.3 models whose symbolic names contain the "/"
character are not imported properly by the `migrate-models-3` script.
This fixes the issue by changing "/" to underscore at import time.
2023-07-05 06:35:54 -04:00
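The slash-to-underscore renaming described in the PR above can be sketched as follows; the function name is illustrative only and is not taken from the InvokeAI codebase:

```python
# Hypothetical sketch of the sanitization described in #3623: model
# names containing "/" are rewritten at import time so they cannot
# break model-name keys in the migrated 3.0 registry.
def sanitize_model_name(name: str) -> str:
    # Replace every "/" with "_" so the name is safe to use as a key.
    return name.replace("/", "_")

print(sanitize_model_name("stable-diffusion/v1-5"))  # stable-diffusion_v1-5
```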
Lincoln Stein
307a01d604
when migrating models, changes / to _ in model names to avoid breaking model name keys
2023-07-05 20:27:03 +10:00
blessedcoolant
0981a7d049
fix(ui): fix dnd on nodes ( #3649 )
...
I had broken this earlier today
2023-07-05 21:09:36 +12:00
psychedelicious
2a7dee17be
fix(ui): fix dnd on nodes
...
I had broken this earlier today
2023-07-05 19:06:40 +10:00
blessedcoolant
6c6d600cea
fix(ui): deleting image selects first image ( #3648 )
...
@mickr777
2023-07-05 21:00:01 +12:00
blessedcoolant
1c7166d2c6
Merge branch 'main' into fix/ui/delete-image-select
2023-07-05 20:57:34 +12:00
blessedcoolant
07d7959dc0
feat(ui): improve accordion ux ( #3647 )
...
- Accordions now may be opened or closed regardless of whether or not
their contents are enabled or active
- Accordions have a short text indicator alerting the user if their
contents are enabled, either a simple `Enabled` or, for accordions like
LoRA or ControlNet, `X Active` if any are active
https://github.com/invoke-ai/InvokeAI/assets/4822129/43db63bd-7ef3-43f2-8dad-59fc7200af2e
2023-07-05 20:57:23 +12:00
psychedelicious
9ebab013c1
fix(ui): deleting image selects first image
2023-07-05 18:21:46 +10:00
psychedelicious
e41e8606b5
feat(ui): improve accordion ux
...
- Accordions now may be opened or closed regardless of whether or not their contents are enabled or active
- Accordions have a short text indicator alerting the user if their contents are enabled, either a simple `Enabled` or, for accordions like LoRA or ControlNet, `X Active` if any are active
2023-07-05 17:33:03 +10:00
blessedcoolant
6ce867feb4
Fix model detection ( #3646 )
2023-07-05 19:00:31 +12:00
blessedcoolant
bc8cfc2baa
Merge branch 'main' into fix/model_detect
2023-07-05 18:52:11 +12:00
Eugene Brodsky
7170e82f73
expose max_cache_size in config
2023-07-05 02:44:15 -04:00
Sergey Borisov
2beb8f049e
Fix model detection
2023-07-05 09:43:46 +03:00
blessedcoolant
66c10cc2f7
fix: Change Lora weight bounds to -1 to 2 ( #3645 )
2023-07-05 18:23:06 +12:00
blessedcoolant
1fb317243d
fix: Change Lora weight bounds to -1 to 2
2023-07-05 18:12:45 +12:00
blessedcoolant
71310a180d
feat: Add Lora to Canvas ( #3643 )
...
- Add Loras to Canvas
- Revert inference_mode to no_grad because inference tensors fail with
latent-to-latent.
2023-07-05 17:15:28 +12:00
blessedcoolant
1a29a3fe39
feat: Add Lora to Canvas
2023-07-05 16:39:28 +12:00
blessedcoolant
639d88afd6
revert: inference_mode to no_grad
2023-07-05 16:39:15 +12:00