Lincoln Stein
90c66aab3d
merge with upstream
2023-07-06 13:17:02 -04:00
Lincoln Stein
3e925fbf34
model merging API ready for testing
2023-07-06 13:15:15 -04:00
Lincoln Stein
ec7c2f07c6
model merge backend, CLI and TUI working
2023-07-06 12:21:42 -04:00
psychedelicious
c21245f590
fix(api): make list models params queries, make path /, remove defaults
...
The list models route should just be the base route path, and should use query parameters as opposed to path parameters (which cannot be optional)
Removed defaults for update model route - for the purposes of the API, we should always be explicit with this
2023-07-06 15:34:50 +10:00
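The route fix above argues that optional filters belong in query parameters, since path parameters cannot be omitted. A minimal plain-Python sketch of that filtering behavior (the names `MODELS` and `list_models` are hypothetical stand-ins, not the actual FastAPI route):

```python
from typing import Optional

# Hypothetical in-memory catalog standing in for the model manager.
MODELS = [
    {"name": "sd-1.5", "base_model": "sd-1", "model_type": "main"},
    {"name": "my-lora", "base_model": "sd-1", "model_type": "lora"},
]

def list_models(base_model: Optional[str] = None,
                model_type: Optional[str] = None) -> list:
    # Behaves like GET /?base_model=...&model_type=... would:
    # any filter left as None is simply not applied, which is why
    # query parameters suit optional filters better than path parameters.
    return [
        m for m in MODELS
        if (base_model is None or m["base_model"] == base_model)
        and (model_type is None or m["model_type"] == model_type)
    ]
```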
Lincoln Stein
8f5fcb188c
Merge branch 'main' into lstein/model-manager-router-api
2023-07-05 23:16:43 -04:00
Lincoln Stein
f7daa6e71d
all methods now return OPENAPI_MODEL_CONFIGS; convert uses PUT
2023-07-05 23:13:01 -04:00
Lincoln Stein
3691b55565
fix autoimport crash
2023-07-05 21:53:08 -04:00
Lincoln Stein
cfa3b2419c
partial implementation of merge
2023-07-05 20:25:47 -04:00
Lincoln Stein
d4550b3059
clean up lint errors in lora.py
2023-07-05 19:18:25 -04:00
Lincoln Stein
52498cc0b9
Put tokenizer and text encoder in same clip-vit-large-patch14 ( #3662 )
...
This PR fixes the migrate script so that it uses the same directory for
both the tokenizer and text encoder CLIP models. This will fix a crash
that occurred during checkpoint->diffusers conversions.
This PR also removes the check for an existing models directory in the
target root directory when `invokeai-migrate3` is run.
2023-07-05 16:29:33 -04:00
Lincoln Stein
cb947bcbf0
Merge branch 'main' into lstein/fix-migrate3-textencoder
2023-07-05 16:23:00 -04:00
Lincoln Stein
bbfb5bb1d4
Remove hardcoded cuda device in model manager init ( #3624 )
...
There was a line in model_manager.py in which the GPU device was
hardcoded to "cuda". This has now been removed.
2023-07-05 16:22:45 -04:00
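The fix above removes a literal "cuda" string from model manager init. A sketch of picking the device at runtime instead (the helper name and signature are hypothetical, not the actual model_manager.py code):

```python
def choose_device(cuda_available: bool, mps_available: bool = False) -> str:
    # Prefer CUDA, then Apple MPS, then fall back to CPU,
    # rather than hardcoding "cuda" at init time.
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"
```

In practice the availability flags would come from the framework (e.g. a CUDA availability check) rather than being passed in.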
Lincoln Stein
5b6dd47b9f
add API for model convert
2023-07-05 15:13:21 -04:00
Lincoln Stein
5027d0a603
accept @psychedelicious suggestions above
2023-07-05 14:50:57 -04:00
Lincoln Stein
9f9ce08e44
Merge branch 'main' into lstein/remove-hardcoded-cuda-device
2023-07-05 13:38:33 -04:00
Lincoln Stein
17c5568661
build: remove web ui dist from gitignore ( #3650 )
...
The web UI should manage its own .gitignore. I think this would explain
why certain files were not making it into the PyPI release.
2023-07-05 13:36:16 -04:00
Lincoln Stein
94740e440d
Merge branch 'main' into build/gitignore
2023-07-05 13:35:54 -04:00
Lincoln Stein
cf173b522b
allow clip-vit-large-patch14 text encoder to coexist with tokenizer in same directory
2023-07-05 13:14:41 -04:00
Mary Hipp Rogers
ea81ce9489
close modal when user clicks cancel ( #3656 )
...
* close modal when user clicks cancel
* close modal when delete image context cleared
---------
Co-authored-by: Mary Hipp <maryhipp@Marys-MacBook-Air.local>
2023-07-05 17:12:27 +00:00
blessedcoolant
8283b80b58
Fix ckpt scanning on conversion ( #3653 )
2023-07-06 05:09:13 +12:00
blessedcoolant
9e2d63ef97
Merge branch 'main' into fix/ckpt_convert_scan
2023-07-06 05:01:34 +12:00
blessedcoolant
dd946790ec
Fix loading diffusers ti ( #3661 )
2023-07-06 05:01:11 +12:00
Sergey Borisov
0ac9dca926
Fix loading diffusers ti
2023-07-05 19:46:00 +03:00
psychedelicious
acd3b1a512
build: remove web ui dist from gitignore
...
The web UI should manage its own .gitignore
2023-07-06 00:39:36 +10:00
Lincoln Stein
6112197edf
convert implemented; need router
2023-07-05 09:05:05 -04:00
blessedcoolant
818616a0c5
fix(ui): fix prompt resize & style resizer ( #3652 )
2023-07-05 23:42:23 +12:00
blessedcoolant
3b324a7d0a
Merge branch 'main' into fix/ui/fix-prompt-resize
2023-07-05 23:40:47 +12:00
blessedcoolant
c8cb43ff2d
Fix clip path in migrate script ( #3651 )
...
Update path for clip model according to the path used in ckpt conversion and
invokeai-configure.
2023-07-05 23:38:45 +12:00
Sergey Borisov
ee042ab76d
Fix ckpt scanning on conversion
2023-07-05 14:18:30 +03:00
psychedelicious
596c791844
fix(ui): fix prompt resize & style resizer
2023-07-05 21:02:31 +10:00
blessedcoolant
780e77d2ae
Merge branch 'main' into fix/clip_path
2023-07-05 22:45:52 +12:00
Sergey Borisov
e3fc1b3816
Fix clip path in migrate script
2023-07-05 13:43:09 +03:00
Lincoln Stein
9ad9e91a06
Detect invalid model names when migrating 2.3->3.0 ( #3623 )
...
A user discovered that 2.3 models whose symbolic names contain the "/"
character are not imported properly by the `migrate-models-3` script.
This fixes the issue by changing "/" to underscore at import time.
2023-07-05 06:35:54 -04:00
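The migration fix above amounts to a one-line rename applied at import time. A sketch (the function name is hypothetical, not the actual migrate-models-3 code):

```python
def sanitize_model_name(name: str) -> str:
    # "/" in a 2.3 symbolic model name breaks model-name keys in 3.0,
    # so it is replaced with "_" when the model is imported.
    return name.replace("/", "_")
```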
Lincoln Stein
307a01d604
when migrating models, changes / to _ in model names to avoid breaking model name keys
2023-07-05 20:27:03 +10:00
psychedelicious
56d4ea3252
fix(api): improve mm routes
2023-07-05 20:08:47 +10:00
psychedelicious
5d4d0e795c
fix(mm): fix up mm service types
2023-07-05 20:07:10 +10:00
blessedcoolant
0981a7d049
fix(ui): fix dnd on nodes ( #3649 )
...
I had broken this earlier today
2023-07-05 21:09:36 +12:00
psychedelicious
2a7dee17be
fix(ui): fix dnd on nodes
...
I had broken this earlier today
2023-07-05 19:06:40 +10:00
blessedcoolant
6c6d600cea
fix(ui): deleting image selects first image ( #3648 )
...
@mickr777
2023-07-05 21:00:01 +12:00
blessedcoolant
1c7166d2c6
Merge branch 'main' into fix/ui/delete-image-select
2023-07-05 20:57:34 +12:00
blessedcoolant
07d7959dc0
feat(ui): improve accordion ux ( #3647 )
...
- Accordions now may be opened or closed regardless of whether or not
their contents are enabled or active
- Accordions have a short text indicator alerting the user if their
contents are enabled, either a simple `Enabled` or, for accordions like
LoRA or ControlNet, `X Active` if any are active
https://github.com/invoke-ai/InvokeAI/assets/4822129/43db63bd-7ef3-43f2-8dad-59fc7200af2e
2023-07-05 20:57:23 +12:00
psychedelicious
9ebab013c1
fix(ui): deleting image selects first image
2023-07-05 18:21:46 +10:00
psychedelicious
e41e8606b5
feat(ui): improve accordion ux
...
- Accordions now may be opened or closed regardless of whether or not their contents are enabled or active
- Accordions have a short text indicator alerting the user if their contents are enabled, either a simple `Enabled` or, for accordions like LoRA or ControlNet, `X Active` if any are active
2023-07-05 17:33:03 +10:00
blessedcoolant
6ce867feb4
Fix model detection ( #3646 )
2023-07-05 19:00:31 +12:00
blessedcoolant
bc8cfc2baa
Merge branch 'main' into fix/model_detect
2023-07-05 18:52:11 +12:00
Eugene Brodsky
7170e82f73
expose max_cache_size in config
2023-07-05 02:44:15 -04:00
Sergey Borisov
2beb8f049e
Fix model detection
2023-07-05 09:43:46 +03:00
blessedcoolant
66c10cc2f7
fix: Change Lora weight bounds to -1 to 2 ( #3645 )
2023-07-05 18:23:06 +12:00
blessedcoolant
1fb317243d
fix: Change Lora weight bounds to -1 to 2
2023-07-05 18:12:45 +12:00
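The bounds change above widens the accepted LoRA weight range to [-1, 2]. A sketch of clamping a value into that range (constant and function names are hypothetical, not the actual UI code):

```python
# Bounds from the fix; names are illustrative only.
LORA_WEIGHT_MIN = -1.0
LORA_WEIGHT_MAX = 2.0

def clamp_lora_weight(weight: float) -> float:
    # Keep user-supplied weights inside the slider's new range.
    return max(LORA_WEIGHT_MIN, min(LORA_WEIGHT_MAX, weight))
```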
blessedcoolant
71310a180d
feat: Add Lora to Canvas ( #3643 )
...
- Add Loras to Canvas
- Revert inference_mode to no_grad because inference tensors fail with
latent-to-latent.
2023-07-05 17:15:28 +12:00