blessedcoolant
808b2de709
Merge branch 'main' into lstein/model-manager-route-enhancements
2023-07-15 16:56:54 +12:00
Lincoln Stein
2faa7cee37
add rename_model route
2023-07-14 23:03:18 -04:00
Brandon
467414f214
Merge branch 'main' into update-textual-inversion-training
2023-07-14 22:32:09 -04:00
Brandon Rising
f88a338be0
Setup textual inversion training with new model manager
2023-07-14 21:58:51 -04:00
Lincoln Stein
b306247eb5
remove clipseg model install
2023-07-14 20:39:42 -04:00
Lincoln Stein
a45f7ce355
add --list-models command
2023-07-14 19:52:47 -04:00
Lincoln Stein
eb9d74653d
set default models for realesrgan, controlnet and textual inversion
2023-07-14 19:03:41 -04:00
Sergey Borisov
7093e5d033
Pad conditionings using zeros and encoder_attention_mask
2023-07-15 00:52:54 +03:00
Lincoln Stein
e71ce83e9c
Merge branch 'main' into lstein/model-manager-route-enhancements
2023-07-14 13:52:55 -04:00
Lincoln Stein
8600aad12b
multiple enhancements to model manager REST API
...
1. add a /sync route for synchronizing the in-memory model lists to
models.yaml, the models directory, and the autoimport directories.
2. add optional destination_directories to convert_model and merge_model
operations.
3. add /ckpt_confs route for retrieving known legacy checkpoint configuration
files.
4. add /search route for finding all models in a directory located in the server
filesystem
2023-07-14 13:45:16 -04:00
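For orientation, a minimal client sketch of the routes described in the entry above (not code from the repository): only the route names `/sync`, `/ckpt_confs`, and `/search` come from the commit message; the host/port, the `/api/v1/models` prefix, the HTTP methods, and the `search_path` parameter name are assumptions made for illustration.

```python
import requests

BASE = "http://localhost:9090/api/v1/models"  # assumed mount point

# Re-synchronize the in-memory model lists with models.yaml and the model directories.
requests.post(f"{BASE}/sync")

# Retrieve the known legacy checkpoint configuration files.
requests.get(f"{BASE}/ckpt_confs")

# Search a server-side directory for models ("search_path" is a hypothetical parameter name).
requests.get(f"{BASE}/search", params={"search_path": "/srv/models"})
```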
Lincoln Stein
ad076b1174
add model directory search route
2023-07-14 11:14:33 -04:00
blessedcoolant
16e93c6455
Merge branch 'main' into mm-ui
2023-07-14 15:46:53 +12:00
blessedcoolant
9348dc8e0d
Merge branch 'main' into fix/controlnet_cfg_inj_cond
2023-07-14 01:11:04 +12:00
blessedcoolant
5a546e66f1
Merge branch 'main' into fix-inpainting
2023-07-13 20:42:13 +12:00
psychedelicious
eb0d55263b
fix(mm): make model config attribute names consistent
...
Our model fields use `model_name`, but the API response used `name`; likewise, some places use `model_type`, but the API response used `type`.
Changed the API response to provide `model_name` and `model_type`, which substantially simplifies how we manage models on the client.
2023-07-13 15:40:05 +10:00
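A hedged before/after sketch of the renamed response fields; the values shown are invented for illustration.

```python
# Illustrative shapes only; the example values are made up.
old_response = {"name": "analog-diffusion-1.0", "type": "main"}
new_response = {"model_name": "analog-diffusion-1.0", "model_type": "main"}
```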
blessedcoolant
7e3b9f1320
fix: Inpaint not working with some schedulers
...
Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
2023-07-13 15:06:03 +12:00
blessedcoolant
71e34ac256
Merge branch 'main' into mm-ui
2023-07-13 12:48:43 +12:00
Sergey Borisov
67c8cf4bc2
Controlnet model detection
2023-07-12 08:50:19 -04:00
Sergey Borisov
a328986b43
Less naive model detection
2023-07-12 08:50:19 -04:00
blessedcoolant
5a6ad99d4e
feat: Restore Delete Model Functionality
2023-07-12 16:39:07 +12:00
StAlKeR7779
b8a9b499df
Merge branch 'main' into fix/controlnet_cfg_inj_cond
2023-07-11 23:43:47 +03:00
Lincoln Stein
25591788c1
fix conflicts
2023-07-11 15:55:10 -04:00
Lincoln Stein
dab03fb646
rename gpu_mem_reserved to max_vram_cache_size
...
To be consistent with max_cache_size, the amount of memory to hold in
VRAM for model caching is now controlled by the max_vram_cache_size
configuration parameter.
2023-07-11 15:25:39 -04:00
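As a rough illustration of the two cache limits (the parameter names come from the commit messages above; the values are invented):

```python
# Illustrative only; actual defaults may differ.
example_cache_settings = {
    "max_cache_size": 6.0,        # GB of RAM devoted to cached models
    "max_vram_cache_size": 2.75,  # GB of VRAM held for model caching (formerly gpu_mem_reserved)
}
```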
Lincoln Stein
d32f9f7cb0
reverse logic of gpu_mem_reserved
...
- gpu_mem_reserved now indicates the amount of VRAM that will be reserved
for model caching (similar to max_cache_size).
2023-07-11 15:16:40 -04:00
Lincoln Stein
6b93c1451f
do not crash when probing an unknown model type
2023-07-11 10:56:47 -04:00
Lincoln Stein
bf2b5b5cd4
improvements to sdxl support in model manager
...
- Move SDXL-related models to models/sdxl.py
- Create separate base type BaseModelType.StableDiffusionXLRefiner for the refiner
models.
2023-07-09 20:42:03 -04:00
Lincoln Stein
5759a390f9
introduce gpu_mem_reserved configuration parameter
2023-07-09 18:35:04 -04:00
Lincoln Stein
130249a2dd
add model loading support for SDXL
2023-07-09 15:47:06 -04:00
Lincoln Stein
8d7dba937d
fix undefined variable
2023-07-09 14:37:45 -04:00
Lincoln Stein
d6cb0e54b3
don't unload models from GPU until the space is needed
2023-07-09 14:26:30 -04:00
Lincoln Stein
5f7435955e
if models.yaml doesn't exist, rebuild it
2023-07-08 15:13:51 -04:00
Lincoln Stein
69ef1e1e56
speculative change to upgrade script
2023-07-08 11:45:26 -04:00
Lincoln Stein
10d3bccf32
Mac MPS FP16 fixes (#3641)
...
This PR is to allow FP16 precision to work on Macs with MPS. In
addition, it centralizes the torch fixes/workarounds required for MPS
into a new backend utility `mps_fixes.py`. This is conditionally
imported in `api_app.py`/`cli_app.py`.
Many MANY thanks to @StAlKeR7779 for patiently working to debug and fix
these issues.
2023-07-07 17:43:23 -04:00
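A sketch of what the conditional import might look like; only the file name `mps_fixes.py` comes from the PR description, while the module path and the guard shown here are assumptions.

```python
import torch

# Hypothetical guard; the real import location and condition may differ.
if torch.backends.mps.is_available():
    from invokeai.backend.util import mps_fixes  # noqa: F401  (imported for its torch patches)
```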
Eugene Brodsky
97b2ec58e2
Merge branch 'main' into release/invokeai-3-0-alpha
2023-07-07 14:18:12 -04:00
Lincoln Stein
56f4712814
fix checkpoint VAE handling in migrate script
2023-07-07 09:34:42 -04:00
Lincoln Stein
9f58ed35cf
improve user migration experience
...
- No longer fail root directory probing if invokeai.yaml is missing
(test is now whether a `models/core` directory exists).
- Migrate script does not overwrite previously-installed models.
- Can run migrate script on an existing 2.3 version directory
with --from and --to pointing to the same 2.3 root.
2023-07-07 08:18:46 -04:00
blessedcoolant
7aa918677e
Merge branch 'main' into feat/clip_skip
2023-07-07 16:21:53 +12:00
Lincoln Stein
54f3686e3b
merge with main, fix conflicts
2023-07-06 15:21:45 -04:00
Lincoln Stein
e9352227f3
add merge api
2023-07-06 15:12:34 -04:00
blessedcoolant
bc5371eeee
Merge branch 'main' into feat/clip_skip
2023-07-07 06:03:39 +12:00
Lincoln Stein
e573a533ae
remove redundant import
2023-07-06 13:24:58 -04:00
Lincoln Stein
581be42c75
Merge branch 'main' into lstein/model-manager-router-api
2023-07-06 13:20:36 -04:00
Lincoln Stein
90c66aab3d
merge with upstream
2023-07-06 13:17:02 -04:00
Lincoln Stein
3e925fbf34
model merging API ready for testing
2023-07-06 13:15:15 -04:00
Lincoln Stein
ec7c2f07c6
model merge backend, CLI and TUI working
2023-07-06 12:21:42 -04:00
blessedcoolant
b229fe19aa
Merge branch 'main' into lstein/configure-max-cache-size
2023-07-07 01:52:12 +12:00
Sergey Borisov
04b57c408f
Add clip skip option to prompt node
2023-07-06 16:09:40 +03:00
blessedcoolant
6f1268e2b1
Merge branch 'main' into lstein/more-model-loading-fixes
2023-07-07 00:32:22 +12:00
Lincoln Stein
8f5fcb188c
Merge branch 'main' into lstein/model-manager-router-api
2023-07-05 23:16:43 -04:00
Lincoln Stein
f7daa6e71d
all methods now return OPENAPI_MODEL_CONFIGS; convert uses PUT
2023-07-05 23:13:01 -04:00
Lincoln Stein
3691b55565
fix autoimport crash
2023-07-05 21:53:08 -04:00
Lincoln Stein
f610045a14
Merge branch 'main' into mps-fp16-fixes
2023-07-05 21:01:48 -04:00
Lincoln Stein
a7cbcae176
expose max_cache_size to invokeai-configure interface
2023-07-05 20:59:57 -04:00
Lincoln Stein
0a6dccd607
expose max_cache_size to invokeai-configure interface
2023-07-05 20:59:14 -04:00
Lincoln Stein
43c51ff157
Merge branch 'main' into lstein/more-model-loading-fixes
2023-07-05 20:48:15 -04:00
Lincoln Stein
cfa3b2419c
partial implementation of merge
2023-07-05 20:25:47 -04:00
Lincoln Stein
d4550b3059
clean up lint errors in lora.py
2023-07-05 19:18:25 -04:00
Lincoln Stein
83d3a043da
merge latest changes from main
2023-07-05 19:15:53 -04:00
gogurtenjoyer
169ff6368b
Update mps_fixes.py - additional torch op for nodes
...
This fixes scaling in the nodes UI.
2023-07-05 17:47:23 -04:00
Lincoln Stein
71dad6d404
Merge branch 'main' into ti-ui
2023-07-05 16:57:31 -04:00
Lincoln Stein
685a47cc7d
fix crash during lora application
2023-07-05 16:40:47 -04:00
Lincoln Stein
cb947bcbf0
Merge branch 'main' into lstein/fix-migrate3-textencoder
2023-07-05 16:23:00 -04:00
Lincoln Stein
f8bbec8572
Recognize and load diffusers-style LoRAs (.bin)
...
Prevent double-reporting of autoimported models
- closes #3636
Allow autoimport of diffusers-style LoRA models
- closes #3637
2023-07-05 16:21:23 -04:00
Lincoln Stein
863336acbb
Recognize and load diffusers-style LoRAs (.bin)
...
Prevent double-reporting of autoimported models
- closes #3636
Allow autoimport of diffusers-style LoRA models
- closes #3637
2023-07-05 16:19:16 -04:00
Lincoln Stein
90ae8ce26a
prevent model install crash "torch needs to be restarted with spawn"
2023-07-05 16:18:20 -04:00
Lincoln Stein
ad5d90aca8
prevent model install crash "torch needs to be restarted with spawn"
2023-07-05 15:38:07 -04:00
Lincoln Stein
5b6dd47b9f
add API for model convert
2023-07-05 15:13:21 -04:00
Lincoln Stein
5027d0a603
accept @psychedelicious suggestions above
2023-07-05 14:50:57 -04:00
Lincoln Stein
9f9ce08e44
Merge branch 'main' into lstein/remove-hardcoded-cuda-device
2023-07-05 13:38:33 -04:00
Lincoln Stein
021e1eca8e
Merge branch 'main' into mps-fp16-fixes
2023-07-05 13:19:52 -04:00
Lincoln Stein
5fe722900d
allow clip-vit-large-patch14 text encoder to coexist with tokenizer in same directory
2023-07-05 13:15:08 -04:00
Lincoln Stein
cf173b522b
allow clip-vit-large-patch14 text encoder to coexist with tokenizer in same directory
2023-07-05 13:14:41 -04:00
blessedcoolant
9e2d63ef97
Merge branch 'main' into fix/ckpt_convert_scan
2023-07-06 05:01:34 +12:00
Sergey Borisov
0ac9dca926
Fix loading diffusers ti
2023-07-05 19:46:00 +03:00
Lincoln Stein
bd82c4ace0
model installer confirms deletion of models
2023-07-05 09:57:23 -04:00
Lincoln Stein
9edf78dd2e
merge with main
2023-07-05 09:12:54 -04:00
Lincoln Stein
6112197edf
convert implemented; need router
2023-07-05 09:05:05 -04:00
gogurtenjoyer
ba7345deb4
Merge branch 'main' into mps-fp16-fixes
2023-07-05 07:38:41 -04:00
Sergey Borisov
ee042ab76d
Fix ckpt scanning on conversion
2023-07-05 14:18:30 +03:00
blessedcoolant
780e77d2ae
Merge branch 'main' into fix/clip_path
2023-07-05 22:45:52 +12:00
Sergey Borisov
e3fc1b3816
Fix clip path in migrate script
2023-07-05 13:43:09 +03:00
Lincoln Stein
307a01d604
when migrating models, change / to _ in model names to avoid breaking model name keys
2023-07-05 20:27:03 +10:00
Sergey Borisov
2beb8f049e
Fix model detection
2023-07-05 09:43:46 +03:00
blessedcoolant
639d88afd6
revert: inference_mode to no_grad
2023-07-05 16:39:15 +12:00
blessedcoolant
c0501ed5c2
fix: Slow loading of Loras
...
Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
2023-07-05 12:47:34 +10:00
gogurtenjoyer
233869b56a
Mac MPS FP16 fixes
...
This PR is to allow FP16 precision to work on Macs with MPS. In addition, it centralizes the torch fixes/workarounds
required for MPS into a new backend utility file `mps_fixes.py`. This is conditionally imported in `api_app.py`/`cli_app.py`.
Many MANY thanks to StAlKeR7779 for patiently working to debug and fix these issues.
2023-07-04 18:10:53 -04:00
Lincoln Stein
5d099f4a49
update_model working
2023-07-04 17:26:57 -04:00
Lincoln Stein
752b4d50cf
model_delete method now working
2023-07-04 10:40:32 -04:00
Lincoln Stein
c1c49d9a76
import model returns 404 for invalid path, 409 for duplicate model
2023-07-04 10:08:10 -04:00
Lincoln Stein
96bf92ead4
add the import model router
2023-07-04 14:35:47 +10:00
Lincoln Stein
fc419546bc
Merge branch 'main' into lstein/remove-hardcoded-cuda-device
2023-07-03 14:10:47 -04:00
Lincoln Stein
cfd09214d3
Merge branch 'main' into lstein/fix-vae-conversion-crash
2023-07-03 14:03:13 -04:00
Lincoln Stein
b128ba81db
Merge branch 'main' into lstein/remove-hardcoded-cuda-device
2023-07-03 13:58:14 -04:00
Lincoln Stein
d6de11bd56
resolve merge conflict
2023-07-03 12:19:11 -04:00
Lincoln Stein
ed86d0b708
Union[foo, None]=>Optional[foo]
2023-07-03 12:17:45 -04:00
Lincoln Stein
fb2b2a371d
Merge branch 'lstein/fix-vae-conversion-crash' into release/invokeai-3-0-alpha
2023-07-03 11:21:16 -04:00
Lincoln Stein
10d513c5f7
add runtime root path to relative vaes and other submodels
2023-07-03 11:19:33 -04:00
Lincoln Stein
877b187a1b
Merge branch 'lstein/restore-3.9-compatibility' into release/invokeai-3-0-alpha
2023-07-03 11:01:34 -04:00
Lincoln Stein
ac9ec4e75a
restore 3.9 compatibility by replacing | with Union[]
2023-07-03 10:57:40 -04:00
Lincoln Stein
2465c7987b
Revert "restore 3.9 compatibility by replacing | with Union[]"
...
This reverts commit 76bafeb99e.
2023-07-03 10:56:41 -04:00
Lincoln Stein
76bafeb99e
restore 3.9 compatibility by replacing | with Union[]
2023-07-03 10:55:04 -04:00
Lincoln Stein
6935858ef3
add debugging messages to aid in memory leak tracking
2023-07-02 13:34:53 -04:00
Lincoln Stein
fa1f9939cc
adjust invokeai-configure TUI vertical height to show NEXT button on Mac
2023-07-02 09:44:16 -04:00
Lincoln Stein
2d314d2b3d
another fix to repo_id loading
2023-07-02 09:18:11 -04:00
Lincoln Stein
b2775d6b4c
Merge branch 'lstein/recognize-legacy-sampler-names' into release/invokeai-3-0-alpha
2023-07-01 21:45:39 -04:00
Lincoln Stein
06694d465d
add missing k-* legacy sampler names to init file migrate list
2023-07-01 21:45:14 -04:00
Lincoln Stein
3c2ce51f10
Merge branch 'lstein/remove-hardcoded-cuda-device' into release/invokeai-3-0-alpha
2023-07-01 21:15:58 -04:00
Lincoln Stein
0f02915012
remove hardcoded cuda device in model manager init
2023-07-01 21:15:42 -04:00
Lincoln Stein
0016236889
Merge branch 'lstein/fix-imported-model-names' into release/invokeai-3-0-alpha
2023-07-01 21:09:29 -04:00
Lincoln Stein
f4bd5bb986
when migrating models, change / to _ in model names to avoid breaking model name keys
2023-07-01 21:08:59 -04:00
Lincoln Stein
5de820f2dc
fix updater and model installer
2023-07-01 20:13:28 -04:00
Lincoln Stein
41a8f155ed
Merge branch 'main' into fix/controlnet_cfg_inj_cond
2023-07-01 14:36:09 -04:00
Lincoln Stein
f1928d2588
prevent crashes on malformed models
2023-07-01 14:32:58 -04:00
blessedcoolant
c74bb5cdbf
Merge branch 'main' into lstein/fix-vae-convert
2023-07-01 11:18:21 +12:00
Lincoln Stein
1347fc2f00
fix incorrect VAE config file path during conversion of ckpts
2023-06-30 19:14:06 -04:00
blessedcoolant
5be1e71d1b
Merge branch 'main' into lstein/fix-model-scan-on-rel-root
2023-06-29 17:54:12 +12:00
mickr777
30a917f70c
Fix Typo in migrate_to_3.py
2023-06-29 14:45:55 +10:00
Lincoln Stein
ace4f6d586
fix duplicate model key addition when root directory is a relative path
2023-06-28 17:02:03 -04:00
Lincoln Stein
20fbe81395
Merge branch 'main' into fix/controlnet_cfg_inj_cond
2023-06-28 15:44:50 -04:00
StAlKeR7779
ac46b129bf
Merge branch 'main' into feat/lora_model_patch
2023-06-28 22:43:58 +03:00
Lincoln Stein
79fc708580
warn but do not crash when model scan finds random cruft in the models directory
2023-06-28 15:26:42 -04:00
Lincoln Stein
e8ed0fad6c
autoimport from embedding/controlnet/lora folders designated in startup file
2023-06-27 12:30:53 -04:00
Sergey Borisov
dc1f220b3e
Fix wrong conditioning used
2023-06-27 01:18:15 +03:00
Lincoln Stein
044fe6bb20
remove dangling debug statement
2023-06-26 17:48:06 -04:00
Lincoln Stein
823e098b7c
prompt user for prediction type when autoimporting a v2 model without a .yaml file
...
don't ask the user for the prediction type if a config.yaml is provided
2023-06-26 16:30:34 -04:00
Lincoln Stein
011adfc958
merge with main
2023-06-26 13:53:59 -04:00
Lincoln Stein
befd95eb19
rename root_dir to root_path attributes to emphasize return of a Path
2023-06-26 13:52:25 -04:00
Lincoln Stein
a2ddb3823b
fix add_model() logic
2023-06-26 13:33:38 -04:00
Eugene Brodsky
7b97639961
Merge branch 'main' into lstein/installer-for-new-model-layout
2023-06-26 01:24:30 -04:00
Sergey Borisov
91c3a58fb6
Fix lycoris layers init
2023-06-26 04:33:37 +03:00
Sergey Borisov
5cebf67ee4
Apply lora by patching lora instead of hooks
2023-06-26 03:57:33 +03:00
Sergey Borisov
1ba94a92b3
Fixes
2023-06-26 03:54:42 +03:00
Sergey Borisov
23c22ac933
Refactor logic/small fixes
2023-06-26 03:07:54 +03:00
Lincoln Stein
160b5d7992
add support for an autoimport models directory scanned at startup time
2023-06-25 18:50:15 -04:00
Lincoln Stein
c91d1eacba
Merge branch 'lstein/installer-for-new-model-layout' of github.com:invoke-ai/InvokeAI into lstein/installer-for-new-model-layout
2023-06-25 16:04:48 -04:00
Lincoln Stein
60b37b7ff4
fix model manager documentation
2023-06-25 16:04:43 -04:00
Sergey Borisov
a3c22b5fe6
Remove upcast_attention and prediction_type from stable diffusion model logic, fix ckpt conversion accordingly
2023-06-25 21:06:22 +03:00
user1
c5faffc18b
Merge branch 'main' of github.com:invoke-ai/InvokeAI into feat/controlnet-control-modes
...
Only "real" conflicts were in:
invokeai/frontend/web/src/features/controlNet/components/ControlNet.tsx
invokeai/frontend/web/src/features/controlNet/store/controlNetSlice.ts
2023-06-24 17:05:57 -07:00
Lincoln Stein
c3c4a71173
implemented Stalker's suggested improvements
2023-06-24 12:37:26 -04:00
Lincoln Stein
ba1371a88f
rename ModelType.Pipeline to ModelType.Main
2023-06-24 11:45:49 -04:00
Lincoln Stein
539d1f3bde
remove redundant prediction_type and attention_upscaling flags
2023-06-23 16:54:52 -04:00
Lincoln Stein
466ec3ab5e
add router API support for model manager heuristic_import()
2023-06-23 16:35:39 -04:00
Lincoln Stein
54b74427f4
adjust for change in list_models() API
2023-06-23 14:13:37 -04:00
Lincoln Stein
58d1857ab6
merge with main
2023-06-23 13:57:25 -04:00
Lincoln Stein
3043af4620
implement vae passthru
2023-06-23 13:56:30 -04:00
Lincoln Stein
56bd873d7a
make relative model paths work in model manager
2023-06-23 10:52:59 -04:00
Sergey Borisov
5aaaaf64a1
Fix ckpt conversion
2023-06-23 17:29:54 +03:00
StAlKeR7779
9140e2c0f2
Merge branch 'main' into fix/vae_conversion
2023-06-23 15:03:59 +03:00
Lincoln Stein
a910403003
correctly migrate models that have relative paths
2023-06-22 21:10:31 -04:00
Lincoln Stein
c7b7e087e4
Merge branch 'main' into lstein/installer-for-new-model-layout
2023-06-23 01:45:05 +01:00