blessedcoolant | 7aa918677e | Merge branch 'main' into feat/clip_skip | 2023-07-07 16:21:53 +12:00
Lincoln Stein | e9352227f3 | add merge api | 2023-07-06 15:12:34 -04:00
blessedcoolant | bc5371eeee | Merge branch 'main' into feat/clip_skip | 2023-07-07 06:03:39 +12:00
Lincoln Stein | e573a533ae | remove redundant import | 2023-07-06 13:24:58 -04:00
Lincoln Stein | 581be42c75 | Merge branch 'main' into lstein/model-manager-router-api | 2023-07-06 13:20:36 -04:00
Lincoln Stein | 90c66aab3d | merge with upstream | 2023-07-06 13:17:02 -04:00
Lincoln Stein | 3e925fbf34 | model merging API ready for testing | 2023-07-06 13:15:15 -04:00
Lincoln Stein | ec7c2f07c6 | model merge backend, CLI and TUI working | 2023-07-06 12:21:42 -04:00
blessedcoolant | b229fe19aa | Merge branch 'main' into lstein/configure-max-cache-size | 2023-07-07 01:52:12 +12:00
Sergey Borisov | 04b57c408f | Add clip skip option to prompt node | 2023-07-06 16:09:40 +03:00
blessedcoolant | 6f1268e2b1 | Merge branch 'main' into lstein/more-model-loading-fixes | 2023-07-07 00:32:22 +12:00
Lincoln Stein | 8f5fcb188c | Merge branch 'main' into lstein/model-manager-router-api | 2023-07-05 23:16:43 -04:00
Lincoln Stein | f7daa6e71d | all methods now return OPENAPI_MODEL_CONFIGS; convert uses PUT | 2023-07-05 23:13:01 -04:00
Lincoln Stein | a7cbcae176 | expose max_cache_size to invokeai-configure interface | 2023-07-05 20:59:57 -04:00
Lincoln Stein | 43c51ff157 | Merge branch 'main' into lstein/more-model-loading-fixes | 2023-07-05 20:48:15 -04:00
Lincoln Stein | cfa3b2419c | partial implementation of merge | 2023-07-05 20:25:47 -04:00
Lincoln Stein | d4550b3059 | clean up lint errors in lora.py | 2023-07-05 19:18:25 -04:00
Lincoln Stein | 71dad6d404 | Merge branch 'main' into ti-ui | 2023-07-05 16:57:31 -04:00
Lincoln Stein | 685a47cc7d | fix crash during lora application | 2023-07-05 16:40:47 -04:00
Lincoln Stein | 863336acbb | Recognize and load diffusers-style LoRAs (.bin) | 2023-07-05 16:19:16 -04:00
    Prevent double-reporting of autoimported models
    - closes #3636
    Allow autoimport of diffusers-style LoRA models
    - closes #3637
Lincoln Stein | 90ae8ce26a | prevent model install crash "torch needs to be restarted with spawn" | 2023-07-05 16:18:20 -04:00
Lincoln Stein | 5b6dd47b9f | add API for model convert | 2023-07-05 15:13:21 -04:00
Lincoln Stein | 5027d0a603 | accept @psychedelicious suggestions above | 2023-07-05 14:50:57 -04:00
Lincoln Stein | 9f9ce08e44 | Merge branch 'main' into lstein/remove-hardcoded-cuda-device | 2023-07-05 13:38:33 -04:00
blessedcoolant | 9e2d63ef97 | Merge branch 'main' into fix/ckpt_convert_scan | 2023-07-06 05:01:34 +12:00
Sergey Borisov | 0ac9dca926 | Fix loading diffusers ti | 2023-07-05 19:46:00 +03:00
Lincoln Stein | 6112197edf | convert implemented; need router | 2023-07-05 09:05:05 -04:00
Sergey Borisov | ee042ab76d | Fix ckpt scanning on conversion | 2023-07-05 14:18:30 +03:00
Sergey Borisov | 2beb8f049e | Fix model detection | 2023-07-05 09:43:46 +03:00
blessedcoolant | 639d88afd6 | revert: inference_mode to no_grad | 2023-07-05 16:39:15 +12:00
blessedcoolant | c0501ed5c2 | fix: Slow loading of Loras | 2023-07-05 12:47:34 +10:00
    Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
Lincoln Stein | 5d099f4a49 | update_model working | 2023-07-04 17:26:57 -04:00
Lincoln Stein | 752b4d50cf | model_delete method now working | 2023-07-04 10:40:32 -04:00
Lincoln Stein | c1c49d9a76 | import model returns 404 for invalid path, 409 for duplicate model | 2023-07-04 10:08:10 -04:00
Lincoln Stein | 96bf92ead4 | add the import model router | 2023-07-04 14:35:47 +10:00
Lincoln Stein | fc419546bc | Merge branch 'main' into lstein/remove-hardcoded-cuda-device | 2023-07-03 14:10:47 -04:00
Lincoln Stein | 10d513c5f7 | add runtime root path to relative vaes and other submodels | 2023-07-03 11:19:33 -04:00
Lincoln Stein | 2465c7987b | Revert "restore 3.9 compatibility by replacing | with Union[]" | 2023-07-03 10:56:41 -04:00
    This reverts commit 76bafeb99e.
Lincoln Stein | 76bafeb99e | restore 3.9 compatibility by replacing | with Union[] | 2023-07-03 10:55:04 -04:00
Lincoln Stein | 0f02915012 | remove hardcoded cuda device in model manager init | 2023-07-01 21:15:42 -04:00
blessedcoolant | c74bb5cdbf | Merge branch 'main' into lstein/fix-vae-convert | 2023-07-01 11:18:21 +12:00
Lincoln Stein | 1347fc2f00 | fix incorrect VAE config file path during conversion of ckpts | 2023-06-30 19:14:06 -04:00
Lincoln Stein | ace4f6d586 | fix duplicate model key addition when root directory is a relative path | 2023-06-28 17:02:03 -04:00
StAlKeR7779 | ac46b129bf | Merge branch 'main' into feat/lora_model_patch | 2023-06-28 22:43:58 +03:00
Lincoln Stein | 79fc708580 | warn but do not crash when model scan finds random cruft in models directory | 2023-06-28 15:26:42 -04:00
Lincoln Stein | e8ed0fad6c | autoimport from embedding/controlnet/lora folders designated in startup file | 2023-06-27 12:30:53 -04:00
Lincoln Stein | 823e098b7c | prompt user for prediction type when autoimporting a v2 model without .yaml file | 2023-06-26 16:30:34 -04:00
    don't ask user for prediction type of a config.yaml provided
Lincoln Stein | 011adfc958 | merge with main | 2023-06-26 13:53:59 -04:00
Lincoln Stein | befd95eb19 | rename root_dir to root_path attributes to emphasize return of a Path | 2023-06-26 13:52:25 -04:00
Lincoln Stein | a2ddb3823b | fix add_model() logic | 2023-06-26 13:33:38 -04:00