Alexandre Macabies
0beec08d38
Add missing import.
2023-07-23 16:40:05 +02:00
Lincoln Stein
f2a6f0cf21
SDXL & SDXL-refiner models convert correctly
2023-07-23 09:31:14 -04:00
Alexandre Macabies
07a90c0198
Fix incorrect use of a singleton list.
...
This was found through pylance type errors. Go types!
2023-07-23 15:28:05 +02:00
Lincoln Stein
5e59edfaf1
SDXL checkpoint models now convert and load; needs refactor
2023-07-23 00:00:31 -04:00
Lincoln Stein
b1d7c9b306
save text_encoder_2 config, not whole model
2023-07-22 21:33:40 -04:00
Lincoln Stein
5607794dbb
add support for controlnet & sdxl conversion - not fully working
2023-07-22 20:12:16 -04:00
Lincoln Stein
845d1524ad
warn, do not crash, when duplicate models encountered
2023-07-21 15:00:55 -04:00
Brandon Rising
78750042f5
Pass in dim overrides
2023-07-21 12:16:24 -04:00
Lincoln Stein
9370572169
prettify startup messages
2023-07-20 22:45:35 -04:00
Brandon Rising
ba1a934297
Fix Lora typings
2023-07-20 14:02:23 -04:00
Brandon Rising
4e90376d11
Allow passing in of precision, use available providers if none provided
2023-07-20 13:15:45 -04:00
Lincoln Stein
b1a6ba552b
reinitialize models.yaml if corrupt or missing
2023-07-20 11:26:20 -04:00
Lincoln Stein
89a15f78dd
collapse all autoimport directories into a single folder
2023-07-20 09:01:49 -04:00
Brandon Rising
43b6a077fb
io binding seems to be massively resource intensive compared to session.run
2023-07-19 17:42:28 -04:00
Lincoln Stein
a1251c8e04
fix inpaint model detection
2023-07-19 13:30:00 -04:00
Lincoln Stein
8439e30798
Merge branch 'main' into release/invokeai-3-0-beta
2023-07-19 12:09:32 -04:00
Lincoln Stein
0b6ef7eb7d
make the convert VAE available to model manager for use in UI
2023-07-19 09:05:24 -04:00
Brandon Rising
e8299d0abb
Comment out erroneously removed del statement, comment out opt tests
2023-07-18 23:23:34 -04:00
Sergey Borisov
2e7fc055c4
Support both pre and post 4.31.0 transformers
2023-07-19 06:15:17 +03:00
Brandon Rising
f4e52fafac
Fix as part of merging main in
2023-07-18 23:05:33 -04:00
Brandon Rising
ee7b36cea5
Merge branch 'main' into onnx-testing
2023-07-18 22:56:41 -04:00
Lincoln Stein
a690cca5b5
make convert work with both 4.30.2 and 4.31.0
2023-07-18 22:18:13 -04:00
Brandon Rising
e201ad2f51
Switch to io_binding for run, testing different session options
2023-07-18 21:54:54 -04:00
Sergey Borisov
0aa7193d3b
Load text_model.embeddings.position_ids outside state_dict
2023-07-19 04:18:43 +03:00
blessedcoolant
186e98da5e
Merge branch 'main' into fix/mem_cleanup
2023-07-19 10:10:32 +12:00
Eugene Brodsky
dea9a5da7a
Avoid crash if unable to modify the model config file (#3824)
...
* fix whitespace; remove invisible characters
* log error and proceed if unable to modify the model config
2023-07-18 16:33:19 -04:00
Sergey Borisov
bda0000acd
Clean up VRAM after model offloading; tweak to clean up local variable references on RAM offload
2023-07-18 23:21:18 +03:00
Brandon Rising
35d5ef9118
Emit step completions
2023-07-18 12:35:07 -04:00
StAlKeR7779
889b77d3d6
Merge branch 'main' into save_vram
2023-07-18 16:55:48 +03:00
Sergey Borisov
bc11296a5e
Disable lazy offloading when the VRAM cache is disabled, move resulting tensors to CPU (to avoid stacking VRAM tensors in the cache), fix text encoder not being freed (detach)
2023-07-18 16:20:25 +03:00
Lincoln Stein
1353bf98b3
add specific exception for model probe failures
2023-07-17 23:08:39 -04:00
blessedcoolant
13da881953
Merge branch 'main' into sdxl-support
2023-07-18 13:34:07 +12:00
Brandon Rising
bcce70fca6
Testing different session opts, added timings for testing
2023-07-17 16:27:33 -04:00
Lincoln Stein
0ea8d3c30c
prevent crash on rename operation on models in models directory
2023-07-17 07:50:06 -04:00
Lincoln Stein
84a13ff8e1
Merge branch 'mm-ui' of github.com:blessedcoolant/InvokeAI into mm-ui
2023-07-17 07:29:35 -04:00
Lincoln Stein
3fba262c94
expose paths as absolute to web api
2023-07-17 07:29:26 -04:00
Lincoln Stein
107ca6bf47
expose model paths as absolute to web models API
2023-07-17 07:26:05 -04:00
blessedcoolant
cbfd1d1b27
Merge branch 'main' into feat/standalone_diffusers_ti
2023-07-17 22:01:52 +12:00
blessedcoolant
aebd595607
Merge branch 'main' into mm-ui
2023-07-17 13:49:25 +12:00
Kent Keirsey
675a92401c
Merge branch 'main' into lstein/default-model-install
2023-07-16 19:32:59 -04:00
Sergey Borisov
b61c83e836
Allow bin extension to detect diffusers-ti provided as file
2023-07-17 00:32:17 +03:00
Lincoln Stein
2bc3e36bc0
add missing exception name
2023-07-16 16:14:28 -04:00
Lincoln Stein
6fbb5ce780
add renaming capabilities to model update API route
2023-07-16 14:17:05 -04:00
Brandon Rising
932112b640
testing being super wasteful with data
2023-07-16 00:17:33 -04:00
Lincoln Stein
ccbfa5d862
resolve conflicts
2023-07-15 19:47:50 -04:00
Lincoln Stein
6b0a158ffa
Merge branch 'main' into lstein/default-model-install
2023-07-15 18:23:34 -04:00
Lincoln Stein
e01706f5f5
add fp16 support to controlnet models
2023-07-15 10:37:11 -04:00
Lincoln Stein
32e7e52d69
Merge branch 'main' into lstein/default-model-install
2023-07-15 08:30:22 -04:00
Lincoln Stein
2faa7cee37
add rename_model route
2023-07-14 23:03:18 -04:00
Lincoln Stein
a45f7ce355
add --list-models command
2023-07-14 19:52:47 -04:00
Brandon Rising
bd7b59910d
Testing onnx in new ui updates
2023-07-14 14:24:15 -04:00
Lincoln Stein
e71ce83e9c
Merge branch 'main' into lstein/model-manager-route-enhancements
2023-07-14 13:52:55 -04:00
Lincoln Stein
8600aad12b
multiple enhancements to model manager REACT API
...
1. add a /sync route for synchronizing the in-memory model lists to
models.yaml, the models directory, and the autoimport directories.
2. add optional destination_directories to convert_model and merge_model
operations.
3. add /ckpt_confs route for retrieving known legacy checkpoint configuration
files.
4. add /search route for finding all models in a directory located in the server
filesystem
2023-07-14 13:45:16 -04:00
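The commit above names four new routes. A minimal client sketch, assuming a locally running server; the base URL, route prefix, and query parameter names are guesses, not the documented API.

```python
# Hypothetical client for the routes named in the commit above.
# Base URL, prefix, and parameter names are assumptions.
import requests

BASE = "http://127.0.0.1:9090/api/v1/models"  # assumed prefix

requests.post(f"{BASE}/sync").raise_for_status()        # resync models.yaml + autoimport dirs
ckpt_confs = requests.get(f"{BASE}/ckpt_confs").json()  # known legacy checkpoint configs
found = requests.get(                                   # search a server-side directory
    f"{BASE}/search", params={"search_path": "/data/models"}
).json()
print(ckpt_confs, found)
```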
Lincoln Stein
ad076b1174
add model directory search route
2023-07-14 11:14:33 -04:00
blessedcoolant
16e93c6455
Merge branch 'main' into mm-ui
2023-07-14 15:46:53 +12:00
Brandon Rising
524888bf3b
Merge branch 'main' into feat/onnx
2023-07-13 14:23:57 -04:00
psychedelicious
eb0d55263b
fix(mm): make model config attribute names consistent
...
Our model fields use `model_name`, but the API response used `name`. Some places use `model_type`, but the API response used `type`.
Changed the API response to provide `model_name` and `model_type`, which simplifies how we manage models on the client substantially.
2023-07-13 15:40:05 +10:00
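A rough before/after illustration of the renamed response keys; every field value here is a placeholder.

```python
# Placeholder data; only the key rename reflects the commit above.
old_response = {"name": "stable-diffusion-1.5", "type": "main"}
new_response = {"model_name": "stable-diffusion-1.5", "model_type": "main"}

# With the API using the same names as the client's model fields,
# lookups no longer need a translation layer:
print(new_response["model_name"], new_response["model_type"])
```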
blessedcoolant
71e34ac256
Merge branch 'main' into mm-ui
2023-07-13 12:48:43 +12:00
Sergey Borisov
67c8cf4bc2
Controlnet model detection
2023-07-12 08:50:19 -04:00
Sergey Borisov
a328986b43
Less naive model detection
2023-07-12 08:50:19 -04:00
blessedcoolant
5a6ad99d4e
feat: Restore Delete Model Functionality
2023-07-12 16:39:07 +12:00
Lincoln Stein
25591788c1
fix conflicts
2023-07-11 15:55:10 -04:00
Lincoln Stein
dab03fb646
rename gpu_mem_reserved to max_vram_cache_size
...
To be consistent with max_cache_size, the amount of memory to hold in
VRAM for model caching is now controlled by the max_vram_cache_size
configuration parameter.
2023-07-11 15:25:39 -04:00
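A toy sketch of the budget implied by max_vram_cache_size; the class, the GB unit (mirroring max_cache_size), and the numbers are invented for illustration.

```python
# Toy VRAM-capped cache; not the real model cache implementation.
class VRAMCache:
    def __init__(self, max_vram_cache_size: float = 2.5):  # GB, assumed unit
        self.max_bytes = int(max_vram_cache_size * 2**30)
        self.resident = {}  # model key -> bytes currently held in VRAM

    def admit(self, key: str, size_bytes: int) -> bool:
        """Keep a model in VRAM only while the configured budget allows it."""
        if sum(self.resident.values()) + size_bytes > self.max_bytes:
            return False  # caller keeps/offloads the model in RAM instead
        self.resident[key] = size_bytes
        return True

cache = VRAMCache(max_vram_cache_size=2.5)
print(cache.admit("sd-1.5/unet", int(1.7 * 2**30)))  # True
print(cache.admit("sd-1.5/vae", int(1.0 * 2**30)))   # False: over budget
```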
Lincoln Stein
d32f9f7cb0
reverse logic of gpu_mem_reserved
...
- gpu_mem_reserved now indicates the amount of VRAM that will be reserved
for model caching (similar to max_cache_size).
2023-07-11 15:16:40 -04:00
Lincoln Stein
6b93c1451f
do not crash when probing an unknown model type
2023-07-11 10:56:47 -04:00
Lincoln Stein
bf2b5b5cd4
improvements to sdxl support in model manager
...
- Move SDXL-related models to models/sdxl.py
- Create separate base type BaseModelType.StableDiffusionXLRefiner for the refiner
models.
2023-07-09 20:42:03 -04:00
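A minimal sketch of the base-type split described above; only StableDiffusionXLRefiner is taken from the commit message, the other member names and values are assumptions.

```python
from enum import Enum

# Sketch only: members and values other than StableDiffusionXLRefiner are guesses.
class BaseModelType(str, Enum):
    StableDiffusion1 = "sd-1"
    StableDiffusion2 = "sd-2"
    StableDiffusionXL = "sdxl"
    StableDiffusionXLRefiner = "sdxl-refiner"  # separate base type for refiner models

def is_refiner(base: BaseModelType) -> bool:
    return base is BaseModelType.StableDiffusionXLRefiner

print(is_refiner(BaseModelType.StableDiffusionXLRefiner))  # True
```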
Lincoln Stein
5759a390f9
introduce gpu_mem_reserved configuration parameter
2023-07-09 18:35:04 -04:00
Lincoln Stein
130249a2dd
add model loading support for SDXL
2023-07-09 15:47:06 -04:00
Lincoln Stein
8d7dba937d
fix undefined variable
2023-07-09 14:37:45 -04:00
Lincoln Stein
d6cb0e54b3
don't unload models from GPU until the space is needed
2023-07-09 14:26:30 -04:00
Lincoln Stein
5f7435955e
if models.yaml doesn't exist, rebuild it
2023-07-08 15:13:51 -04:00
Lincoln Stein
69ef1e1e56
speculative change to upgrade script
2023-07-08 11:45:26 -04:00
Eugene Brodsky
97b2ec58e2
Merge branch 'main' into release/invokeai-3-0-alpha
2023-07-07 14:18:12 -04:00
Lincoln Stein
9f58ed35cf
improve user migration experience
...
- No longer fail root directory probing if invokeai.yaml is missing
(test is now whether a `models/core` directory exists).
- Migrate script does not overwrite previously-installed models.
- Can run migrate script on an existing 2.3 version directory
with --from and --to pointing to same 2.3 root.
2023-07-07 08:18:46 -04:00
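A sketch of the relaxed root-directory probe mentioned above (presence of models/core rather than invokeai.yaml); the function is illustrative, not the migrate script's actual code.

```python
from pathlib import Path

def looks_like_invokeai_root(root: Path) -> bool:
    """Illustrative probe: accept a root even when invokeai.yaml is missing,
    as long as a models/core directory exists."""
    return (root / "models" / "core").is_dir()

# e.g. lets --from and --to point at the same 2.3 root during migration
print(looks_like_invokeai_root(Path.home() / "invokeai"))
```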
blessedcoolant
7aa918677e
Merge branch 'main' into feat/clip_skip
2023-07-07 16:21:53 +12:00
Lincoln Stein
54f3686e3b
merge with main, fix conflicts
2023-07-06 15:21:45 -04:00
Lincoln Stein
e9352227f3
add merge api
2023-07-06 15:12:34 -04:00
blessedcoolant
bc5371eeee
Merge branch 'main' into feat/clip_skip
2023-07-07 06:03:39 +12:00
Lincoln Stein
e573a533ae
remove redundant import
2023-07-06 13:24:58 -04:00
Lincoln Stein
581be42c75
Merge branch 'main' into lstein/model-manager-router-api
2023-07-06 13:20:36 -04:00
Lincoln Stein
90c66aab3d
merge with upstream
2023-07-06 13:17:02 -04:00
Lincoln Stein
3e925fbf34
model merging API ready for testing
2023-07-06 13:15:15 -04:00
Lincoln Stein
ec7c2f07c6
model merge backend, CLI and TUI working
2023-07-06 12:21:42 -04:00
blessedcoolant
b229fe19aa
Merge branch 'main' into lstein/configure-max-cache-size
2023-07-07 01:52:12 +12:00
Sergey Borisov
04b57c408f
Add clip skip option to prompt node
2023-07-06 16:09:40 +03:00
blessedcoolant
6f1268e2b1
Merge branch 'main' into lstein/more-model-loading-fixes
2023-07-07 00:32:22 +12:00
Lincoln Stein
8f5fcb188c
Merge branch 'main' into lstein/model-manager-router-api
2023-07-05 23:16:43 -04:00
Lincoln Stein
f7daa6e71d
all methods now return OPENAPI_MODEL_CONFIGS; convert uses PUT
2023-07-05 23:13:01 -04:00
Lincoln Stein
a7cbcae176
expose max_cache_size to invokeai-configure interface
2023-07-05 20:59:57 -04:00
Lincoln Stein
0a6dccd607
expose max_cache_size to invokeai-configure interface
2023-07-05 20:59:14 -04:00
Lincoln Stein
43c51ff157
Merge branch 'main' into lstein/more-model-loading-fixes
2023-07-05 20:48:15 -04:00
Lincoln Stein
cfa3b2419c
partial implementation of merge
2023-07-05 20:25:47 -04:00
Lincoln Stein
d4550b3059
clean up lint errors in lora.py
2023-07-05 19:18:25 -04:00
Lincoln Stein
83d3a043da
merge latest changes from main
2023-07-05 19:15:53 -04:00
Lincoln Stein
71dad6d404
Merge branch 'main' into ti-ui
2023-07-05 16:57:31 -04:00
Lincoln Stein
685a47cc7d
fix crash during lora application
2023-07-05 16:40:47 -04:00
Lincoln Stein
f8bbec8572
Recognize and load diffusers-style LoRAs (.bin)
...
Prevent double-reporting of autoimported models
- closes #3636
Allow autoimport of diffusers-style LoRA models
- closes #3637
2023-07-05 16:21:23 -04:00
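A hedged sketch of the .bin recognition added above; the extension set is assumed, and the real probe inspects the weights themselves rather than just the suffix.

```python
from pathlib import Path

# Assumed extension set; only the .bin (diffusers-style) case comes from the commit above.
LORA_EXTENSIONS = {".safetensors", ".ckpt", ".pt", ".bin"}

def could_be_lora_file(path: Path) -> bool:
    """Rough suffix check; a real probe also looks at the state dict keys."""
    return path.suffix.lower() in LORA_EXTENSIONS

print(could_be_lora_file(Path("pytorch_lora_weights.bin")))  # True
```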
Lincoln Stein
863336acbb
Recognize and load diffusers-style LoRAs (.bin)
...
Prevent double-reporting of autoimported models
- closes #3636
Allow autoimport of diffusers-style LoRA models
- closes #3637
2023-07-05 16:19:16 -04:00
Lincoln Stein
90ae8ce26a
prevent model install crash "torch needs to be restarted with spawn"
2023-07-05 16:18:20 -04:00
Lincoln Stein
ad5d90aca8
prevent model install crash "torch needs to be restarted with spawn"
2023-07-05 15:38:07 -04:00
Lincoln Stein
5b6dd47b9f
add API for model convert
2023-07-05 15:13:21 -04:00
Lincoln Stein
5027d0a603
accept @psychedelicious suggestions above
2023-07-05 14:50:57 -04:00
Lincoln Stein
9f9ce08e44
Merge branch 'main' into lstein/remove-hardcoded-cuda-device
2023-07-05 13:38:33 -04:00
blessedcoolant
9e2d63ef97
Merge branch 'main' into fix/ckpt_convert_scan
2023-07-06 05:01:34 +12:00
Sergey Borisov
0ac9dca926
Fix loading diffusers ti
2023-07-05 19:46:00 +03:00
Lincoln Stein
bd82c4ace0
model installer confirms deletion of models
2023-07-05 09:57:23 -04:00
Lincoln Stein
9edf78dd2e
merge with main
2023-07-05 09:12:54 -04:00
Lincoln Stein
6112197edf
convert implemented; need router
2023-07-05 09:05:05 -04:00
Sergey Borisov
ee042ab76d
Fix ckpt scanning on conversion
2023-07-05 14:18:30 +03:00
Sergey Borisov
2beb8f049e
Fix model detection
2023-07-05 09:43:46 +03:00
blessedcoolant
639d88afd6
revert: inference_mode to no_grad
2023-07-05 16:39:15 +12:00
blessedcoolant
c0501ed5c2
fix: Slow loading of Loras
...
Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
2023-07-05 12:47:34 +10:00
Lincoln Stein
5d099f4a49
update_model working
2023-07-04 17:26:57 -04:00
Lincoln Stein
752b4d50cf
model_delete method now working
2023-07-04 10:40:32 -04:00
Lincoln Stein
c1c49d9a76
import model returns 404 for invalid path, 409 for duplicate model
2023-07-04 10:08:10 -04:00
Lincoln Stein
96bf92ead4
add the import model router
2023-07-04 14:35:47 +10:00
Lincoln Stein
fc419546bc
Merge branch 'main' into lstein/remove-hardcoded-cuda-device
2023-07-03 14:10:47 -04:00
Lincoln Stein
d6de11bd56
resolve merge conflict
2023-07-03 12:19:11 -04:00
Lincoln Stein
ed86d0b708
Union[foo, None] => Optional[foo]
2023-07-03 12:17:45 -04:00
Lincoln Stein
fb2b2a371d
Merge branch 'lstein/fix-vae-conversion-crash' into release/invokeai-3-0-alpha
2023-07-03 11:21:16 -04:00
Lincoln Stein
10d513c5f7
add runtime root path to relative vaes and other submodels
2023-07-03 11:19:33 -04:00
Lincoln Stein
877b187a1b
Merge branch 'lstein/restore-3.9-compatibility' into release/invokeai-3-0-alpha
2023-07-03 11:01:34 -04:00
Lincoln Stein
ac9ec4e75a
restore 3.9 compatibility by replacing | with Union[]
2023-07-03 10:57:40 -04:00
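For context on this commit and its revert/redo below: PEP 604 `X | Y` annotations, when evaluated at runtime, require Python 3.10, whereas the typing.Union/Optional spellings work on 3.9. A minimal sketch of the substitution:

```python
from typing import Optional, Union

# 3.10+ only when the annotation is evaluated at runtime (e.g. by pydantic):
#     def load(path: str | None = None) -> ModelInfo | None: ...

# 3.9-compatible equivalent:
class ModelInfo:
    ...

def load(path: Optional[str] = None) -> Union[ModelInfo, None]:
    return None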
Lincoln Stein
2465c7987b
Revert "restore 3.9 compatibility by replacing | with Union[]"
...
This reverts commit 76bafeb99e.
2023-07-03 10:56:41 -04:00
Lincoln Stein
76bafeb99e
restore 3.9 compatibility by replacing | with Union[]
2023-07-03 10:55:04 -04:00
Lincoln Stein
6935858ef3
add debugging messages to aid in memory leak tracking
2023-07-02 13:34:53 -04:00
Lincoln Stein
3c2ce51f10
Merge branch 'lstein/remove-hardcoded-cuda-device' into release/invokeai-3-0-alpha
2023-07-01 21:15:58 -04:00
Lincoln Stein
0f02915012
remove hardcoded cuda device in model manager init
2023-07-01 21:15:42 -04:00
Lincoln Stein
f1928d2588
prevent crashes on malformed models
2023-07-01 14:32:58 -04:00
blessedcoolant
c74bb5cdbf
Merge branch 'main' into lstein/fix-vae-convert
2023-07-01 11:18:21 +12:00
Lincoln Stein
1347fc2f00
fix incorrect VAE config file path during conversion of ckpts
2023-06-30 19:14:06 -04:00
Lincoln Stein
ace4f6d586
fix duplicate model key addition when root directory is a relative path
2023-06-28 17:02:03 -04:00
StAlKeR7779
ac46b129bf
Merge branch 'main' into feat/lora_model_patch
2023-06-28 22:43:58 +03:00
Lincoln Stein
79fc708580
warn but do not crash when model scan finds random cruft in models directory
2023-06-28 15:26:42 -04:00
Lincoln Stein
e8ed0fad6c
autoimport from embedding/controlnet/lora folders designated in startup file
2023-06-27 12:30:53 -04:00
Lincoln Stein
823e098b7c
prompt user for prediction type when autoimporting a v2 model without .yaml file
...
don't ask the user for the prediction type when a config.yaml is provided
2023-06-26 16:30:34 -04:00
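A rough sketch of that decision, reusing the SchedulerPredictionType name that appears later in this log; the member values and the prompt wording are assumptions.

```python
from enum import Enum

class SchedulerPredictionType(str, Enum):  # enum name from this log; members assumed
    Epsilon = "epsilon"
    VPrediction = "v_prediction"

def ask_prediction_type(checkpoint_name: str) -> SchedulerPredictionType:
    """Only called for a v2 checkpoint that arrives without a .yaml config;
    when a config.yaml is provided, the value is read from it instead."""
    answer = input(f"Is {checkpoint_name} a v-prediction (768x768) model? [y/N] ")
    return (
        SchedulerPredictionType.VPrediction
        if answer.strip().lower().startswith("y")
        else SchedulerPredictionType.Epsilon
    )
```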
Lincoln Stein
011adfc958
merge with main
2023-06-26 13:53:59 -04:00
Lincoln Stein
befd95eb19
rename root_dir to root_path attributes to emphasize return of a Path
2023-06-26 13:52:25 -04:00
Lincoln Stein
a2ddb3823b
fix add_model() logic
2023-06-26 13:33:38 -04:00
Sergey Borisov
91c3a58fb6
Fix lycoris layers init
2023-06-26 04:33:37 +03:00
Sergey Borisov
5cebf67ee4
Apply lora by patching lora instead of hooks
2023-06-26 03:57:33 +03:00
Sergey Borisov
1ba94a92b3
Fixes
2023-06-26 03:54:42 +03:00
Sergey Borisov
23c22ac933
Refactor logic/small fixes
2023-06-26 03:07:54 +03:00
Lincoln Stein
160b5d7992
add support for an autoimport models directory scanned at startup time
2023-06-25 18:50:15 -04:00
Lincoln Stein
c91d1eacba
Merge branch 'lstein/installer-for-new-model-layout' of github.com:invoke-ai/InvokeAI into lstein/installer-for-new-model-layout
2023-06-25 16:04:48 -04:00
Lincoln Stein
60b37b7ff4
fix model manager documentation
2023-06-25 16:04:43 -04:00
Sergey Borisov
a3c22b5fe6
Remove upcast_attention and prediction_type from stable diffusion model logic, fix ckpt conversion accordingly
2023-06-25 21:06:22 +03:00
Lincoln Stein
c3c4a71173
implemented Stalker's suggested improvements
2023-06-24 12:37:26 -04:00
Lincoln Stein
ba1371a88f
rename ModelType.Pipeline to ModelType.Main
2023-06-24 11:45:49 -04:00
Lincoln Stein
539d1f3bde
remove redundant prediction_type and attention_upscaling flags
2023-06-23 16:54:52 -04:00
Lincoln Stein
466ec3ab5e
add router API support for model manager heuristic_import()
2023-06-23 16:35:39 -04:00
Lincoln Stein
58d1857ab6
merge with main
2023-06-23 13:57:25 -04:00
Lincoln Stein
56bd873d7a
make relative model paths work in model manager
2023-06-23 10:52:59 -04:00
Sergey Borisov
5aaaaf64a1
Fix ckpt conversion
2023-06-23 17:29:54 +03:00
StAlKeR7779
9140e2c0f2
Merge branch 'main' into fix/vae_conversion
2023-06-23 15:03:59 +03:00
Lincoln Stein
c7b7e087e4
Merge branch 'main' into lstein/installer-for-new-model-layout
2023-06-23 01:45:05 +01:00
blessedcoolant
bb85608890
Merge branch 'main' into feat/onnx
2023-06-23 05:18:41 +12:00
Sergey Borisov
6c7668aaca
Update ONNX model structure, change code accordingly
2023-06-22 20:03:17 +03:00
psychedelicious
b937b7da01
feat(models): update model manager service & route to return list of models
2023-06-22 17:34:12 +10:00
Sergey Borisov
21245a0fb2
Set model type to a const value in the openapi schema, add model format enums to the model schema (as they are not referenced in the case of a Literal definition)
2023-06-22 16:51:53 +10:00
Sergey Borisov
da566b59e8
Update model format field to use enums
2023-06-22 16:51:53 +10:00
Sergey Borisov
e4dc9c5a04
Rename format to model_format (still named format when working with config)
2023-06-22 16:51:53 +10:00
Sergey Borisov
aceadacad4
Remove default model logic
2023-06-22 16:51:53 +10:00
blessedcoolant
727293d722
fix: 2.1 models breaking generation
...
Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
2023-06-22 16:42:59 +10:00
Sergey Borisov
ef83a2fffe
Add name, base_mode, type fields to model info
2023-06-22 16:42:51 +10:00
Sergey Borisov
01d17601b8
Generate config names for openapi
2023-06-22 16:41:19 +10:00
blessedcoolant
bf0d5f4cfc
fix: Update missing name types to new names
2023-06-22 16:41:02 +10:00
blessedcoolant
9838dda1b7
chore: Update model config type names
2023-06-22 16:40:40 +10:00
Sergey Borisov
7759b3f75a
Small refactor
2023-06-21 04:24:25 +03:00
Sergey Borisov
4d337f6abc
ONNX Model/runtime first implementation
2023-06-21 02:12:21 +03:00
Lincoln Stein
90df316835
Merge branch 'main' into lstein/installer-for-new-model-layout
2023-06-20 22:50:41 +01:00
Lincoln Stein
ac6403f877
address some of ebr issues
2023-06-20 11:08:27 -04:00
Sergey Borisov
92c86fd0b8
Set model type to a const value in the openapi schema, add model format enums to the model schema (as they are not referenced in the case of a Literal definition)
2023-06-20 03:44:58 +03:00
Sergey Borisov
46dc751139
Update model format field to use enums
2023-06-20 03:30:09 +03:00
Sergey Borisov
4cefe37723
Rename format to model_format (still named format when working with config)
2023-06-20 03:25:08 +03:00
Sergey Borisov
82b73c50a0
Remove default model logic
2023-06-20 03:13:10 +03:00
blessedcoolant
b0c4451324
Merge branch 'main' into model-manager-ui-30
2023-06-19 23:02:59 +12:00
Sergey Borisov
7b35162b9e
Remove old logic except for inpaint, add support for lora and ti to inpaint node
2023-06-19 15:57:28 +10:00
Sergey Borisov
82091b9a66
Fix vae conversion
2023-06-18 23:46:07 +03:00
Lincoln Stein
e1d53b86f3
Merge branch 'main' into lstein/installer-for-new-model-layout
2023-06-17 16:26:56 -07:00
blessedcoolant
bf0577c882
fix: 2.1 models breaking generation
...
Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
2023-06-18 08:26:25 +12:00
Sergey Borisov
dc669d1447
Add name, base_mode, type fields to model info
2023-06-17 22:48:44 +03:00
Sergey Borisov
16dc78f6c6
Generate config names for openapi
2023-06-17 17:15:36 +03:00
blessedcoolant
c8dfa49d86
fix: Update missing name types to new names
2023-06-17 22:04:28 +12:00
blessedcoolant
67d05d2066
chore: Update model config type names
2023-06-17 21:28:43 +12:00
Lincoln Stein
f28d50070e
configure/install basically working; needs edge case testing
2023-06-16 22:54:36 -04:00
Lincoln Stein
ada7399753
rewrite of widget display - marshalling needs rewrite
2023-06-15 23:32:33 -04:00
Sergey Borisov
5f2d07917d
Fix lora import, fix sd2 config, fix list models api
2023-06-15 21:30:15 +03:00
Sergey Borisov
6c5954f9d1
Add controlnet to model manager, fixes
2023-06-14 04:26:21 +03:00
Sergey Borisov
740c05a0bb
Save models on rescan, uncache model on edit/delete, fixes
2023-06-14 03:12:12 +03:00
Sergey Borisov
26090011c4
Fix conflict resolution, add model configs to type annotation
2023-06-14 00:26:37 +03:00
Sergey Borisov
e7db6d8120
Fix ckpt and vae conversion, migrate script, remove sd2-base
2023-06-13 18:05:12 +03:00
Lincoln Stein
87ba17a1f5
add migration script and update convert and face restoration paths
2023-06-13 01:27:51 -04:00
Lincoln Stein
1439dc7712
Add SchedulerPredictionType and ModelVariantType enums
2023-06-12 16:07:04 -04:00
Sergey Borisov
36eb1bd893
Fixes
2023-06-12 16:14:09 +03:00
Sergey Borisov
9fa78443de
Fixes, add sd variant detection
2023-06-12 05:52:30 +03:00
Lincoln Stein
893f776f1d
model_probe working; model_install incomplete
2023-06-11 19:51:53 -04:00
Lincoln Stein
085ab54124
remove modified models.py and migrate code to models/base.py
2023-06-11 16:10:15 -04:00
Lincoln Stein
8e1a56875e
remove defunct code
2023-06-11 12:57:06 -04:00
Lincoln Stein
000626ab2e
move all installation code out of model_manager
2023-06-11 12:51:50 -04:00
Sergey Borisov
694fd0c92f
Fixes, first runnable version
2023-06-11 16:42:40 +03:00
Sergey Borisov
738ba40f51
Fixes
2023-06-11 06:12:21 +03:00
Sergey Borisov
3ce3a7ee72
Rewrite model configs, separate models
2023-06-11 04:49:09 +03:00
Lincoln Stein
74b43c9bdf
fix incorrect variable/typenames in model_cache
2023-06-10 10:41:48 -04:00
Lincoln Stein
a87d52a389
resolve conflicts between lstein & sttalker changes
2023-06-10 09:59:19 -04:00
Lincoln Stein
959e64c9b3
start removing repo_id support
2023-06-10 09:57:23 -04:00
Sergey Borisov
2c056ead42
New models structure draft
2023-06-10 03:14:10 +03:00
Lincoln Stein
887576d217
add directory scanning for loras, controlnets and textual_inversions
2023-06-08 23:11:53 -04:00
Lincoln Stein
6652f3405b
merge with main
2023-06-08 21:08:43 -04:00
Lincoln Stein
2a6d11e645
create databases directory on startup
2023-06-08 07:17:54 -04:00
Lincoln Stein
9ed86a08f1
multiple small fixes
...
1. Contents of autoscan directory field are restored after doing an installation.
2. Activate dialogue to choose V2 parameterization when importing from a directory.
3. Remove autoscan directory from init file when its checkbox is unselected.
4. Add widget cycling behavior to install models form.
2023-06-07 17:32:00 -04:00
Lincoln Stein
04f9757f8d
prevent crash when trying to calculate size of missing safety_checker
...
- Also fixed up order in which logger is created in invokeai-web
so that handlers are installed after command-line options are
parsed (and not before!)
2023-06-06 22:57:49 -04:00
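A minimal sketch of the guard implied by the commit above: treat a missing submodel (e.g. a stripped safety_checker) as zero bytes instead of crashing. The size calculation is a stand-in, not the project's actual code.

```python
from typing import Optional

import torch

def module_size(module: Optional[torch.nn.Module]) -> int:
    """Return 0 for missing submodels such as a removed safety_checker."""
    if module is None:
        return 0
    return sum(p.numel() * p.element_size() for p in module.parameters())

print(module_size(None))                   # 0 instead of an AttributeError
print(module_size(torch.nn.Linear(4, 4)))  # small but non-zero
```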
Lincoln Stein
1f9e1eb964
merge with main
2023-06-06 22:18:41 -04:00
Lincoln Stein
d8d11f9bbb
quench fp16 rev id not found warning
2023-06-06 22:01:05 -04:00
Lincoln Stein
f5044c290d
fix crash during model conversion
2023-06-06 17:05:29 -04:00
Lincoln Stein
90333c0074
merge with main
2023-06-05 22:03:44 -04:00
Lincoln Stein
cb157ea530
fix crash when install-models launched from config script
2023-06-04 14:55:51 -04:00
Lincoln Stein
1a7fb601dc
ask user for v2 variant when model manager can't infer it
2023-06-04 11:27:44 -04:00
Lincoln Stein
31e97ead2a
move invokeai.db to ~/invokeai/databases
...
- The invokeai.db database file has now been moved into
`INVOKEAI_ROOT/databases`. Using plural here for possible
future with more than one database file.
- Removed a few dangling debug messages that appeared during
testing.
- Rebuilt frontend to test web.
2023-06-03 20:25:34 -04:00
Lincoln Stein
72d1e4e404
fix bug in model_manager that prevented import of inpainting models
2023-06-02 22:39:26 -04:00
Lincoln Stein
1390b65a9c
new TUI is fully functional; needs some polishing
2023-06-02 17:20:50 -04:00
Lincoln Stein
41f7758977
listing, downloading and deleting LoRAs working; TI support pending
2023-06-02 00:40:15 -04:00
Lincoln Stein
e9821ab711
implemented tabbed model selection; not wired to backend yet
2023-06-01 00:31:46 -04:00
Sergey Borisov
b47786e846
First working TI draft
2023-05-31 02:12:27 +03:00
Sergey Borisov
69ccd3a0b5
Fixes for checkpoint models
2023-05-30 19:12:47 +03:00
Sergey Borisov
79de9047b5
First working lora implementation
2023-05-30 01:11:00 +03:00
Lincoln Stein
2273b3a8c8
fix potential race condition in config system
2023-05-25 20:41:26 -04:00
Sergey Borisov
8e419a4f97
Revert weak references as can be done without it
2023-05-23 04:29:40 +03:00
Sergey Borisov
2533209326
Rewrite cache to weak references
2023-05-23 03:48:22 +03:00
Lincoln Stein
d2dc1ed26f
make InvokeAI package installable
...
This commit makes InvokeAI 3.0 installable via PyPI and the
installer script.
Main changes:
1. Move static web pages into `invokeai/frontend/web` and modify the
API to look for them there. This allows pip to copy the files into the
distribution directory so that the user no longer has to be in the repo root
to launch.
2. Update invoke.sh and invoke.bat to launch the new web application
properly. This also changes the wording for launching the CLI from
"generate images" to "explore the InvokeAI node system," since I would
not recommend using the CLI to generate images routinely.
3. Fix a bug in the checkpoint converter script that was identified
during testing.
4. Better error reporting when checkpoint converter fails.
5. Rebuild front end.
2023-05-22 17:51:47 -04:00
Lincoln Stein
27241cdde1
port more globals changes over
2023-05-18 17:17:45 -04:00
Lincoln Stein
259d6ec90d
fixup cachedir call
2023-05-18 14:52:16 -04:00
Lincoln Stein
a77c4c87b2
fixed logic error in resolution of model path
2023-05-18 14:35:34 -04:00
Lincoln Stein
d96175d127
resolve some undefined symbols in model_cache
2023-05-18 14:31:47 -04:00
Lincoln Stein
b1a99d772c
added method to convert vaes
2023-05-18 13:31:11 -04:00
Lincoln Stein
7ea995149e
fixes to env parsing, textual inversion & help text
...
- Make environment variable settings case InSenSiTive:
INVOKEAI_MAX_LOADED_MODELS and InvokeAI_Max_Loaded_Models
environment variables will both set `max_loaded_models`
- Updated realesrgan to use new config system.
- Updated textual_inversion_training to use new config system.
- Discovered a race condition when InvokeAIAppConfig is created
at module load time, which makes it impossible to customize
or replace the help message produced with --help on the command
line. To fix this, moved all instances of get_invokeai_config()
from module load time to object initialization time. Makes code
cleaner, too.
- Added `--from_file` argument to `invokeai-node-cli` and changed
github action to match. CI tests will hopefully work now.
2023-05-18 10:48:23 -04:00
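An illustrative sketch of the case-insensitive environment lookup and the deferred config creation described above; InvokeAIAppConfig's real implementation is pydantic-based and differs from this stand-in.

```python
import os
from typing import Optional

def env_setting(name: str) -> Optional[str]:
    """Case-InSenSiTive lookup: INVOKEAI_MAX_LOADED_MODELS and
    InvokeAI_Max_Loaded_Models resolve to the same setting."""
    wanted = name.lower()
    for key, value in os.environ.items():
        if key.lower() == wanted:
            return value
    return None

class NodeCLI:
    def __init__(self) -> None:
        # Settings are read at object-initialization time, not module-load time,
        # so --help handling can still customize or replace the config.
        self.max_loaded_models = int(env_setting("invokeai_max_loaded_models") or 2)

print(NodeCLI().max_loaded_models)
```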
Sergey Borisov
fd82763412
Model manager draft
2023-05-18 03:56:52 +03:00
Lincoln Stein
e971a7f35c
when migrating models.yaml, rename the original to models.yaml.orig
2023-05-16 22:37:53 -04:00
Lincoln Stein
cd16857f38
fix None in model_type
2023-05-16 00:13:44 -04:00
Lincoln Stein
1442f1cb8d
change model filter to None in second place
2023-05-16 00:03:57 -04:00
Lincoln Stein
4fe94a9315
list_models() now returns a dict of {type: {name: info}}
2023-05-15 23:44:08 -04:00
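A small sketch of the nested return shape described above; the keys and info fields are placeholders, not the real schema.

```python
# Placeholder data illustrating the {type: {name: info}} structure.
models = {
    "main": {
        "stable-diffusion-1.5": {"format": "diffusers", "path": "sd-1/main/stable-diffusion-1.5"},
    },
    "lora": {
        "my-style": {"format": "lycoris", "path": "sd-1/lora/my-style.safetensors"},
    },
}

for model_type, by_name in models.items():
    for name, info in by_name.items():
        print(model_type, name, info["format"])
```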
Lincoln Stein
c8f765cc06
improve debugging messages
2023-05-14 18:29:55 -04:00
Lincoln Stein
b9e9087dbe
do not manage GPU for pipelines if sequential_offloading is True
2023-05-14 18:09:38 -04:00
Lincoln Stein
63e465eb5c
tweaks to get_model() behavior
...
1. If an external VAE is specified in config file, then
get_model(submodel=vae) will return the external VAE, not the one
burnt into the parent diffusers pipeline.
2. The mechanism in (1) is generalized such that you can now have
"unet:", "text_encoder:" and similar stanzas in the config file.
Valid formats of these subsections:

    unet:
      repo_id: foo/bar

    unet:
      path: /path/to/local/folder

    unet:
      repo_id: foo/bar
      subfolder: unet
In the near future, these will also be used to attach external
parts to the pipeline, generalizing VAE behavior.
3. Accommodate callers (i.e. the WebUI) that are passing the
model key ("diffusers/stable-diffusion-1.5") to get_model()
instead of the tuple of model_name and model_type.
4. Fixed bug in VAE model attaching code.
5. Rebuilt web front end.
2023-05-14 16:50:59 -04:00
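A hedged sketch of point (1) above, using a toy stand-in for the manager; the real ModelManager signature and config objects differ.

```python
# Toy stand-in for the behavior in point (1) above; not the real ModelManager API.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ModelConfig:
    path: str
    overrides: dict = field(default_factory=dict)  # e.g. {"vae": {"repo_id": "foo/bar"}}

def get_model(config: ModelConfig, submodel: Optional[str] = None):
    """If the config carries an external stanza for the submodel (vae:, unet:,
    text_encoder:, ...), return that instead of the copy burnt into the pipeline."""
    if submodel and submodel in config.overrides:
        return ("external", config.overrides[submodel])
    return ("from_pipeline", submodel or "whole pipeline")

cfg = ModelConfig(path="/models/sd-1.5", overrides={"vae": {"repo_id": "foo/bar"}})
print(get_model(cfg, submodel="vae"))   # external VAE wins
print(get_model(cfg, submodel="unet"))  # falls back to the parent pipeline
```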
Lincoln Stein
baf5451fa0
Merge branch 'main' into lstein/new-model-manager
2023-05-13 22:01:34 -04:00
Lincoln Stein
1103ab2844
merge with main
2023-05-13 21:35:19 -04:00
Lincoln Stein
b31a6ff605
fix reversed args in _model_key() call
2023-05-13 21:11:06 -04:00
Sergey Borisov
1f602e6143
Fix - apply precision to text_encoder
2023-05-14 03:46:13 +03:00
Sergey Borisov
039fa73269
Change SDModelType enum to string, fixes (model unload negative locks count, scheduler load error, safetensors convert, wrong logic in del_model, wrong metadata parsing in web)
2023-05-14 03:06:26 +03:00
Lincoln Stein
2204e47596
allow submodels to be fetched independent of parent pipeline
2023-05-13 16:54:47 -04:00