Commit Graph

1623 Commits

SHA1 Message Date
04229082d6 Provide ti name from model manager, not from ti itself 2023-08-04 18:24:47 -04:00
f0613bb0ef Fix merge conflict resolve - restore full/diff layer support 2023-08-04 19:53:27 +03:00
0e9f92b868 Merge branch 'main' into feat/sdxl_lora 2023-08-04 19:22:13 +03:00
7d0cc6ec3f chore: black 2023-08-05 02:04:22 +10:00
2f8b928486 Add support for diff/full lora layers 2023-08-05 02:04:22 +10:00
1d5d187ba1 model probe detects sdxl lora models 2023-08-04 11:44:56 -04:00
1ac14a1e43 add sdxl lora support 2023-08-04 11:44:56 -04:00
1deca89fde Merge branch 'main' into feat/select-vram-in-config 2023-08-03 19:27:58 -04:00
446fb4a438 blackify 2023-08-03 19:24:23 -04:00
ab5d938a1d use variant instead of revision 2023-08-03 19:23:52 -04:00
9942af756a Merge branch 'main' into remove-onnx-model-check-from-pipeline-download 2023-08-03 10:10:51 -04:00
91ebf9f76e Merge branch 'main' into refactor/model_manager_instantiate 2023-08-02 19:01:21 -07:00
e080fd1e08 blackify 2023-08-03 11:25:20 +10:00
eeef1e08f8 restore ability to convert merged inpaint .safetensors files 2023-08-03 11:25:20 +10:00
02d2cc758d Merge branch 'main' into refactor/model_manager_instantiate 2023-08-02 17:11:23 -07:00
4e0949fa55 fix .swap() by reverting improperly merged @classmethod change 2023-08-03 10:00:43 +10:00
ec48779080 blackify 2023-08-02 14:28:19 -04:00
5de42be4a6 reduce VRAM cache default; take max RAM from system 2023-08-02 14:27:13 -04:00
29ac252501 blackify 2023-08-02 09:44:06 -04:00
880727436c fix default vram cache size calculation 2023-08-02 09:43:52 -04:00
77c5c18542 add slider for VRAM cache 2023-08-02 09:11:24 -04:00
ed76250dba Stop checking for unet/model.onnx when a model_index.json is detected 2023-08-02 07:21:21 -04:00
4d22cafdad Installer should download fp16 models if user has specified 'auto' in config (Closes #4127) 2023-08-01 22:06:27 -04:00
5998509888 Merge branch 'main' into refactor/model_manager_instantiate 2023-08-01 11:09:43 -07:00
746afcd235 Merge branch 'main' into feat/onnx 2023-07-31 16:56:34 -04:00
aeac557c41 Run python black, point out that onnx is an alpha feature in the installer 2023-07-31 16:47:48 -04:00
bacdf985f1 doc(model_manager): docstrings 2023-07-31 09:16:32 -07:00
f5ac73b091 Merge branch 'main' into feat/onnx 2023-07-31 10:58:40 -04:00
2c07f54b6e Merge branch 'main' into fix-optional 2023-07-31 16:31:01 +10:00
adfd1e52f4 refactor(model_manager): avoid copy/paste logic 2023-07-30 11:53:12 -07:00
0e48c98330 Merge remote-tracking branch 'origin/main' into refactor/model_manager_instantiate (conflicts: invokeai/backend/model_management/model_manager.py) 2023-07-30 11:33:13 -07:00
83d3f2347e fix "unrecognized arguments: --yes" bug on unattended upgrade 2023-07-30 11:07:06 -04:00
50e00feceb Add missing Optional on a few nullable fields. 2023-07-30 16:25:12 +02:00
2537ff0280 Merge branch 'main' into bugfix/model-manager-rel-paths 2023-07-30 08:17:36 -04:00
e20c4dc1e8 blackified 2023-07-30 08:17:10 -04:00
ac84a9f915 reenable display of autoloaded models 2023-07-30 08:05:05 -04:00
844578ab88 fix lora loading crash 2023-07-30 07:57:10 -04:00
ff1c40747e lint: formatting 2023-07-29 20:02:31 -07:00
73f3b7f84b remove dangling comment 2023-07-29 17:32:33 -04:00
348bee8981 blackified 2023-07-29 17:30:54 -04:00
e82eb0b9fc add correct optional annotation to precision arg 2023-07-29 17:30:21 -04:00
1de783b1ce fix mistake in indexing flat_ema_key 2023-07-29 17:20:26 -04:00
3f9105be50 make convert script respect setting of use_ema in config file 2023-07-29 17:17:45 -04:00
2a2d988928 convert script handles more ckpt variants 2023-07-29 15:28:39 -04:00
ccceb32a85 lint: formatting 2023-07-29 11:50:04 -07:00
72c519c6ad fix incorrect key construction 2023-07-29 13:51:47 -04:00
24b19166dd further refactoring 2023-07-29 13:13:22 -04:00
0fb7328022 blackify code 2023-07-29 13:00:43 -04:00
99daa97978 more refactoring; fixed place where rel conversion missed 2023-07-29 13:00:07 -04:00
982a568349 blackify pr 2023-07-29 10:47:55 -04:00