Commit Graph

1036 Commits

Author SHA1 Message Date
Kevin Turner
44bf308192 test(model_management): add a couple tests for _get_model_path 2023-08-05 15:22:23 -07:00
Kevin Turner
65ed224bfc
Merge branch 'main' into refactor/model_manager_instantiate 2023-08-04 21:34:38 -07:00
psychedelicious
b6e369c745 chore: black 2023-08-05 12:28:35 +10:00
gogurtenjoyer
ecabfc252b devices.py - Update MPS FP16 check to account for upcoming macOS Sonoma
float16 doesn't seem to work on macOS Sonoma due to further changes with Metal. This'll default back to float32 for Sonoma users.
2023-08-05 12:28:35 +10:00
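Roughly how such a check might look — a minimal sketch with an assumed helper name, not the project's actual devices.py code:

```python
import platform

import torch


def choose_mps_dtype() -> torch.dtype:
    """Use float32 on macOS 14 (Sonoma), where float16 misbehaves under Metal."""
    release = platform.mac_ver()[0]  # e.g. "14.0" on Sonoma, "" when not on macOS
    if release and int(release.split(".")[0]) >= 14:
        return torch.float32
    return torch.float16
```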
psychedelicious
da96a41103
Merge branch 'main' into feat/select-vram-in-config 2023-08-05 12:11:50 +10:00
psychedelicious
eb6c317f04 chore: black 2023-08-05 12:05:24 +10:00
psychedelicious
6d7223238f fix: fix typo in message 2023-08-05 12:05:24 +10:00
Damian Stewart
8607d124c5 improve message about the consequences of the --ignore_missing_core_models flag 2023-08-05 12:05:24 +10:00
Damian Stewart
23497bf759 add --ignore_missing_core_models CLI flag to bypass checking for missing core models 2023-08-05 12:05:24 +10:00
Kevin Turner
b10cf20eb1 Merge branch 'main' into refactor/model_manager_instantiate
# Conflicts:
#	invokeai/backend/model_management/model_manager.py
2023-08-04 18:28:18 -07:00
StAlKeR7779
9bacd77a79
Merge branch 'main' into bugfix/fp16-models 2023-08-05 01:42:43 +03:00
Lincoln Stein
1b158f62c4 resolve vae overrides correctly 2023-08-04 18:24:47 -04:00
Lincoln Stein
6ad565d84c folded in changes from 4099 2023-08-04 18:24:47 -04:00
Sergey Borisov
04229082d6 Provide ti name from model manager, not from ti itself 2023-08-04 18:24:47 -04:00
Sergey Borisov
f0613bb0ef Fix merge conflict resolve - restore full/diff layer support 2023-08-04 19:53:27 +03:00
StAlKeR7779
0e9f92b868
Merge branch 'main' into feat/sdxl_lora 2023-08-04 19:22:13 +03:00
psychedelicious
7d0cc6ec3f chore: black 2023-08-05 02:04:22 +10:00
Sergey Borisov
2f8b928486 Add support for diff/full lora layers 2023-08-05 02:04:22 +10:00
Lincoln Stein
1d5d187ba1 model probe detects sdxl lora models 2023-08-04 11:44:56 -04:00
Sergey Borisov
1ac14a1e43 add sdxl lora support 2023-08-04 11:44:56 -04:00
Lincoln Stein
1deca89fde
Merge branch 'main' into feat/select-vram-in-config 2023-08-03 19:27:58 -04:00
Lincoln Stein
446fb4a438 blackify 2023-08-03 19:24:23 -04:00
Lincoln Stein
ab5d938a1d use variant instead of revision 2023-08-03 19:23:52 -04:00
Brandon
9942af756a
Merge branch 'main' into remove-onnx-model-check-from-pipeline-download 2023-08-03 10:10:51 -04:00
Kevin Turner
91ebf9f76e
Merge branch 'main' into refactor/model_manager_instantiate 2023-08-02 19:01:21 -07:00
Lincoln Stein
e080fd1e08 blackify 2023-08-03 11:25:20 +10:00
Lincoln Stein
eeef1e08f8 restore ability to convert merged inpaint .safetensors files 2023-08-03 11:25:20 +10:00
Kevin Turner
02d2cc758d
Merge branch 'main' into refactor/model_manager_instantiate 2023-08-02 17:11:23 -07:00
Damian Stewart
4e0949fa55 fix .swap() by reverting improperly merged @classmethod change 2023-08-03 10:00:43 +10:00
Lincoln Stein
ec48779080 blackify 2023-08-02 14:28:19 -04:00
Lincoln Stein
5de42be4a6 reduce VRAM cache default; take max RAM from system 2023-08-02 14:27:13 -04:00
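A hedged illustration of deriving the RAM cache default from total system memory, per the commit above; the half-of-RAM / 12 GB heuristic is an assumption, not the project's actual policy:

```python
import psutil


def default_ram_cache_gb() -> float:
    """Derive a default model-cache size (GB) from total system RAM."""
    total_gb = psutil.virtual_memory().total / 1024**3
    return round(min(total_gb / 2.0, 12.0), 1)  # heuristic: half of RAM, capped
```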
Lincoln Stein
29ac252501 blackify 2023-08-02 09:44:06 -04:00
Lincoln Stein
880727436c fix default vram cache size calculation 2023-08-02 09:43:52 -04:00
Lincoln Stein
77c5c18542 add slider for VRAM cache 2023-08-02 09:11:24 -04:00
Brandon Rising
ed76250dba Stop checking for unet/model.onnx when a model_index.json is detected 2023-08-02 07:21:21 -04:00
Lincoln Stein
4d22cafdad Installer should download fp16 models if user has specified 'auto' in config
- Closes #4127
2023-08-01 22:06:27 -04:00
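Illustrative only — how an installer might map the configured "auto" precision to a Hugging Face download variant; the helper name and mapping are assumptions:

```python
from typing import Optional

import torch


def hf_variant_for(precision: str = "auto") -> Optional[str]:
    """Map the configured precision to a Hugging Face download `variant`."""
    if precision == "auto":
        precision = "float16" if torch.cuda.is_available() else "float32"
    return "fp16" if precision == "float16" else None
```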
Kevin Turner
5998509888
Merge branch 'main' into refactor/model_manager_instantiate 2023-08-01 11:09:43 -07:00
Kent Keirsey
746afcd235
Merge branch 'main' into feat/onnx 2023-07-31 16:56:34 -04:00
Brandon Rising
aeac557c41 Run python black, point out that onnx is an alpha feature in the installer 2023-07-31 16:47:48 -04:00
Kevin Turner
bacdf985f1 doc(model_manager): docstrings 2023-07-31 09:16:32 -07:00
Brandon Rising
f5ac73b091 Merge branch 'main' into feat/onnx 2023-07-31 10:58:40 -04:00
psychedelicious
2c07f54b6e
Merge branch 'main' into fix-optional 2023-07-31 16:31:01 +10:00
Kevin Turner
adfd1e52f4 refactor(model_manager): avoid copy/paste logic 2023-07-30 11:53:12 -07:00
Kevin Turner
0e48c98330 Merge remote-tracking branch 'origin/main' into refactor/model_manager_instantiate
# Conflicts:
#	invokeai/backend/model_management/model_manager.py
2023-07-30 11:33:13 -07:00
Lincoln Stein
83d3f2347e fix "unrecognized arguments: --yes" bug on unattended upgrade 2023-07-30 11:07:06 -04:00
Alexandre Macabies
50e00feceb Add missing Optional on a few nullable fields. 2023-07-30 16:25:12 +02:00
Lincoln Stein
2537ff0280
Merge branch 'main' into bugfix/model-manager-rel-paths 2023-07-30 08:17:36 -04:00
Lincoln Stein
e20c4dc1e8 blackified 2023-07-30 08:17:10 -04:00
Lincoln Stein
ac84a9f915 reenable display of autoloaded models 2023-07-30 08:05:05 -04:00
Lincoln Stein
844578ab88 fix lora loading crash 2023-07-30 07:57:10 -04:00
Kevin Turner
ff1c40747e lint: formatting 2023-07-29 20:02:31 -07:00
Lincoln Stein
73f3b7f84b remove dangling comment 2023-07-29 17:32:33 -04:00
Lincoln Stein
348bee8981 blackified 2023-07-29 17:30:54 -04:00
Lincoln Stein
e82eb0b9fc add correct optional annotation to precision arg 2023-07-29 17:30:21 -04:00
Lincoln Stein
1de783b1ce fix mistake in indexing flat_ema_key 2023-07-29 17:20:26 -04:00
Lincoln Stein
3f9105be50 make convert script respect setting of use_ema in config file 2023-07-29 17:17:45 -04:00
Lincoln Stein
2a2d988928 convert script handles more ckpt variants 2023-07-29 15:28:39 -04:00
Kevin Turner
ccceb32a85 lint: formatting 2023-07-29 11:50:04 -07:00
Lincoln Stein
72c519c6ad fix incorrect key construction 2023-07-29 13:51:47 -04:00
Lincoln Stein
24b19166dd further refactoring 2023-07-29 13:13:22 -04:00
Lincoln Stein
0fb7328022 blackify code 2023-07-29 13:00:43 -04:00
Lincoln Stein
99daa97978 more refactoring; fixed place where rel conversion missed 2023-07-29 13:00:07 -04:00
Lincoln Stein
982a568349 blackify pr 2023-07-29 10:47:55 -04:00
Lincoln Stein
d79d5a4ff7 modest refactoring 2023-07-29 10:45:26 -04:00
Lincoln Stein
9968ff2893 fix relative model paths to be against config.models_path, not root 2023-07-29 10:30:27 -04:00
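A minimal sketch of that resolution rule, assuming a hypothetical resolve_model_path helper:

```python
from pathlib import Path


def resolve_model_path(stored_path: str, models_path: Path) -> Path:
    """Interpret relative models.yaml paths against config.models_path, not the root."""
    p = Path(stored_path)
    return p if p.is_absolute() else models_path / p
```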
Kevin Turner
86b8b69e88 internal(ModelManager): add instantiate method 2023-07-28 22:30:25 -07:00
Kevin Turner
bc9a5038fd refactor(ModelManager): factor out get_model_path 2023-07-28 22:29:36 -07:00
Kevin Turner
b163ae6a4d refactor(ModelManager): factor out get_model_config 2023-07-28 21:30:20 -07:00
Kevin Turner
dca685ac25 refactor(ModelManager): refactor rescan-on-miss to exists() method 2023-07-28 21:11:00 -07:00
Kevin Turner
e70bedba7d refactor(ModelManager): factor out _get_implementation method 2023-07-28 21:03:27 -07:00
Brandon Rising
390ce9f249 Fix onnx installer 2023-07-28 16:54:03 -04:00
Brandon Rising
a2aa66f43a Run Python black 2023-07-28 10:00:09 -04:00
Brandon Rising
da751da3dd Merge branch 'main' into feat/onnx 2023-07-28 09:59:35 -04:00
Brandon Rising
2b7b3dd4ba Run python black 2023-07-28 09:46:44 -04:00
Lincoln Stein
4c79350300 persist LoRA model info in models.yaml 2023-07-28 11:38:52 +10:00
Brandon Rising
1ea9ba84f5 Release session if applying ti or lora 2023-07-27 15:20:38 -04:00
Lincoln Stein
0d8f9cbe55 resolved conflicts with main 2023-07-27 15:11:25 -04:00
Brandon Rising
bfdc8c80f3 Testing caching onnx sessions 2023-07-27 14:13:29 -04:00
Brandon Rising
59716938bf Remove TensorRT support at the current time until we validate it works, remove time step recorder 2023-07-27 11:18:50 -04:00
Martin Kristiansen
218b6d0546 Apply black 2023-07-27 10:54:01 -04:00
Brandon Rising
f7bb4c3f05 Remove more files no longer needed in main 2023-07-27 10:49:43 -04:00
Brandon Rising
989d3d7f3c Remove onnx changes from canvas img2img, inpaint, and linear image2image 2023-07-27 10:08:45 -04:00
Brandon Rising
eb1ba8d74b Merge branch 'main' into feat/onnx 2023-07-27 09:54:30 -04:00
Brandon Rising
024f92f9a9 Add onnx models to the model manager UI 2023-07-27 09:37:37 -04:00
Lincoln Stein
b67041dd29
Merge branch 'main' into bugfix/convert-sdxl-models 2023-07-27 00:24:37 -04:00
Lincoln Stein
c02b9db064
Merge branch 'main' into bugfix/convert-sdxl-models 2023-07-27 00:08:15 -04:00
Lincoln Stein
2e19b23eed
Merge branch 'main' into feat/install-finetune-sdxl-vae 2023-07-27 00:06:00 -04:00
Lincoln Stein
61aff8540c fix refiner conversion 2023-07-27 00:02:10 -04:00
Lincoln Stein
2b7807e6a0
Merge branch 'main' into fix/yaml-file-delete 2023-07-26 23:45:43 -04:00
Lincoln Stein
77946bfea5 restore ability to convert SDXL checkpoints to diffusers 2023-07-26 23:28:58 -04:00
Lincoln Stein
d4d4d749f2 Merge branch 'release/invokeai-3-0-1' 2023-07-26 23:15:26 -04:00
Lincoln Stein
43fe8b1dda
Merge branch 'main' into fix/reduce-configure-vertical 2023-07-26 23:12:25 -04:00
Lincoln Stein
9c4acb9d3f install SDXL "fixed" VAE 2023-07-26 23:06:27 -04:00
Lincoln Stein
451b8c96e5 do not overwrite models.yaml if it is well formed 2023-07-26 22:29:39 -04:00
Lincoln Stein
83a981b585 merge with main; fix SDXL repo_ids 2023-07-26 17:38:06 -04:00
Lincoln Stein
049645d66e updated LICENSE files and added information about watermarking 2023-07-26 17:27:33 -04:00
Lincoln Stein
0100ac8f2d Merge branch 'main' into release/invokeai-3-0-1 2023-07-26 15:27:06 -04:00
Lincoln Stein
020031f376 add all legacy model .yaml files to configs directory unconditionally 2023-07-26 15:17:00 -04:00
Lincoln Stein
bf1f6619df fix conversion for sd1 and sd2 models 2023-07-26 15:02:32 -04:00
Brandon Rising
861c0fe76b Correct issues caused by merging main 2023-07-26 12:25:46 -04:00
Lincoln Stein
af8fc6ff82 final polish before release candidate
- Fix issue that prevented web ui from starting if
  ROOT/databases/invokeai.db not found.

- Rebuild front end
2023-07-26 10:59:23 -04:00
Brandon Rising
c16da75ac7 Merge branch 'main' into feat/onnx 2023-07-26 10:42:31 -04:00
Lincoln Stein
c7f883d22a
Merge branch 'main' into patch 2023-07-26 10:19:02 -04:00
Lincoln Stein
4bea846199
Merge branch 'main' into feat/safety-checker-node 2023-07-26 10:04:23 -04:00
Lincoln Stein
58c0bee325 improved error message for running configure 2023-07-26 08:30:01 -04:00
Lincoln Stein
b8f43f444a implemented startup sanity checks on core models 2023-07-26 08:26:29 -04:00
Lincoln Stein
da76f6fee4 compress height needed by configure script 2023-07-26 08:00:19 -04:00
Lincoln Stein
3e206d4d6a removed nsfw/watermark from invokeai.yaml 2023-07-26 06:53:35 -04:00
Lincoln Stein
85ad5ef204 refactored code; added watermark and nsfw facilities to app config route 2023-07-26 15:27:04 +10:00
Lincoln Stein
e32cd794f7 add safetychecker and watermark nodes 2023-07-26 15:26:45 +10:00
Lincoln Stein
dbc3d42afc install all recommended models with --yes; don't alter starter model screen 2023-07-25 22:24:03 -04:00
Lincoln Stein
2db9b3b2ae
Merge branch 'main' into patch 2023-07-25 16:27:10 -04:00
Lincoln Stein
e43e198102 rework configure/install TUI to require less space 2023-07-25 11:25:26 -04:00
Lincoln Stein
2aefa921fe fix "unknown model type" error when rebasing a model with API
- Add command-line model probing script for dev use
- Minor documentation tweak
2023-07-25 08:36:57 -04:00
Lincoln Stein
11e6ecc1bf
Merge branch 'main' into feat/controlnet-and-sdxl-convert 2023-07-25 08:05:17 -04:00
Lincoln Stein
fc4e104c61 tested on 3.11 and 3.10 2023-07-24 17:13:32 -04:00
blessedcoolant
d6bf6513ef
Merge branch 'main' into fix-types-2 2023-07-24 20:01:48 +12:00
camenduru
cbb90cbdbb
Download all model types. 2023-07-24 10:59:59 +03:00
blessedcoolant
0cf7a10c5c fix: Other lora missing type 2023-07-24 18:58:24 +12:00
Alexandre Macabies
0beec08d38 Add missing import. 2023-07-23 16:40:05 +02:00
Lincoln Stein
f2a6f0cf21 SDXL & SDXL-refiner models convert correctly 2023-07-23 09:31:14 -04:00
Alexandre Macabies
07a90c0198 Fix incorrect use of a singleton list.
This was found through pylance type errors. Go types!
2023-07-23 15:28:05 +02:00
Lincoln Stein
5e59edfaf1 SDXL checkpoint models now convert and load; needs refactor 2023-07-23 00:00:31 -04:00
Lincoln Stein
b1d7c9b306 save text_encoder_2 config, not whole model 2023-07-22 21:33:40 -04:00
Lincoln Stein
5607794dbb add support for controlnet & sdxl conversion - not fully working 2023-07-22 20:12:16 -04:00
Lincoln Stein
845d1524ad warn, do not crash, when duplicate models encountered 2023-07-21 15:00:55 -04:00
Brandon Rising
78750042f5 Pass in dim overrides 2023-07-21 12:16:24 -04:00
psychedelicious
3f79812dc6 fix: mps attention fix for sd2 2023-07-21 09:22:37 -04:00
Lincoln Stein
5962d96f27
Merge branch 'main' into fix/long_tensors_mps 2023-07-20 23:24:47 -04:00
Lincoln Stein
9370572169 prettify startup messages 2023-07-20 22:45:35 -04:00
Sergey Borisov
e6d890888c Replace SlicedAttnProcessor with a patched version that chunks memory consumption to less than 4 GB in each attention calculation pass 2023-07-21 04:08:49 +03:00
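The idea behind the patch, sketched with assumed names and an assumed slicing formula (not the actual processor code): pick enough slices that each attention-score chunk stays under the 4 GB budget.

```python
import math

import torch


def slices_for_budget(query: torch.Tensor, key: torch.Tensor, budget_bytes: int = 4 * 1024**3) -> int:
    """Number of slices needed so each attention-score chunk fits within the budget."""
    batch_heads, q_len, _ = query.shape
    k_len = key.shape[1]
    # A full attention-score tensor would be (batch_heads, q_len, k_len).
    full_bytes = batch_heads * q_len * k_len * query.element_size()
    return max(1, math.ceil(full_bytes / budget_bytes))
```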
Lincoln Stein
85ef3f51e7 extra check for empty hftoken 2023-07-20 15:16:06 -04:00
Brandon Rising
ba1a934297 Fix Lora typings 2023-07-20 14:02:23 -04:00
Brandon Rising
4e90376d11 Allow passing in of precision, use available providers if none provided 2023-07-20 13:15:45 -04:00
Lincoln Stein
7deafa838b merge with main 2023-07-20 11:45:54 -04:00
Lincoln Stein
b1a6ba552b reinitialize models.yaml if corrupt or missing 2023-07-20 11:26:20 -04:00
Lincoln Stein
89a15f78dd collapse all autoimport directories into a single folder 2023-07-20 09:01:49 -04:00
Lincoln Stein
cb29ac63a8 prevent crashes on quick install when hftoken not defined 2023-07-20 08:38:37 -04:00
user1
bab8b6d240 Removed diffusers_pipeline prepare_control_image() -- replaced with controlnet_utils.prepare_control_image()
Added resize_mode to ControlNetData class.
2023-07-20 00:41:49 -07:00
Lincoln Stein
12cae33dcd
fix inpaint model detection (#3843)
Co-authored-by: Lincoln Stein <lstein@gmail.com>
2023-07-20 12:57:14 +12:00
Brandon Rising
43b6a077fb io binding seems to be massively resource intensive compared to session.run 2023-07-19 17:42:28 -04:00
Lincoln Stein
a1251c8e04 fix inpaint model detection 2023-07-19 13:30:00 -04:00
Lincoln Stein
8439e30798
Merge branch 'main' into release/invokeai-3-0-beta 2023-07-19 12:09:32 -04:00
Lincoln Stein
9c3a556813
Merge branch 'main' into fix/transformers_4_31_0 2023-07-19 09:35:52 -04:00
Lincoln Stein
0b6ef7eb7d make the convert VAE available to model manager for use in UI 2023-07-19 09:05:24 -04:00
Brandon Rising
e8299d0abb Comment out erroneously removed del statement, comment out opt tests 2023-07-18 23:23:34 -04:00
Sergey Borisov
2e7fc055c4 Support both pre and post 4.31.0 transformers 2023-07-19 06:15:17 +03:00
Brandon Rising
f4e52fafac Fix as part of merging main in 2023-07-18 23:05:33 -04:00
Lincoln Stein
0f7e329e76 restore access token-saving code 2023-07-18 22:58:56 -04:00
Brandon Rising
ee7b36cea5 Merge branch 'main' into onnx-testing 2023-07-18 22:56:41 -04:00
Lincoln Stein
a690cca5b5 make convert work with both 4.30.2 and 4.31.0 2023-07-18 22:18:13 -04:00
Brandon Rising
e201ad2f51 Switch to io_binding for run, testing different session options 2023-07-18 21:54:54 -04:00
Sergey Borisov
0aa7193d3b Load text_model.embeddings.position_ids outside state_dict 2023-07-19 04:18:43 +03:00
Lincoln Stein
2fbf245c3d Merge branch 'main' into release/invokeai-3-0-beta
-- this adds the upscaling support
2023-07-18 21:17:15 -04:00
Lincoln Stein
39c14eb2ac fix pretrained model download to work with xl 2023-07-18 21:10:33 -04:00
Lincoln Stein
893e199677
Merge branch 'main' into feat/ui/upscale 2023-07-18 19:18:55 -04:00
blessedcoolant
186e98da5e
Merge branch 'main' into fix/mem_cleanup 2023-07-19 10:10:32 +12:00
Eugene Brodsky
dea9a5da7a
Avoid crash if unable to modify the model config file (#3824)
* fix whitespace; remove invisible characters
* log error and proceed if unable to modify the model config
2023-07-18 16:33:19 -04:00
Sergey Borisov
bda0000acd Clean up VRAM after model offloading; tweak to clean up local variable references on RAM offload 2023-07-18 23:21:18 +03:00
Brandon Rising
35d5ef9118 Emit step completions 2023-07-18 12:35:07 -04:00
StAlKeR7779
889b77d3d6
Merge branch 'main' into save_vram 2023-07-18 16:55:48 +03:00
Sergey Borisov
bc11296a5e Disable lazy offloading when the VRAM cache is disabled, move resulting tensors to the CPU (so VRAM tensors don't stack up in the cache), and fix the text encoder not being freed (detach) 2023-07-18 16:20:25 +03:00
psychedelicious
42c440c73f
Merge branch 'main' into feat/ui/upscale 2023-07-18 22:08:02 +10:00
Lincoln Stein
9c3c393b84 merge with main 2023-07-18 07:00:55 -04:00
psychedelicious
56098f370c feat(nodes): add RealESRGAN_x2plus.pth, update upscale nodes
- add `RealESRGAN_x2plus.pth` model to installer
- add `RealESRGAN_x2plus.pth` to `realesrgan` node
- rename `RealESRGAN` to `ESRGAN` in nodes
- make `scale_factor` optional in `img_scale` node
2023-07-18 14:55:18 +10:00
Lincoln Stein
1353bf98b3 add specific exception for model probe failures 2023-07-17 23:08:39 -04:00
Lincoln Stein
025cda3815 fix 424 error on model import 2023-07-17 22:21:11 -04:00
blessedcoolant
13da881953 Merge branch 'main' into sdxl-support 2023-07-18 13:34:07 +12:00
blessedcoolant
ec3c15ead0 Merge branch 'main' into mm-ui 2023-07-18 12:58:57 +12:00
Sergey Borisov
0fce35c54c Cleanup, fix variable name, fix controlnet for sequential and cross attention guidance 2023-07-17 23:53:50 +03:00
Brandon Rising
bcce70fca6 Testing different session opts, added timings for testing 2023-07-17 16:27:33 -04:00
Sergey Borisov
1c680a7147 Fix - encoder_attention_mask was not previously passed to the unet, and even when passed it would break the sequential guidance run, so the logic was rewritten 2023-07-17 23:13:37 +03:00
Lincoln Stein
0ea8d3c30c prevent crash on rename operation on models in models directory 2023-07-17 07:50:06 -04:00
Lincoln Stein
84a13ff8e1 Merge branch 'mm-ui' of github.com:blessedcoolant/InvokeAI into mm-ui 2023-07-17 07:29:35 -04:00
Lincoln Stein
3fba262c94 expose paths as absolute to web api 2023-07-17 07:29:26 -04:00
Lincoln Stein
107ca6bf47 expose model paths as absolute to web models API 2023-07-17 07:26:05 -04:00
blessedcoolant
cbfd1d1b27
Merge branch 'main' into feat/standalone_diffusers_ti 2023-07-17 22:01:52 +12:00
blessedcoolant
aebd595607 Merge branch 'main' into mm-ui 2023-07-17 13:49:25 +12:00
Lincoln Stein
ed88e72412 correct cannot assign to field 'unconditioned_embeddings' error 2023-07-16 21:06:40 -04:00
Sergey Borisov
6aefd8600a Fix error with long prompts when controlnet used 2023-07-16 21:06:40 -04:00
Lincoln Stein
ccb43d5a91 make check for 2.3 root directory more stringent 2023-07-16 20:43:15 -04:00
Kent Keirsey
675a92401c
Merge branch 'main' into lstein/default-model-install 2023-07-16 19:32:59 -04:00
Sergey Borisov
b61c83e836 Allow bin extension to detect diffusers-ti provided as file 2023-07-17 00:32:17 +03:00
Lincoln Stein
2bc3e36bc0 add missing exception name 2023-07-16 16:14:28 -04:00
Lincoln Stein
6fbb5ce780 add renaming capabilities to model update API route 2023-07-16 14:17:05 -04:00
Brandon Rising
932112b640 testing being super wasteful with data 2023-07-16 00:17:33 -04:00
Lincoln Stein
b767b5d44c user must adjust terminal size on Windows 2023-07-15 23:19:50 -04:00
Lincoln Stein
e95cb3aa71 Merge branch 'lstein/default-model-install' into release/invokeai-3-0-beta 2023-07-15 20:16:51 -04:00
Lincoln Stein
5b5d5ec978
Merge branch 'main' into sdxl-support 2023-07-15 19:49:57 -04:00
Lincoln Stein
ccbfa5d862 resolve conflicts 2023-07-15 19:47:50 -04:00
Lincoln Stein
7fa394912d
Merge branch 'main' into lstein/default-model-install 2023-07-15 18:26:35 -04:00
Lincoln Stein
6b0a158ffa Merge branch 'main' into lstein/default-model-install 2023-07-15 18:23:34 -04:00
Lincoln Stein
c90345d6a3 deprecate the face restoration option 2023-07-15 18:23:32 -04:00
Lincoln Stein
f66ead0819
Merge branch 'main' into update-textual-inversion-training 2023-07-15 17:44:45 -04:00
Kent Keirsey
77b0129b4c
Merge branch 'main' into lstein/migrate-fix 2023-07-15 10:37:56 -04:00
Lincoln Stein
e01706f5f5 add fp16 support to controlnet models 2023-07-15 10:37:11 -04:00
Lincoln Stein
a111539059 migrate script now initializes destination root if needed 2023-07-15 09:59:34 -04:00
Lincoln Stein
32e7e52d69
Merge branch 'main' into lstein/default-model-install 2023-07-15 08:30:22 -04:00
psychedelicious
d234bf1cb9 feat(install): display full ESRGAN model filenames during installation 2023-07-15 21:27:10 +10:00
blessedcoolant
808b2de709
Merge branch 'main' into lstein/model-manager-route-enhancements 2023-07-15 16:56:54 +12:00
Lincoln Stein
2faa7cee37 add rename_model route 2023-07-14 23:03:18 -04:00
Brandon
467414f214
Merge branch 'main' into update-textual-inversion-training 2023-07-14 22:32:09 -04:00
Brandon Rising
f88a338be0 Setup textual inversion training with new model manager 2023-07-14 21:58:51 -04:00
Lincoln Stein
b306247eb5 remove clipseg model install 2023-07-14 20:39:42 -04:00
Lincoln Stein
a45f7ce355 add --list-models command 2023-07-14 19:52:47 -04:00
Lincoln Stein
eb9d74653d set default models for realesrgan, controlnet and text inversion 2023-07-14 19:03:41 -04:00
Sergey Borisov
7093e5d033 Pad conditionings using zeros and encoder_attention_mask 2023-07-15 00:52:54 +03:00
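A rough sketch of the padding approach named above, with assumed function and argument names: zero-pad the shorter conditioning and record which positions are real in an encoder_attention_mask.

```python
from typing import Tuple

import torch


def pad_conditioning(cond: torch.Tensor, target_len: int) -> Tuple[torch.Tensor, torch.Tensor]:
    """Zero-pad (batch, seq, dim) conditioning to target_len and return it with its mask."""
    batch, seq, dim = cond.shape
    mask = torch.ones(batch, target_len, dtype=torch.bool, device=cond.device)
    if seq < target_len:
        pad = torch.zeros(batch, target_len - seq, dim, dtype=cond.dtype, device=cond.device)
        cond = torch.cat([cond, pad], dim=1)
        mask[:, seq:] = False  # padded positions are masked out
    return cond, mask
```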
Brandon Rising
bd7b59910d Testing onnx in new ui updates 2023-07-14 14:24:15 -04:00
Lincoln Stein
e71ce83e9c
Merge branch 'main' into lstein/model-manager-route-enhancements 2023-07-14 13:52:55 -04:00
Lincoln Stein
8600aad12b multiple enhancements to model manager REACT API
1. add a /sync route for synchronizing the in-memory model lists to
   models.yaml, the models directory, and the autoimport directories.

2. add optional destination_directories to convert_model and merge_model
   operations.

3. add /ckpt_confs route for retrieving known legacy checkpoint configuration
   files.

4. add /search route for finding all models in a directory located in the server
   filesystem
2023-07-14 13:45:16 -04:00
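A schematic FastAPI router in the spirit of the routes listed above; the paths, signatures, and placeholder bodies are assumptions for illustration, not the project's actual endpoints.

```python
from pathlib import Path
from typing import List

from fastapi import APIRouter

router = APIRouter(prefix="/api/v1/models")


@router.post("/sync")
async def sync_models() -> bool:
    """Re-sync the in-memory model list with models.yaml and the autoimport dirs."""
    return True  # placeholder body


@router.get("/ckpt_confs")
async def list_ckpt_configs() -> List[str]:
    """Return the known legacy checkpoint configuration (.yaml) files."""
    return []  # placeholder body


@router.get("/search")
async def search_for_models(search_path: str) -> List[str]:
    """Find model files below a directory on the server filesystem."""
    return [str(p) for p in Path(search_path).rglob("*.safetensors")]
```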
Lincoln Stein
ad076b1174 add model directory search route 2023-07-14 11:14:33 -04:00
blessedcoolant
16e93c6455 Merge branch 'main' into mm-ui 2023-07-14 15:46:53 +12:00
Brandon Rising
524888bf3b Merge branch 'main' into feat/onnx 2023-07-13 14:23:57 -04:00
blessedcoolant
9348dc8e0d
Merge branch 'main' into fix/controlnet_cfg_inj_cond 2023-07-14 01:11:04 +12:00
blessedcoolant
5a546e66f1 Merge branch 'main' into fix-inpainting 2023-07-13 20:42:13 +12:00
psychedelicious
eb0d55263b fix(mm): make model config attribute names consistent
Our model fields use `model_name`, but the API response uses `name`. Some places use `model_type`, but the API response uses `type`.

Changed the API response to provide `model_name` and `model_type`, which simplifies how we manage models on the client substantially.
2023-07-13 15:40:05 +10:00
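Sketch of the resulting naming convention as a plain dataclass; the fields beyond the two named in the commit are assumptions.

```python
from dataclasses import dataclass


@dataclass
class ModelConfigResponse:
    model_name: str      # previously returned as `name`
    model_type: str      # previously returned as `type`
    base_model: str = ""   # assumed extra field for illustration
    description: str = ""  # assumed extra field for illustration
```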
blessedcoolant
7e3b9f1320 fix: Inpaint not working with some schedulers
Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
2023-07-13 15:06:03 +12:00
blessedcoolant
71e34ac256 Merge branch 'main' into mm-ui 2023-07-13 12:48:43 +12:00
Sergey Borisov
67c8cf4bc2 Controlnet model detection 2023-07-12 08:50:19 -04:00
Sergey Borisov
a328986b43 Less naive model detection 2023-07-12 08:50:19 -04:00
blessedcoolant
5a6ad99d4e feat: Restore Delete Model Functionality 2023-07-12 16:39:07 +12:00
StAlKeR7779
b8a9b499df
Merge branch 'main' into fix/controlnet_cfg_inj_cond 2023-07-11 23:43:47 +03:00
Lincoln Stein
25591788c1 fix conflicts 2023-07-11 15:55:10 -04:00
Lincoln Stein
dab03fb646 rename gpu_mem_reserved to max_vram_cache_size
To be consistent with max_cache_size, the amount of memory to hold in
VRAM for model caching is now controlled by the max_vram_cache_size
configuration parameter.
2023-07-11 15:25:39 -04:00
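Illustrative settings sketch for the renamed parameter; the default values shown here are assumptions.

```python
from dataclasses import dataclass


@dataclass
class CacheSettings:
    max_cache_size: float = 6.0        # GB of RAM used for model caching
    max_vram_cache_size: float = 2.75  # GB of VRAM kept resident (formerly gpu_mem_reserved)
```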
Lincoln Stein
d32f9f7cb0 reverse logic of gpu_mem_reserved
- gpu_mem_reserved now indicates the amount of VRAM that will be reserved
  for model caching (similar to max_cache_size).
2023-07-11 15:16:40 -04:00
Lincoln Stein
6b93c1451f do not crash when probing an unknown model type 2023-07-11 10:56:47 -04:00
Lincoln Stein
bf2b5b5cd4 improvements to sdxl support in model manager
- Move SDXL-related models to models/sdxl.py
- Create separate base type BaseModelType.StableDiffusionXLRefiner for the refiner
  models.
2023-07-09 20:42:03 -04:00
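A sketch of the separate base types: the two SDXL member names follow the commit message, while the other members and string values are assumptions.

```python
from enum import Enum


class BaseModelType(str, Enum):
    StableDiffusion1 = "sd-1"
    StableDiffusion2 = "sd-2"
    StableDiffusionXL = "sdxl"
    StableDiffusionXLRefiner = "sdxl-refiner"  # refiner models get their own base type
```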
Lincoln Stein
5759a390f9 introduce gpu_mem_reserved configuration parameter 2023-07-09 18:35:04 -04:00
Lincoln Stein
130249a2dd add model loading support for SDXL 2023-07-09 15:47:06 -04:00
Lincoln Stein
8d7dba937d fix undefined variable 2023-07-09 14:37:45 -04:00
Lincoln Stein
d6cb0e54b3 don't unload models from GPU until the space is needed 2023-07-09 14:26:30 -04:00
Lincoln Stein
5f7435955e if models.yaml doesn't exist, rebuild it 2023-07-08 15:13:51 -04:00
Lincoln Stein
69ef1e1e56 speculative change to upgrade script 2023-07-08 11:45:26 -04:00
Lincoln Stein
10d3bccf32
Mac MPS FP16 fixes (#3641)
This PR is to allow FP16 precision to work on Macs with MPS. In
addition, it centralizes the torch fixes/workarounds required for MPS
into a new backend utility `mps_fixes.py`. This is conditionally
imported in `api_app.py`/`cli_app.py`.

Many MANY thanks to @StAlKeR7779 for patiently working to debug and fix
these issues.
2023-07-07 17:43:23 -04:00
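A hedged sketch of the conditional-import pattern described in the PR; the module path is an assumption, and the guard keeps it harmless on non-MPS machines.

```python
import torch

if getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
    try:
        from invokeai.backend.util import mps_fixes  # noqa: F401  (applies patches on import)
    except ImportError:
        pass  # module path is assumed here; the actual layout may differ
```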
Eugene Brodsky
97b2ec58e2
Merge branch 'main' into release/invokeai-3-0-alpha 2023-07-07 14:18:12 -04:00
Lincoln Stein
56f4712814 fix checkpoint VAE handling in migrate script 2023-07-07 09:34:42 -04:00
Lincoln Stein
9f58ed35cf improve user migration experience
- No longer fail root directory probing if invokeai.yaml is missing
  (test is now whether a `models/core` directory exists).
- Migrate script does not overwrite previously-installed models.
- Can run migrate script on an existing 2.3 version directory
  with --from and --to pointing to same 2.3 root.
2023-07-07 08:18:46 -04:00
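A minimal sketch of the relaxed root-directory test described above (presence of models/core instead of invokeai.yaml), with an assumed helper name:

```python
from pathlib import Path


def looks_like_invokeai_root(root: Path) -> bool:
    """Accept a root if models/core exists, even when invokeai.yaml is missing."""
    return (root / "models" / "core").is_dir()
```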
blessedcoolant
7aa918677e Merge branch 'main' into feat/clip_skip 2023-07-07 16:21:53 +12:00
Lincoln Stein
54f3686e3b merge with main, fix conflicts 2023-07-06 15:21:45 -04:00
Lincoln Stein
e9352227f3 add merge api 2023-07-06 15:12:34 -04:00
blessedcoolant
bc5371eeee
Merge branch 'main' into feat/clip_skip 2023-07-07 06:03:39 +12:00
Lincoln Stein
e573a533ae remove redundant import 2023-07-06 13:24:58 -04:00
Lincoln Stein
581be42c75
Merge branch 'main' into lstein/model-manager-router-api 2023-07-06 13:20:36 -04:00
Lincoln Stein
90c66aab3d merge with upstream 2023-07-06 13:17:02 -04:00
Lincoln Stein
3e925fbf34 model merging API ready for testing 2023-07-06 13:15:15 -04:00
Lincoln Stein
ec7c2f07c6 model merge backend, CLI and TUI working 2023-07-06 12:21:42 -04:00
blessedcoolant
b229fe19aa
Merge branch 'main' into lstein/configure-max-cache-size 2023-07-07 01:52:12 +12:00
Sergey Borisov
04b57c408f Add clip skip option to prompt node 2023-07-06 16:09:40 +03:00
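Illustration of what "clip skip" means here: take hidden states from an earlier CLIP text-encoder layer instead of the last one. A sketch with assumed argument names, not the project's prompt node.

```python
import torch
from transformers import CLIPTextModel, CLIPTokenizer


def encode_prompt(
    prompt: str,
    tokenizer: CLIPTokenizer,
    text_encoder: CLIPTextModel,
    clip_skip: int = 0,
) -> torch.Tensor:
    """Encode a prompt, optionally skipping the last `clip_skip` encoder layers."""
    tokens = tokenizer(prompt, padding="max_length", truncation=True, return_tensors="pt")
    output = text_encoder(**tokens, output_hidden_states=True)
    if clip_skip > 0:
        hidden = output.hidden_states[-(clip_skip + 1)]
        return text_encoder.text_model.final_layer_norm(hidden)
    return output.last_hidden_state
```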
blessedcoolant
6f1268e2b1
Merge branch 'main' into lstein/more-model-loading-fixes 2023-07-07 00:32:22 +12:00
Lincoln Stein
8f5fcb188c
Merge branch 'main' into lstein/model-manager-router-api 2023-07-05 23:16:43 -04:00
Lincoln Stein
f7daa6e71d all methods now return OPENAPI_MODEL_CONFIGS; convert uses PUT 2023-07-05 23:13:01 -04:00
Lincoln Stein
3691b55565 fix autoimport crash 2023-07-05 21:53:08 -04:00
Lincoln Stein
f610045a14
Merge branch 'main' into mps-fp16-fixes 2023-07-05 21:01:48 -04:00
Lincoln Stein
a7cbcae176 expose max_cache_size to invokeai-configure interface 2023-07-05 20:59:57 -04:00
Lincoln Stein
0a6dccd607 expose max_cache_size to invokeai-configure interface 2023-07-05 20:59:14 -04:00
Lincoln Stein
43c51ff157
Merge branch 'main' into lstein/more-model-loading-fixes 2023-07-05 20:48:15 -04:00
Lincoln Stein
cfa3b2419c partial implementation of merge 2023-07-05 20:25:47 -04:00
Lincoln Stein
d4550b3059 clean up lint errors in lora.py 2023-07-05 19:18:25 -04:00
Lincoln Stein
83d3a043da merge latest changes from main 2023-07-05 19:15:53 -04:00
gogurtenjoyer
169ff6368b
Update mps_fixes.py - additional torch op for nodes
This fixes scaling in the nodes UI.
2023-07-05 17:47:23 -04:00
Lincoln Stein
71dad6d404
Merge branch 'main' into ti-ui 2023-07-05 16:57:31 -04:00
Lincoln Stein
685a47cc7d fix crash during lora application 2023-07-05 16:40:47 -04:00
Lincoln Stein
cb947bcbf0
Merge branch 'main' into lstein/fix-migrate3-textencoder 2023-07-05 16:23:00 -04:00
Lincoln Stein
f8bbec8572 Recognize and load diffusers-style LoRAs (.bin)
Prevent double-reporting of autoimported models
- closes #3636

Allow autoimport of diffusers-style LoRA models
- closes #3637
2023-07-05 16:21:23 -04:00
Lincoln Stein
863336acbb Recognize and load diffusers-style LoRAs (.bin)
Prevent double-reporting of autoimported models
- closes #3636

Allow autoimport of diffusers-style LoRA models
- closes #3637
2023-07-05 16:19:16 -04:00
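A rough sketch of extension-based recognition covering diffusers-style .bin LoRAs, as in the two commits above; real probing also inspects the state dict, so treat this as illustrative only.

```python
from pathlib import Path

LORA_EXTENSIONS = {".safetensors", ".pt", ".bin"}  # .bin covers diffusers-style LoRAs


def is_candidate_lora_file(path: Path) -> bool:
    """Cheap first pass: does the file extension look like a LoRA weight file?"""
    return path.is_file() and path.suffix.lower() in LORA_EXTENSIONS
```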
Lincoln Stein
90ae8ce26a prevent model install crash "torch needs to be restarted with spawn" 2023-07-05 16:18:20 -04:00
Lincoln Stein
ad5d90aca8 prevent model install crash "torch needs to be restarted with spawn" 2023-07-05 15:38:07 -04:00
Lincoln Stein
5b6dd47b9f add API for model convert 2023-07-05 15:13:21 -04:00
Lincoln Stein
5027d0a603 accept @psychedelicious suggestions above 2023-07-05 14:50:57 -04:00
Lincoln Stein
9f9ce08e44
Merge branch 'main' into lstein/remove-hardcoded-cuda-device 2023-07-05 13:38:33 -04:00
Lincoln Stein
021e1eca8e
Merge branch 'main' into mps-fp16-fixes 2023-07-05 13:19:52 -04:00
Lincoln Stein
5fe722900d allow clip-vit-large-patch14 text encoder to coexist with tokenizer in same directory 2023-07-05 13:15:08 -04:00
Lincoln Stein
cf173b522b allow clip-vit-large-patch14 text encoder to coexist with tokenizer in same directory 2023-07-05 13:14:41 -04:00
blessedcoolant
9e2d63ef97
Merge branch 'main' into fix/ckpt_convert_scan 2023-07-06 05:01:34 +12:00
Sergey Borisov
0ac9dca926 Fix loading diffusers ti 2023-07-05 19:46:00 +03:00
Lincoln Stein
bd82c4ace0 model installer confirms deletion of models 2023-07-05 09:57:23 -04:00
Lincoln Stein
9edf78dd2e merge with main 2023-07-05 09:12:54 -04:00
Lincoln Stein
6112197edf convert implemented; need router 2023-07-05 09:05:05 -04:00
gogurtenjoyer
ba7345deb4
Merge branch 'main' into mps-fp16-fixes 2023-07-05 07:38:41 -04:00
Sergey Borisov
ee042ab76d Fix ckpt scanning on conversion 2023-07-05 14:18:30 +03:00
blessedcoolant
780e77d2ae
Merge branch 'main' into fix/clip_path 2023-07-05 22:45:52 +12:00
Sergey Borisov
e3fc1b3816 Fix clip path in migrate script 2023-07-05 13:43:09 +03:00
Lincoln Stein
307a01d604 when migrating models, changes / to _ in model names to avoid breaking model name keys 2023-07-05 20:27:03 +10:00
Sergey Borisov
2beb8f049e Fix model detection 2023-07-05 09:43:46 +03:00
blessedcoolant
639d88afd6 revert: inference_mode to no_grad 2023-07-05 16:39:15 +12:00
blessedcoolant
c0501ed5c2 fix: Slow loading of Loras
Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
2023-07-05 12:47:34 +10:00
gogurtenjoyer
233869b56a Mac MPS FP16 fixes
This PR is to allow FP16 precision to work on Macs with MPS. In addition, it centralizes the torch fixes/workarounds
required for MPS into a new backend utility file `mps_fixes.py`. This is conditionally imported in `api_app.py`/`cli_app.py`.

Many MANY thanks to StAlKeR7779 for patiently working to debug and fix these issues.
2023-07-04 18:10:53 -04:00
Lincoln Stein
5d099f4a49 update_model working 2023-07-04 17:26:57 -04:00
Lincoln Stein
752b4d50cf model_delete method now working 2023-07-04 10:40:32 -04:00
Lincoln Stein
c1c49d9a76 import model returns 404 for invalid path, 409 for duplicate model 2023-07-04 10:08:10 -04:00
Lincoln Stein
96bf92ead4 add the import model router 2023-07-04 14:35:47 +10:00
Lincoln Stein
fc419546bc
Merge branch 'main' into lstein/remove-hardcoded-cuda-device 2023-07-03 14:10:47 -04:00
Lincoln Stein
cfd09214d3
Merge branch 'main' into lstein/fix-vae-conversion-crash 2023-07-03 14:03:13 -04:00
Lincoln Stein
b128ba81db
Merge branch 'main' into lstein/remove-hardcoded-cuda-device 2023-07-03 13:58:14 -04:00
Lincoln Stein
d6de11bd56 resolve merge conflict 2023-07-03 12:19:11 -04:00
Lincoln Stein
ed86d0b708 Union[foo, None]=>Optional[foo] 2023-07-03 12:17:45 -04:00
Lincoln Stein
fb2b2a371d Merge branch 'lstein/fix-vae-conversion-crash' into release/invokeai-3-0-alpha 2023-07-03 11:21:16 -04:00
Lincoln Stein
10d513c5f7 add runtime root path to relative vaes and other submodels 2023-07-03 11:19:33 -04:00
Lincoln Stein
877b187a1b Merge branch 'lstein/restore-3.9-compatibility' into release/invokeai-3-0-alpha 2023-07-03 11:01:34 -04:00
Lincoln Stein
ac9ec4e75a restore 3.9 compatibility by replacing | with Union[] 2023-07-03 10:57:40 -04:00
Lincoln Stein
2465c7987b Revert "restore 3.9 compatibility by replacing | with Union[]"
This reverts commit 76bafeb99e.
2023-07-03 10:56:41 -04:00
Lincoln Stein
76bafeb99e restore 3.9 compatibility by replacing | with Union[] 2023-07-03 10:55:04 -04:00
Lincoln Stein
6935858ef3 add debugging messages to aid in memory leak tracking 2023-07-02 13:34:53 -04:00
Lincoln Stein
fa1f9939cc adjust invokeai-configure TUI vertical height to show NEXT button on Mac 2023-07-02 09:44:16 -04:00
Lincoln Stein
2d314d2b3d another fix to repo_id loading 2023-07-02 09:18:11 -04:00
Lincoln Stein
b2775d6b4c Merge branch 'lstein/recognize-legacy-sampler-names' into release/invokeai-3-0-alpha 2023-07-01 21:45:39 -04:00
Lincoln Stein
06694d465d add missing k-* legacy sampler names to init file migrate list 2023-07-01 21:45:14 -04:00
Lincoln Stein
3c2ce51f10 Merge branch 'lstein/remove-hardcoded-cuda-device' into release/invokeai-3-0-alpha 2023-07-01 21:15:58 -04:00
Lincoln Stein
0f02915012 remove hardcoded cuda device in model manager init 2023-07-01 21:15:42 -04:00
Lincoln Stein
0016236889 Merge branch 'lstein/fix-imported-model-names' into release/invokeai-3-0-alpha 2023-07-01 21:09:29 -04:00
Lincoln Stein
f4bd5bb986 when migrating models, changes / to _ in model names to avoid breaking model name keys 2023-07-01 21:08:59 -04:00
Lincoln Stein
5de820f2dc fix updater and model installer 2023-07-01 20:13:28 -04:00
Lincoln Stein
41a8f155ed
Merge branch 'main' into fix/controlnet_cfg_inj_cond 2023-07-01 14:36:09 -04:00
Lincoln Stein
f1928d2588 prevent crashes on malformed models 2023-07-01 14:32:58 -04:00
blessedcoolant
c74bb5cdbf
Merge branch 'main' into lstein/fix-vae-convert 2023-07-01 11:18:21 +12:00
Lincoln Stein
1347fc2f00 fix incorrect VAE config file path during conversion of ckpts 2023-06-30 19:14:06 -04:00
blessedcoolant
5be1e71d1b
Merge branch 'main' into lstein/fix-model-scan-on-rel-root 2023-06-29 17:54:12 +12:00
mickr777
30a917f70c
Fix Typo in migrate_to_3.py 2023-06-29 14:45:55 +10:00
Lincoln Stein
ace4f6d586 fix duplicate model key addition when root directory is a relative path 2023-06-28 17:02:03 -04:00
Lincoln Stein
20fbe81395
Merge branch 'main' into fix/controlnet_cfg_inj_cond 2023-06-28 15:44:50 -04:00
StAlKeR7779
ac46b129bf
Merge branch 'main' into feat/lora_model_patch 2023-06-28 22:43:58 +03:00
Lincoln Stein
79fc708580 warn but do not crash when model scan finds random cruft in models directory 2023-06-28 15:26:42 -04:00
Lincoln Stein
e8ed0fad6c autoimport from embedding/controlnet/lora folders designated in startup file 2023-06-27 12:30:53 -04:00
Sergey Borisov
dc1f220b3e Fix wrong conditioning used 2023-06-27 01:18:15 +03:00
Lincoln Stein
044fe6bb20 remove dangling debug statement 2023-06-26 17:48:06 -04:00
Lincoln Stein
823e098b7c prompt user for prediction type when autoimporting a v2 model without .yaml file
don't ask the user for the prediction type when a config .yaml is provided
2023-06-26 16:30:34 -04:00
Lincoln Stein
011adfc958 merge with main 2023-06-26 13:53:59 -04:00
Lincoln Stein
befd95eb19 rename root_dir to root_path attributes to emphasize return of a Path 2023-06-26 13:52:25 -04:00
Lincoln Stein
a2ddb3823b fix add_model() logic 2023-06-26 13:33:38 -04:00
Eugene Brodsky
7b97639961
Merge branch 'main' into lstein/installer-for-new-model-layout 2023-06-26 01:24:30 -04:00
Sergey Borisov
91c3a58fb6 Fix lycoris layers init 2023-06-26 04:33:37 +03:00
Sergey Borisov
5cebf67ee4 Apply lora by patching lora instead of hooks 2023-06-26 03:57:33 +03:00
Sergey Borisov
1ba94a92b3 Fixes 2023-06-26 03:54:42 +03:00
Sergey Borisov
23c22ac933 Refactor logic/small fixes 2023-06-26 03:07:54 +03:00
Lincoln Stein
160b5d7992 add support for an autoimport models directory scanned at startup time 2023-06-25 18:50:15 -04:00
Lincoln Stein
c91d1eacba Merge branch 'lstein/installer-for-new-model-layout' of github.com:invoke-ai/InvokeAI into lstein/installer-for-new-model-layout 2023-06-25 16:04:48 -04:00
Lincoln Stein
60b37b7ff4 fix model manager documentation 2023-06-25 16:04:43 -04:00
Sergey Borisov
a3c22b5fe6 Remove upcast_attention and prediction_type from stable diffusion model logic, fix ckpt conversion according to this 2023-06-25 21:06:22 +03:00
user1
c5faffc18b Merge branch 'main' of github.com:invoke-ai/InvokeAI into feat/controlnet-control-modes
Only "real" conflicts were in:
     invokeai/frontend/web/src/features/controlNet/components/ControlNet.tsx
     invokeai/frontend/web/src/features/controlNet/store/controlNetSlice.ts
2023-06-24 17:05:57 -07:00
Lincoln Stein
c3c4a71173 implemented Stalker's suggested improvements 2023-06-24 12:37:26 -04:00
Lincoln Stein
ba1371a88f rename ModelType.Pipeline to ModelType.Main 2023-06-24 11:45:49 -04:00
Lincoln Stein
539d1f3bde remove redundant prediction_type and attention_upscaling flags 2023-06-23 16:54:52 -04:00
Lincoln Stein
466ec3ab5e add router API support for model manager `heuristic_import()` 2023-06-23 16:35:39 -04:00
Lincoln Stein
54b74427f4 adjust for change in list_models() API 2023-06-23 14:13:37 -04:00
Lincoln Stein
58d1857ab6 merge with main 2023-06-23 13:57:25 -04:00
Lincoln Stein
3043af4620 implement vae passthru 2023-06-23 13:56:30 -04:00
Lincoln Stein
56bd873d7a make relative model paths work in model manager 2023-06-23 10:52:59 -04:00
Sergey Borisov
5aaaaf64a1 Fix ckpt conversion 2023-06-23 17:29:54 +03:00
StAlKeR7779
9140e2c0f2
Merge branch 'main' into fix/vae_conversion 2023-06-23 15:03:59 +03:00
Lincoln Stein
a910403003 correctly migrate models that have relative paths 2023-06-22 21:10:31 -04:00