InvokeAI/invokeai/backend/model_manager
Latest commit a3cb5da130 by Lincoln Stein:
Improve RAM<->VRAM memory copy performance in LoRA patching and elsewhere (#6490)
* allow model patcher to optimize away the unpatching step when feasible

* remove lazy_offloading functionality

* do not save original weights if there is a CPU copy of state dict

* Update invokeai/backend/model_manager/load/load_base.py

Co-authored-by: Ryan Dick <ryanjdick3@gmail.com>

* documentation fixes requested during penultimate review

* add non-blocking=True parameters to several torch.nn.Module.to() calls, for slight performance increases

* fix ruff errors

* prevent crash on non-cuda-enabled systems

---------

Co-authored-by: Lincoln Stein <lstein@gmail.com>
Co-authored-by: Kent Keirsey <31807370+hipsterusername@users.noreply.github.com>
Co-authored-by: Ryan Dick <ryanjdick3@gmail.com>
Committed 2024-06-13 17:10:03 +00:00
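
The bullets "do not save original weights if there is a CPU copy of state dict" and "allow model patcher to optimize away the unpatching step when feasible" boil down to this idea: when a RAM copy of the unpatched state dict is already cached, the patcher does not need to back up weights before patching, and unpatching can be done by reloading from that copy instead of a per-layer VRAM->RAM round trip. The sketch below is illustrative only, not InvokeAI's actual ModelPatcher code; the `apply_weight_patch` helper and its arguments are hypothetical.

```python
from contextlib import contextmanager
from typing import Dict, Iterator, Optional

import torch


@contextmanager
def apply_weight_patch(
    model: torch.nn.Module,
    patched_weights: Dict[str, torch.Tensor],
    cpu_state_dict: Optional[Dict[str, torch.Tensor]] = None,
) -> Iterator[torch.nn.Module]:
    """Temporarily apply patched weights to `model` (illustrative sketch).

    If a CPU copy of the unpatched state dict is available, skip saving the
    original weights: unpatching is just a reload from the cached copy.
    """
    saved: Dict[str, torch.Tensor] = {}
    if cpu_state_dict is None:
        # No cached copy: keep CPU backups of only the keys we are about to change.
        current = model.state_dict()
        saved = {k: current[k].detach().to("cpu", copy=True) for k in patched_weights}
    try:
        model.load_state_dict(patched_weights, strict=False)
        yield model
    finally:
        if cpu_state_dict is not None:
            # Restore just the patched keys from the RAM copy of the state dict.
            model.load_state_dict({k: cpu_state_dict[k] for k in patched_weights}, strict=False)
        else:
            model.load_state_dict(saved, strict=False)
```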
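
The commit also adds non_blocking=True to several torch.nn.Module.to() calls and guards CUDA-only code so CPU-only installs do not crash. A minimal sketch of that pattern (the `move_module` helper is hypothetical, not part of InvokeAI's API):

```python
import torch


def move_module(module: torch.nn.Module, device: torch.device) -> torch.nn.Module:
    """Move a module between RAM and VRAM (illustrative sketch).

    non_blocking=True lets host<->device copies overlap with other work when the
    source tensors live in pinned memory; on CPU-only systems the flag is ignored.
    The CUDA-specific synchronize call is guarded so non-CUDA systems never hit it.
    """
    module.to(device, non_blocking=True)
    if device.type == "cuda" and torch.cuda.is_available():
        # Make sure the asynchronous copies have finished before the weights are used.
        torch.cuda.synchronize(device)
    return module


# Example (hypothetical) usage:
# device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
# model = move_module(model, device)
```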

Name | Last commit message | Last commit date
load/ | Improve RAM<->VRAM memory copy performance in LoRA patching and elsewhere (#6490) | 2024-06-13 17:10:03 +00:00
metadata/ | refactor model_install to work with refactored download queue | 2024-05-13 22:49:15 -04:00
util/ | install model if diffusers or single file, cleaned up backend logic to not mess with existing model install | 2024-03-13 21:02:29 +11:00
__init__.py | [mm] Do not write diffuser model to disk when convert_cache set to zero (#6072) | 2024-03-29 16:11:08 -04:00
config.py | bad implementation of diffusers folder download | 2024-05-08 21:21:01 -07:00
convert_ckpt_to_diffusers.py | feat(mm): use same pattern for vae converter as others | 2024-04-01 12:34:49 +11:00
libc_util.py | Tidy names and locations of modules | 2024-03-01 10:42:33 +11:00
merge.py | [util] Add generic torch device class (#6174) | 2024-04-15 13:12:49 +00:00
probe.py | feat(mm): support sdxl ckpt inpainting models | 2024-04-28 12:57:27 +10:00
search.py | docs(mm): update ModelSearch | 2024-03-28 12:35:41 +11:00
starter_models.py | fix: update SDXL IP Adpater starter model to be ViT-H | 2024-04-24 00:08:21 -04:00