InvokeAI/invokeai/backend/stable_diffusion
psychedelicious 2526ef52c5 fix(nodes): workaround seamless multi gpu error #6010
The seamless logic errors when a second GPU is selected. I don't understand why, but a workaround is to skip the model patching when there are no seamless axes specified.

This is also just good practice regardless: don't patch the model unless we need to. The perf impact is likely negligible. A sketch of the guard is below.
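
A minimal sketch of the workaround, assuming a `set_seamless`-style context manager; the per-axis padding logic is simplified and the patching details are illustrative, not the actual `seamless.py` implementation:

```python
from contextlib import contextmanager

import torch


@contextmanager
def set_seamless(model: torch.nn.Module, seamless_axes: list[str]):
    """Temporarily patch Conv2d layers for seamless (tiling) generation."""
    # Workaround for #6010: if no seamless axes were requested, skip the
    # patching entirely so the model is left untouched.
    if not seamless_axes:
        yield
        return

    # Illustrative patching: switch Conv2d padding to circular and restore the
    # original mode on exit. (The real implementation handles the x and y axes
    # individually.)
    patched: list[tuple[torch.nn.Conv2d, str]] = []
    try:
        for module in model.modules():
            if isinstance(module, torch.nn.Conv2d):
                patched.append((module, module.padding_mode))
                module.padding_mode = "circular"
        yield
    finally:
        for module, original_mode in patched:
            module.padding_mode = original_mode
```

Callers can enter the context manager unconditionally; with no axes it is a no-op, so the model is never patched unless seamless generation was actually requested.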

Closes #6010
2024-03-29 08:56:38 +11:00
diffusion               cleanup: remove unused scripts, cruft                   2024-03-20 15:05:25 +11:00
schedulers              make model manager v2 ready for PR review               2024-03-01 10:42:33 +11:00
__init__.py             Remove unused code for attention map saving.            2024-03-02 08:25:41 -05:00
diffusers_pipeline.py   cleanup: remove unused scripts, cruft                   2024-03-20 15:05:25 +11:00
seamless.py             fix(nodes): workaround seamless multi gpu error #6010   2024-03-29 08:56:38 +11:00