InvokeAI/invokeai/backend
Latest commit: d840c597b5 by Lincoln Stein, 2023-03-14 00:24:05 -04:00

fix --png_compression command line argument

- The value of png_compression was always 6, regardless of the value passed to
  the --png_compression argument. This commit fixes that bug.
- It also fixes an inconsistency between the maximum allowed value of
  png_compression and the help text.
- Closes #2945
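The symptom described above matches a value that never reaches the PNG writer: Pillow's PNG encoder accepts a compress_level of 0-9 and falls back to 6 when nothing is passed. Below is a minimal sketch, not InvokeAI's actual args.py, showing how an argparse --png_compression flag can be plumbed through to Pillow's compress_level; the save_image helper and the flag's default are illustrative assumptions.

    # Minimal sketch (not InvokeAI's actual code): forward --png_compression to
    # Pillow's PNG writer. If the parsed value is never passed along, Pillow's
    # default compress_level of 6 is always used -- the bug described above.
    import argparse
    from PIL import Image

    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--png_compression",
        type=int,
        choices=range(0, 10),  # zlib levels 0 (no compression) through 9 (maximum)
        default=6,
        help="zlib compression level for saved PNGs, 0-9 (default: 6)",
    )

    def save_image(image: Image.Image, path: str, opt: argparse.Namespace) -> None:
        # The fix amounts to actually passing the parsed value here instead of
        # letting Pillow fall back to its default of 6.
        image.save(path, format="PNG", compress_level=opt.png_compression)

    if __name__ == "__main__":
        opt = parser.parse_args()
        save_image(Image.new("RGB", (64, 64)), "out.png", opt)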
Name                 Latest commit message                                  Last commit date
config               config during migration do not overwrite symlinks     2023-03-05 08:40:12 -05:00
generator            Removed seed from get_make_image.                      2023-03-13 08:15:46 -05:00
image_util           all vestiges of ldm.invoke removed                     2023-03-03 01:02:00 -05:00
model_management     [fix] Get the model again if current model is empty   2023-03-12 22:26:11 -04:00
prompting            backend..conditioning: remove code for legacy model   2023-03-09 18:15:12 -08:00
restoration          remove legacy ldm code                                 2023-03-04 18:16:59 -08:00
stable_diffusion     Fix bug #2931                                          2023-03-13 08:11:09 -05:00
training             migrate to new HF diffusers cache location             2023-03-05 08:20:24 -05:00
util                 all vestiges of ldm.invoke removed                     2023-03-03 01:02:00 -05:00
web                  backend..conditioning: remove code for legacy model   2023-03-09 18:15:12 -08:00
__init__.py          restore NSFW checker                                   2023-03-11 16:16:44 -05:00
args.py              fix --png_compression command line argument            2023-03-14 00:24:05 -04:00
generate.py          add restoration services to nodes                      2023-03-11 17:00:00 -05:00
globals.py           Unified spelling of Hugging Face                       2023-03-05 07:30:35 -06:00
safety_checker.py    restore NSFW checker                                   2023-03-11 16:16:44 -05:00