Commit Graph

26 Commits

Author SHA1 Message Date
Lincoln Stein
a958ae5e29 Merge branch 'main' into feat/use-custom-vaes 2023-03-23 10:32:56 -04:00
Lincoln Stein
3ca654d256 speculative fix for alternative vaes 2023-03-13 23:27:29 -04:00
jeremy
e0e01f6c50 Reduced Pickle ACE attack surface
Prior to this commit, all models would be loaded with the extremely unsafe `torch.load` method, except those with the exact extension `.safetensors`. Even a change in casing (e.g. `saFetensors`, `Safetensors`, etc.) would cause the file to be loaded with `torch.load` instead of the much safer `safetensors.torch.load_file`.
If a malicious actor renamed an infected `.ckpt` to something like `.SafeTensors` or `.SAFETENSORS`, an unsuspecting user would think they were loading a safe `.safetensors` file, but would in fact be parsing an unsafe pickle file and executing the attacker's payload. This commit fixes the vulnerability by reversing the loading-method decision logic so that the unsafe `torch.load` is used only when the file extension is exactly `.ckpt`.
2023-03-13 16:16:30 -04:00
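A minimal sketch of the extension-check logic described in the commit above, assuming a hypothetical `load_model_weights()` helper (the real InvokeAI code differs):

```
from pathlib import Path

import torch
from safetensors.torch import load_file as load_safetensors

def load_model_weights(path: str) -> dict:
    # Sketch of the decision logic only -- not the actual InvokeAI implementation.
    # Pickle-based torch.load is reserved for files whose extension is exactly
    # ".ckpt"; everything else goes through the safetensors loader, which
    # parses a plain tensor format and cannot execute arbitrary code.
    if Path(path).suffix == ".ckpt":
        return torch.load(path, map_location="cpu")
    return load_safetensors(path, device="cpu")
```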
Kyle Schouviller
3ee2798ede [fix] Get the model again if current model is empty 2023-03-12 22:26:11 -04:00
Lincoln Stein
6a77634b34 remove unneeded generate initializer routines 2023-03-11 17:14:03 -05:00
Lincoln Stein
c14241436b move ModelManager initialization into its own module and restore embedding support 2023-03-11 10:56:53 -05:00
Lincoln Stein
d612f11c11 initialize InvokeAIGenerator object with model, not manager 2023-03-11 09:06:46 -05:00
Lincoln Stein
fe75b95464 Merge branch 'refactor/nodes-on-generator' of github.com:invoke-ai/InvokeAI into refactor/nodes-on-generator 2023-03-10 19:36:40 -05:00
Lincoln Stein
95954188b2 remove factory pattern
The factory pattern has been removed. Typical usage of `InvokeAIGenerator` is now:

```
from invokeai.backend.generator import (
    InvokeAIGeneratorBasicParams,
    Txt2Img,
    Img2Img,
    Inpaint,
)

params = InvokeAIGeneratorBasicParams(
    model_name='stable-diffusion-1.5',
    steps=30,
    scheduler='k_lms',
    cfg_scale=8.0,
    height=640,
    width=640,
)

print('=== TXT2IMG TEST ===')
txt2img = Txt2Img(manager, params)  # `manager` is an existing ModelManager instance
outputs = txt2img.generate(prompt='banana sushi', iterations=2)

for output in outputs:
    print(f'image={output.image}, seed={output.seed}, model={output.params.model_name}, hash={output.model_hash}, steps={output.params.steps}')
```

The `params` argument is optional, so if you wish to accept default
parameters and selectively override them, just do this:

```
outputs = Txt2Img(manager).generate(
    prompt='banana sushi',
    steps=50,
    scheduler='k_heun',
    model_name='stable-diffusion-2.1',
)
```
2023-03-10 19:33:04 -05:00
Jonathan
370e8281b3 Merge branch 'main' into refactor/nodes-on-generator 2023-03-10 12:34:00 -06:00
Lincoln Stein
14c8738a71 fix dangling reference to _model_to_cpu and missing variable model_description 2023-03-09 21:41:45 -05:00
Kevin Turner
ad7b1fa6fb model_manager: model to/from CPU methods are implemented on the Pipeline 2023-03-09 18:15:12 -08:00
Lincoln Stein
b679a6ba37 model manager defaults to consistent values of device and precision 2023-03-09 01:09:54 -05:00
Lincoln Stein
7c60068388 Merge branch 'main' into bugfix/fix-convert-sd-to-diffusers-error 2023-03-06 08:20:29 -05:00
Lincoln Stein
94daaa4abf fix call signature of import_diffuser_model() 2023-03-05 23:37:59 -05:00
Lincoln Stein
2f9dcd7906 support both epsilon and v-prediction v2 inference
There are actually two Stable Diffusion v2 legacy checkpoint
configurations:

1) "epsilon" prediction type for Stable Diffusion v2 Base
2) "v-prediction" type for Stable Diffusion v2-768

This commit adds the configuration file needed for epsilon-prediction
models, as well as the UI that prompts the user to select the
appropriate configuration file when the code cannot determine the
correct one automatically.
2023-03-05 22:51:40 -05:00
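A minimal sketch of the kind of config selection described above, assuming the standard v2 config filenames `v2-inference.yaml` (epsilon) and `v2-inference-v.yaml` (v-prediction); the helper and prompt flow are illustrative, not the actual InvokeAI code:

```
from typing import Optional

# Illustrative mapping from prediction type to legacy checkpoint config file.
V2_CONFIGS = {
    "epsilon": "v2-inference.yaml",         # Stable Diffusion v2 Base (512px)
    "v-prediction": "v2-inference-v.yaml",  # Stable Diffusion v2-768
}

def pick_v2_config(prediction_type: Optional[str]) -> str:
    """Return the config file for a v2 legacy checkpoint, asking the
    user when the prediction type could not be detected automatically."""
    if prediction_type in V2_CONFIGS:
        return V2_CONFIGS[prediction_type]
    choice = input("Is this model 'epsilon' (v2 Base) or 'v-prediction' (v2-768)? ")
    return V2_CONFIGS.get(choice.strip(), "v2-inference.yaml")
```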
blessedcoolant
e537b5d8e1 Revert "Merge branch 'main' into bugfix/reenable-ckpt-conversion-to-ram"
This reverts commit e0e70c9222, reversing
changes made to 0b184913b9.
2023-03-06 14:29:39 +13:00
blessedcoolant
e0e70c9222 Merge branch 'main' into bugfix/reenable-ckpt-conversion-to-ram 2023-03-06 14:27:30 +13:00
Lincoln Stein
fc187f263e deal with non-directories in diffusers/ 2023-03-05 12:29:52 -05:00
Lincoln Stein
4e9e1b660d respect HF_HOME setting when migrating 2023-03-05 12:08:29 -05:00
Lincoln Stein
d01adedff5 give user chance to back out before migration 2023-03-05 12:04:31 -05:00
Lincoln Stein
b33655b0d6 restore automatic conversion of legacy files to diffusers pipelines 2023-03-05 11:45:25 -05:00
Lincoln Stein
81dee04dc9 during migration do not overwrite symlinks 2023-03-05 08:40:12 -05:00
Lincoln Stein
ef8cf83b28 migrate to new HF diffusers cache location 2023-03-05 08:20:24 -05:00
Lincoln Stein
60a98cacef all vestiges of ldm.invoke removed 2023-03-03 01:02:00 -05:00
Lincoln Stein
6a990565ff all files migrated; tweaks needed 2023-03-03 00:02:15 -05:00