Lincoln Stein
c14241436b
move ModelManager initialization into its own module and restore embedding support
2023-03-11 10:56:53 -05:00
Lincoln Stein
d612f11c11
initialize InvokeAIGenerator object with model, not manager
2023-03-11 09:06:46 -05:00
Lincoln Stein
250b0ab182
add seamless tiling support
2023-03-11 08:33:23 -05:00
Lincoln Stein
675dd12b6c
add attention map images to output object
2023-03-11 08:07:01 -05:00
Lincoln Stein
7e76eea059
add embiggen, remove complicated constructor
2023-03-11 07:50:39 -05:00
Lincoln Stein
fe75b95464
Merge branch 'refactor/nodes-on-generator' of github.com:invoke-ai/InvokeAI into refactor/nodes-on-generator
2023-03-10 19:36:40 -05:00
Lincoln Stein
95954188b2
remove factory pattern
...
Factory pattern is now removed. Typical usage of the InvokeAIGenerator is now:
```
from invokeai.backend.generator import (
    InvokeAIGeneratorBasicParams,
    Txt2Img,
    Img2Img,
    Inpaint,
)
params = InvokeAIGeneratorBasicParams(
    model_name='stable-diffusion-1.5',
    steps=30,
    scheduler='k_lms',
    cfg_scale=8.0,
    height=640,
    width=640,
)
print('=== TXT2IMG TEST ===')
txt2img = Txt2Img(manager, params)
outputs = txt2img.generate(prompt='banana sushi', iterations=2)
for output in outputs:
    print(f'image={output.image}, seed={output.seed}, '
          f'model={output.params.model_name}, hash={output.model_hash}, '
          f'steps={output.params.steps}')
```
The `params` argument is optional, so if you wish to accept default
parameters and selectively override them, just do this:
```
outputs = Txt2Img(manager).generate(
    prompt='banana sushi',
    steps=50,
    scheduler='k_heun',
    model_name='stable-diffusion-2.1',
)
```
2023-03-10 19:33:04 -05:00
Jonathan
370e8281b3
Merge branch 'main' into refactor/nodes-on-generator
2023-03-10 12:34:00 -06:00
Lincoln Stein
685df33584
fix bug that caused black images when converting ckpts to diffusers in RAM ( #2914 )
...
The cause of the problem was inadvertent activation of the safety checker.
When conversion occurs on disk, the safety checker is disabled during loading.
However, when converting in RAM, the safety checker was not removed, resulting
in it activating even when the user specified --no-nsfw_checker.
This PR fixes the problem by detecting when the caller has requested the InvokeAI
StableDiffusionGeneratorPipeline class to be returned, and setting the safety
checker to None. Do not do this with diffusers models destined for disk, because
then they will be incompatible with the merge script!
Closes #2836
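The fix described above amounts to nulling the safety checker on the pipeline only when it is kept in RAM. A minimal sketch of that logic (function and attribute names are illustrative stand-ins, not the actual InvokeAI conversion code):

```python
from types import SimpleNamespace

def convert_ckpt(checkpoint, return_generator_pipeline=False):
    """Illustrative stub: build an in-memory pipeline from a checkpoint dict."""
    # Stand-in for the real diffusers conversion; the safety checker is
    # attached by default, mimicking the buggy behavior described above.
    pipeline = SimpleNamespace(weights=checkpoint, safety_checker=object())
    if return_generator_pipeline:
        # Caller wants the in-RAM StableDiffusionGeneratorPipeline:
        # drop the safety checker so --no-nsfw_checker is honored.
        pipeline.safety_checker = None
    # Pipelines destined for disk keep their safety checker so the
    # merge script remains compatible.
    return pipeline
```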
2023-03-10 18:11:32 +00:00
blessedcoolant
7ff77504cb
Make sure command also works with Oh-my-zsh ( #2905 )
...
Many people use oh-my-zsh for their command line: https://ohmyz.sh/
Adding `""` should work on both oh-my-zsh and native bash
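The quoting issue is that zsh expands an unquoted `[]` as a glob pattern and aborts with "no matches found" when nothing matches, while double quotes make the argument literal in both shells. A small demonstration (the extras spec below is hypothetical):

```shell
# zsh would glob an unquoted pkg[extra]; quoting keeps it literal in
# both zsh and bash.
pkg='InvokeAI[xformers]'
echo "pip install \"$pkg\""
```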
2023-03-10 19:05:22 +13:00
blessedcoolant
0d1854e44a
Merge branch 'main' into patch-1
2023-03-10 19:04:42 +13:00
Lincoln Stein
12c7db3a16
backend: more post-ldm-removal cleanup ( #2911 )
2023-03-09 23:11:10 -05:00
Lincoln Stein
3ecdec02bf
Merge branch 'main' into cleanup/more_ldm_cleanup
2023-03-09 22:49:09 -05:00
Lincoln Stein
d6c24d59b0
Revert "Remove label from stale issues on comment event" ( #2912 )
...
Reverts invoke-ai/InvokeAI#2903
@mauwii has a point here. It looks like triggering on a comment results
in an action for each of the stale issues, even ones that have been
previously dealt with. I'd like to revert this back to the original
behavior of running once every time the cron job executes.
What's the original motivation for having more frequent labeling of the
issues?
2023-03-09 22:28:49 -05:00
Lincoln Stein
bb3d1bb6cb
Revert "Remove label from stale issues on comment event"
2023-03-09 22:24:43 -05:00
Lincoln Stein
14c8738a71
fix dangling reference to _model_to_cpu and missing variable model_description
2023-03-09 21:41:45 -05:00
Kevin Turner
1a829bb998
pipeline: remove code for legacy model
2023-03-09 18:15:12 -08:00
Kevin Turner
9d339e94f2
backend..conditioning: remove code for legacy model
2023-03-09 18:15:12 -08:00
Kevin Turner
ad7b1fa6fb
model_manager: model to/from CPU methods are implemented on the Pipeline
2023-03-09 18:15:12 -08:00
Kevin Turner
42355b70c2
fix(Pipeline.debug_latents): fix import for moved utility function
2023-03-09 18:15:12 -08:00
Kevin Turner
faa2558e2f
chore: add new argument to overridden method to match new signature upstream
2023-03-09 18:15:12 -08:00
Kevin Turner
081397737b
typo: docstring spelling fixes
...
looks like they've already been corrected in the upstream copy
2023-03-09 18:15:12 -08:00
Kevin Turner
55d36eaf4f
fix: image_resized_to_grid_as_tensor: reconnect dropped multiple_of argument
2023-03-09 18:15:12 -08:00
Lincoln Stein
a0065da4a4
Remove label from stale issues on comment event ( #2903 )
...
I found it to be a chore to remove labels manually in order to
"un-stale" issues. This is contrary to the bot message which says
commenting should remove "stale" status. On the current `cron` schedule,
there may be a delay of up to 24 hours before the label is removed. This
PR will trigger the workflow on issue comments in addition to the
schedule.
Also adds a condition to not run this job on PRs (GitHub treats issues
and PRs equivalently in this respect), and rewords the messages for
clarity.
2023-03-09 20:17:54 -05:00
Lincoln Stein
c11e823ff3
remove unused _wrap_results
2023-03-09 16:30:06 -05:00
Patrick von Platen
507e12520e
Make sure command also works with Oh-my-zsh
...
Many people use oh-my-zsh for their command line: https://ohmyz.sh/
Adding `""` should work on both oh-my-zsh and native bash
2023-03-09 19:21:57 +01:00
Eugene Brodsky
5418bd3b24
(ci) unlabel stale issues when commented
2023-03-09 09:22:29 -05:00
blessedcoolant
76d5fa4694
Bypass the 77 token limit ( #2896 )
...
This ought to be working, but I don't know how it's supposed to behave, so
I haven't been able to verify. At least I know the numbers are getting
pushed all the way to the SD unet; I just have been unable to verify
whether what's coming out is what is expected. Please test.
You'll need to `pip install -e .` after switching to the branch, because
it's currently pulling from a non-main `compel` branch. Once it's
verified as working as intended I'll promote the compel branch to PyPI.
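A common way to work past a fixed 77-token encoder window (the source does not detail what the compel branch does internally, so this is only an illustration of the general technique) is to encode the prompt in 77-token chunks and concatenate the resulting embeddings. A toy sketch:

```python
def chunk_tokens(tokens, max_len=77):
    """Split a token sequence into max_len-sized windows; each window
    would be encoded separately and the embeddings concatenated before
    being handed to the SD unet."""
    return [tokens[i:i + max_len] for i in range(0, len(tokens), max_len)]

tokens = list(range(200))  # stand-in for a tokenized long prompt
print([len(c) for c in chunk_tokens(tokens)])  # → [77, 77, 46]
```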
2023-03-09 23:52:28 +13:00
blessedcoolant
386dda8233
Merge branch 'main' into feat_longer_prompts
2023-03-09 22:37:10 +13:00
Damian Stewart
8076c1697c
Merge branch 'feat_longer_prompts' of github.com:damian0815/InvokeAI into feat_longer_prompts
2023-03-09 10:28:13 +01:00
Damian Stewart
65fc9a6e0e
bump compel version to address issues
2023-03-09 10:28:07 +01:00
Lincoln Stein
cde0b6ae8d
Merge branch 'main' into refactor/nodes-on-generator
2023-03-09 01:52:45 -05:00
blessedcoolant
b12760b976
[ui] chore(Accessibility): various additions ( #2888 )
...
# Overview
Adding a few accessibility items (I think 9 total items). Mostly
`aria-label`, but also a `<VisuallyHidden>` to the left-side nav tab
icons. Tried to match existing copy that was being used. Feedback
welcome
2023-03-09 19:14:42 +13:00
Lincoln Stein
b679a6ba37
model manager defaults to consistent values of device and precision
2023-03-09 01:09:54 -05:00
ElrikUnderlake
2f5f08c35d
yarn build
2023-03-08 23:51:46 -06:00
Elrik
8f48c14ed4
Merge branch 'main' into chore/accessability_various-additions
2023-03-08 23:49:08 -06:00
Lincoln Stein
5d37fa6e36
node-based txt2img working without generate
2023-03-09 00:18:29 -05:00
Jonathan
f51581bd1b
Merge branch 'main' into feat_longer_prompts
2023-03-08 23:08:49 -06:00
blessedcoolant
50ca6b6ffc
add back pytorch-lightning dependency ( #2899 )
...
- Closes #2893
2023-03-09 17:22:17 +13:00
blessedcoolant
63b9ec4c5e
Merge branch 'main' into bugfix/restore-pytorch-lightning
2023-03-09 16:57:14 +13:00
blessedcoolant
b115bc4247
[cli] Execute commands in-order with nodes ( #2901 )
...
Executes piped commands in the order they were provided (instead of
executing CLI commands immediately).
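The change described above, queueing piped commands and only then executing them in submission order rather than running each one immediately, can be sketched generically (class and method names are illustrative, not the actual InvokeAI CLI code):

```python
from collections import deque

class CommandQueue:
    """Collect commands first, then execute them strictly in FIFO order."""
    def __init__(self):
        self._pending = deque()

    def submit(self, command):
        # Previously each piped command ran immediately; now it is
        # only enqueued.
        self._pending.append(command)

    def run_all(self):
        # Drain the queue in the order commands were provided.
        results = []
        while self._pending:
            results.append(self._pending.popleft()())
        return results

q = CommandQueue()
for name in ('load model', 'txt2img', 'save image'):
    q.submit(lambda name=name: name)
print(q.run_all())  # → ['load model', 'txt2img', 'save image']
```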
2023-03-09 16:55:23 +13:00
blessedcoolant
dadc30f795
Merge branch 'main' into bugfix/restore-pytorch-lightning
2023-03-09 16:46:08 +13:00
blessedcoolant
111d8391e2
Merge branch 'main' into kyle0654/cli_execution_order
2023-03-09 16:37:15 +13:00
blessedcoolant
1157b454b2
decouple default component from react root ( #2897 )
...
Decouple default component from react root
2023-03-09 16:34:47 +13:00
Kyle Schouviller
8a6473610b
[cli] Execute commands in-order with nodes
2023-03-08 19:25:03 -08:00
Elrik
ea7911be89
Merge branch 'main' into chore/accessability_various-additions
2023-03-08 17:15:28 -06:00
Damian Stewart
9ee648e0c3
Merge branch 'main' into feat_longer_prompts
2023-03-09 00:13:01 +01:00
Damian Stewart
543682fd3b
Merge branch 'feat_longer_prompts' of github.com:damian0815/InvokeAI into feat_longer_prompts
2023-03-08 23:24:41 +01:00
Damian Stewart
88cb63e4a1
pin new compel version
2023-03-08 23:24:30 +01:00
Lincoln Stein
76212d1cca
Merge branch 'main' into bugfix/restore-pytorch-lightning
2023-03-08 17:05:11 -05:00