Commit Graph

115 Commits

Author SHA1 Message Date
damian0815
b17ca0a5e7 don't suppress exceptions when doing cross-attention control 2022-11-09 10:16:30 -05:00
damian0815
71bbfe4a1a Fix #1362 by improving VRAM usage patterns when doing .swap()
commit ef3f7a26e242b73c2beb0195c7fd8f654ef47f55
Author: damian0815 <null@damianstewart.com>
Date:   Tue Nov 8 12:18:37 2022 +0100

    remove log spam

commit 7189d649622d4668b120b0dd278388ad672142c4
Author: damian0815 <null@damianstewart.com>
Date:   Tue Nov 8 12:10:28 2022 +0100

    change the way saved slicing strategy is applied

commit 01c40f751ab72955140165c16f95ae411732265b
Author: damian0815 <null@damianstewart.com>
Date:   Tue Nov 8 12:04:43 2022 +0100

    fix slicing_strategy_getter callsite

commit f8cfe25150a346958903316bc710737d99839923
Author: damian0815 <null@damianstewart.com>
Date:   Tue Nov 8 11:56:22 2022 +0100

    cleanup, consistent dim=0 also tested

commit 5bf9b1e890d48e962afd4a668a219b68271e5dc1
Author: damian0815 <null@damianstewart.com>
Date:   Tue Nov 8 11:34:09 2022 +0100

    refactored context, tested with non-sliced cross attention control

commit d58a46e39bf562e7459290d2444256e8c08ad0b6
Author: damian0815 <null@damianstewart.com>
Date:   Sun Nov 6 00:41:52 2022 +0100

    cleanup

commit 7e2c658b4c06fe239311b65b9bb16fa3adec7fd7
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:57:31 2022 +0100

    disable logs

commit 20ee89d93841b070738b3d8a4385c93b097d92eb
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:36:58 2022 +0100

    slice saved attention if necessary

commit 0a7684a22c880ec0f48cc22bfed4526358f71546
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:32:38 2022 +0100

    raise instead of asserting

commit 7083104c7f3a0d8fd96e94a2f391de50a3c942e4
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:31:00 2022 +0100

    store dim when saving slices

commit f7c0808ed383ec1dc70645288a798ed2aa4fa85c
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:27:16 2022 +0100

    don't retry on exception

commit 749a721e939b3fe7c1741e7998dab6bd2c85a0cb
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:24:50 2022 +0100

    stuff

commit 032ab90e9533be8726301ec91b97137e2aadef9a
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:20:17 2022 +0100

    more logging

commit 3dc34b387f033482305360e605809d95a40bf6f8
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:16:47 2022 +0100

    logs

commit 901c4c1aa4b9bcef695a6551867ec8149e6e6a93
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:12:39 2022 +0100

    actually set save_slicing_strategy to True

commit f780e0a0a7c6b6a3db320891064da82589358c8a
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 22:10:35 2022 +0100

    store slicing strategy

commit 93bb6d566fd18c5c69ef7dacc8f74ba2cf671cb7
Author: damian <git@damianstewart.com>
Date:   Sat Nov 5 20:43:48 2022 +0100

    still not it

commit 5e3a9541f8ae00bde524046963910323e20c40b7
Author: damian <git@damianstewart.com>
Date:   Sat Nov 5 17:20:02 2022 +0100

    wip offloading attention slices on-demand

commit 4c2966aa856b6f3b446216da3619ae931552ef08
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 15:47:40 2022 +0100

    pre-emptive offloading, idk if it works

commit 572576755e9f0a878d38e8173e485126c0efbefb
Author: root <you@example.com>
Date:   Sat Nov 5 11:25:32 2022 +0000

    push attention slices to cpu. slow but saves memory.

commit b57c83a68f2ac03976ebc89ce2ff03812d6d185f
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 12:04:22 2022 +0100

    verbose logging

commit 3a5dae116f110a96585d9eb71d713b5ed2bc3d2b
Author: damian0815 <null@damianstewart.com>
Date:   Sat Nov 5 11:50:48 2022 +0100

    wip fixing mem strategy crash (4 test on runpod)

commit 3cf237db5fae0c7b0b4cc3c47c81830bdb2ae7de
Author: damian0815 <null@damianstewart.com>
Date:   Fri Nov 4 09:02:40 2022 +0100

    wip, only works on cuda
2022-11-09 10:16:21 -05:00
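The squashed log above traces the memory-saving approach to its working form: attention slices that cross-attention control needs to reuse are sliced if necessary, tagged with the dim they were sliced along, and pushed to CPU RAM instead of being held in VRAM between .swap() passes. A minimal sketch of that pattern (the class and method names are illustrative, not the project's actual API):

    import torch

    class CpuAttentionSliceCache:
        """Illustrative only: park saved attention slices in CPU RAM so they
        do not occupy VRAM between .swap() passes (slower, but saves memory)."""

        def __init__(self):
            self._slices = {}  # key -> (cpu tensor, dim the slice was taken along)

        def save(self, key, attn_slice, dim=None):
            # detach and copy off the GPU; the GPU copy can then be freed
            self._slices[key] = (attn_slice.detach().to("cpu"), dim)

        def load(self, key, device):
            tensor, dim = self._slices[key]
            return tensor.to(device), dim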
damian0815
688d7258f1 fix a bug that broke cross attention control index mapping 2022-11-02 13:54:54 +13:00
damian0815
4513320bf1 save VRAM by not recombining tensors that have been sliced to save VRAM 2022-11-02 13:54:54 +13:00
damian0815
349cc25433 fix crash (be a little less aggressive clearing out the attention slice) 2022-11-01 17:34:28 -04:00
damian0815
214d276379 be more aggressive at clearing out saved_attn_slice 2022-11-01 17:34:28 -04:00
Damian at mba
ced9c83e96 various prompting fixes 2022-10-31 09:34:56 -04:00
Lincoln Stein
231dfe01f4 fix incorrect thresholding reporting for karras noise; close #1300 2022-10-30 10:35:55 -04:00
Lincoln Stein
19a6e904ec resolved whitespace difference 2022-10-27 17:12:22 -04:00
Lincoln Stein
1200fbd3bd add threshold for switchover from Karras to LDM noise schedule 2022-10-27 17:07:50 -04:00
Lincoln Stein
d25bf7a55a cut over from karras to model noise schedule for higher steps
The k_samplers come with a "karras" noise schedule which performs
very well at low step counts but becomes noisy at higher ones.

This commit introduces a threshold (currently 30 steps) at which the
k samplers will switch over from using karras to the older model
noise schedule.
2022-10-27 17:06:49 -04:00
Lincoln Stein
943616044a Merge branch 'switch-ksampler-noise-scheduler-adaptively' into development
- This sets a step switchover point at which the k-samplers stop using the
  Karras noise schedule and start using the LatentDiffusion noise schedule.
  The motivation is that the Karras schedule produces excellent results
  at low step counts but starts to become unstable at higher step counts.

- A new command argument, --karras_max, lets the user set where the
  switchover occurs. The default is 29 (steps 1-29 use Karras;
  30 or greater use LDM).

- Tildebyte, sorry to do a fast forward three-way merge for this
  but rebasing was just too painful due to extensive recent
  changes to the diffuser code.
2022-10-27 16:11:26 -04:00
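The mechanism described in the two commits above reduces to one decision per generation: if the requested step count is at or below --karras_max, build the sigmas from the Karras schedule, otherwise fall back to the model's own (LDM) schedule. A sketch, assuming the sampler is wrapped with k-diffusion's CompVisDenoiser (the function name here is illustrative):

    import k_diffusion as K

    def choose_sigmas(model_wrap, steps, karras_max=29, device="cuda"):
        # model_wrap is assumed to be a k_diffusion CompVisDenoiser (a
        # DiscreteSchedule subclass exposing .sigmas and .get_sigmas())
        if steps <= karras_max:
            # Karras schedule: excellent quality at low step counts
            return K.sampling.get_sigmas_karras(
                n=steps,
                sigma_min=model_wrap.sigmas[0].item(),
                sigma_max=model_wrap.sigmas[-1].item(),
                device=device,
            )
        # at higher step counts the Karras schedule becomes unstable,
        # so use the model's own (LDM) noise schedule instead
        return model_wrap.get_sigmas(steps)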
Lincoln Stein
943808b925 add threshold for switchover from Karras to LDM noise schedule 2022-10-27 15:50:32 -04:00
Damian at mba
f73d349dfe refactor hybrid and cross attention control codepaths for readability 2022-10-27 19:40:37 +02:00
Lincoln Stein
16e7cbdb38 tweaks to documentation and call signature for advanced prompting 2022-10-27 08:30:09 -04:00
Lincoln Stein
799dc6d0df acceptable integration of new prompting system and inpainting
This was a difficult merge because both PR #1108 and #1243 made
changes to obscure parts of the diffusion code.

- prompt weighting, merging and cross-attention working
  - cross-attention does not work with runwayML inpainting
    model, but weighting and merging are tested and working
- CLI command parsing code rewritten in order to get embedded
  quotes right
- --hires now works with runwayML inpainting
- --embiggen does not work with runwayML and will give an error
- Added an --invert option to invert masks applied to inpainting
- Updated documentation
2022-10-27 01:51:35 -04:00
Lincoln Stein
0d0481ce75 inpaint model progress
- working with plain prompts, weighted prompts and merge prompts
- not tested with prompt2prompt
2022-10-26 22:40:01 -04:00
Lincoln Stein
2daf187bdb working with 1.4, 1.5, not with inpainting 1.5 2022-10-26 18:25:48 -04:00
Lincoln Stein
9b7159720f resolve conflicts between PR #1108 and #1243 2022-10-26 15:37:24 -04:00
Lincoln Stein
3c1ef48fe2 fix crash when doing img2img with ddim sampler and SD 1.5 2022-10-25 13:57:42 -04:00
Lincoln Stein
4352eb6628 stop crashes on non-square images 2022-10-25 13:17:06 -04:00
Lincoln Stein
e33971fe2c plms works, bugs quashed
- The plms sampler now works with custom inpainting model
- Quashed bug that was causing generation on normal models to fail (oops!)
- Can now generate non-square images with custom inpainting model

Credits for advice and assistance during porting:

@any-winter-4079 (http://github.com/any-winter-4079)
@db3000 (Danny Beer http://github.com/db3000)
2022-10-25 11:44:01 -04:00
Lincoln Stein
b101be041b add support for runwayML custom inpainting model
This is still a work in progress but seems functional. It supports
inpainting, txt2img and img2img on the ddim and k* samplers (plms
still needs work, but I know what to do).

To test this, get the file `sd-v1-5-inpainting.ckpt` from
https://huggingface.co/runwayml/stable-diffusion-inpainting and place it
at `models/ldm/stable-diffusion-v1/sd-v1-5-inpainting.ckpt`

Launch invoke.py with --model inpainting-1.5 and proceed as usual.

Caveats:

1. The inpainting model takes about 800 MB more memory than the standard
   1.5 model. This model will not work on 4 GB cards.

2. The inpainting model is temperamental. It wants you to describe the
   entire scene and not just the masked area to replace. So if you want
   to replace the parrot on a man's shoulder with a crow, the prompt
   "crow" may fail. Try "man with a crow on shoulder" instead. The
   symptom of a failed inpainting is that the area will be erased and
   replaced with background.

3. This has not been tested well. Please report bugs.
2022-10-25 10:45:15 -04:00
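A sketch of the setup described above, scripted with huggingface_hub (the repo, filename, destination path, and launch flag are taken from the commit message; using hf_hub_download is just one way to fetch the file):

    import shutil
    from pathlib import Path

    from huggingface_hub import hf_hub_download

    # fetch sd-v1-5-inpainting.ckpt from the runwayml repo named above
    ckpt = hf_hub_download(
        repo_id="runwayml/stable-diffusion-inpainting",
        filename="sd-v1-5-inpainting.ckpt",
    )

    # place it where the commit message says invoke.py expects it
    dest = Path("models/ldm/stable-diffusion-v1/sd-v1-5-inpainting.ckpt")
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy(ckpt, dest)

    # then launch as usual:  python invoke.py --model inpainting-1.5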
Lincoln Stein
aaf7a4f1d3 inpaint and txt2img working with ddim sampler 2022-10-25 10:00:28 -04:00
Lincoln Stein
83a3cc9eb4 start support for 1.5 inpainting model, not complete 2022-10-25 00:30:48 -04:00
Damian at mba
0564397ee6 cleanup logs 2022-10-24 11:16:43 +02:00
Damian at mba
1fb15d5c81 fix hires fix 2022-10-24 02:02:42 +02:00
Damian at mba
cc2042bd4c keep the effect of _start and _end arguments consistent across k* and other samplers 2022-10-24 01:43:35 +02:00
Damian at mba
ee4273d760 fix step count on ddim 2022-10-24 01:23:43 +02:00
Damian at mba
8e7d744c60 fix bad math 2022-10-23 19:43:35 +02:00
Damian at mba
8f35819ddf add shape_freedom arg to .swap() 2022-10-23 19:38:31 +02:00
Damian at mba
04d93f0445 for k* samplers, estimate step_index from sigma 2022-10-23 16:26:50 +02:00
Damian at mba
7d677a63b8 cross attention control options 2022-10-23 14:58:25 +02:00
Lincoln Stein
3e48b9ff85 cut over from karras to model noise schedule for higher steps
The k_samplers come with a "karras" noise schedule which performs
very well at low step counts but becomes noisy at higher ones.

This commit introduces a threshold (currently 30 steps) at which the
k samplers will switch over from using karras to the older model
noise schedule.
2022-10-22 23:02:50 -04:00
Damian at mba
64051d081c cleanup 2022-10-21 15:07:11 +02:00
Damian at mba
2bf9f1f0d8 rename StructuredConditioning to ExtraConditioningInfo 2022-10-21 12:18:40 +02:00
Damian at mba
4c1267338b bring in attention etc. 2022-10-21 03:54:13 +02:00
Damian at mba
c9d27634b4 bring in prompt parser from fix-prompts branch
attention is parsed but ignored, the old blend syntax doesn't work,
and conjunctions are parsed but ignored; the only parts used here are
the new .blend() syntax and cross-attention control using .swap()
2022-10-20 12:01:48 +02:00
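For reference, the prompt-level syntax these commits wire up looks roughly like this (the strings are illustrative; option names such as shape_freedom follow later commit messages and may differ from the shipped syntax):

    # cross-attention control: keep the composition, swap what one fragment says
    prompt = 'a ("fluffy cat").swap("smiling dog") eating a hotdog'

    # optional arguments control how much the swapped region may change
    prompt_with_options = 'a ("cat").swap("dog", shape_freedom=0.5) eating a hotdog'

    # blends: weight-combine whole prompts instead of editing one
    blended = '("blue sphere", "red cube").blend(0.25, 0.75)'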
Damian at mba
42883545f9 add prompt language support for cross-attention .swap 2022-10-20 01:42:04 +02:00
Damian at mba
c6ae9f1176 remove unnecessary assertion 2022-10-19 21:12:07 +02:00
Damian at mba
c3b992db96 Squashed commit of the following:
commit 9bb0b5d0036c4dffbb72ce11e097fae4ab63defd
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Sat Oct 15 23:43:41 2022 +0200

    undo local_files_only stuff

commit eed93f5d30c34cfccaf7497618ae9af17a5ecfbb
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Sat Oct 15 23:40:37 2022 +0200

    Revert "Merge branch 'development-invoke' into fix-prompts"

    This reverts commit 7c40892a9f184f7e216f14d14feb0411c5a90e24, reversing
    changes made to e3f2dd62b0548ca6988818ef058093a4f5b022f2.

commit f06d6024e345c69e6d5a91ab5423925a68ee95a7
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Thu Oct 13 23:30:16 2022 +0200

    more efficiently handle multiple conditioning

commit 5efdfcbcd980ce6202ab74e7f90e7415ce7260da
Merge: b9c0dc5 ac08bb6
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Thu Oct 13 14:51:01 2022 +0200

    Merge branch 'optional-disable-karras-schedule' into fix-prompts

commit ac08bb6fd25e19a9d35cf6c199e66500fb604af1
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Thu Oct 13 14:50:43 2022 +0200

    append '*use_model_sigmas*' to prompt string to use model sigmas

commit 70d8c05a3ff329409f76204f4af94e55d468ab8b
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Thu Oct 13 12:12:17 2022 +0200

    make karras scheduling switchable

    commit d60df54f69 replaced the model's
    own scheduling with karras scheduling. this has changed image generation
    (seems worse now?)

    this commit wraps the change in a bool.

commit b9c0dc5f1a658a0e6c3936000e9ae559e1c7a1db
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 20:16:00 2022 +0200

    add test of more complex conjunction

commit 9ac0c15cc0d7b5f6df3289d3ad474260972a17be
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 17:18:25 2022 +0200

    improve comments

commit ad33bce60590b87b2a93e90f16dc9d3e935d04a5
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 17:04:46 2022 +0200

    put back thresholding stuff

commit 4852c698a325049834ba0d4b358f07210bc7171a
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 14:25:02 2022 +0200

    notes on improving conjunction efficiency

commit a53bb1e5b68025d09642b935ae6a9a015cfaf2d6
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 14:14:33 2022 +0200

    optional weights support for Conjunction

commit fec79ab15e4f0c84dd61cb1b45a5e6a72ae4aaeb
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 12:07:27 2022 +0200

    fix blend error and log parsing output

commit 1f751c2a039f9c97af57b18e0f019512631d5a25
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 10:33:33 2022 +0200

    fix broken euler sampler

commit 02f8148d17efe4b6bde8d29b827092a0626363ee
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 10:24:20 2022 +0200

    cleanup prompt parser

commit 8028d49ae6c16c0d6ec9c9de9c12d56c32201421
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Wed Oct 12 10:14:18 2022 +0200

    explicit conjunction, improve flattening logic

commit 8a1710892185f07eb77483f7edae0fc4d6bbb250
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 22:59:30 2022 +0200

    adapt multi-conditioning to also work with ddim

commit 53802a839850d0d1ff017c6bafe457c4bed750b0
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 22:31:42 2022 +0200

    unconditioning is also fancy-prompt-syntaxable

commit 7c40892a9f184f7e216f14d14feb0411c5a90e24
Merge: e3f2dd6 dbe0da4
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 21:39:54 2022 +0200

    Merge branch 'development-invoke' into fix-prompts

commit e3f2dd62b0548ca6988818ef058093a4f5b022f2
Merge: eef0e48 06f542e
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 21:38:09 2022 +0200

    Merge remote-tracking branch 'upstream/development' into fix-prompts

commit eef0e484c2eaa1bd4e0e0b1d3f8d7bba38478144
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 21:26:25 2022 +0200

    fix run-on paren-less attention, add some comments

commit fd29afdf0e9f5e0cdc60239e22480c36ca0aaeca
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 21:03:02 2022 +0200

    python 3.9 compatibility

commit 26f7646eef7f39bc8f7ce805e747df0f723464da
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 20:58:42 2022 +0200

    first pass connecting PromptParser to conditioning

commit ae53dff3796d7b9a5e7ed30fa1edb0374af6cd8d
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 20:51:15 2022 +0200

    update frontend dist

commit 9be4a59a2d76f49e635474b5984bfca826a5dab4
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 19:01:39 2022 +0200

    fix issues with correctness checking FlattenedPrompt

commit 3be212323eab68e72a363a654124edd9809e4cf0
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 18:43:16 2022 +0200

    parsing nested seems to work pretty ok

commit acd73eb08cf67c27cac8a22934754321256f56a9
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 18:26:17 2022 +0200

    wip introducing FlattenedPrompt class

commit 71698d5c7c2ac855b690d8ef67e8830148c59eda
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 15:59:42 2022 +0200

    recursive attention weighting seems to actually work

commit a4e1ec6b20deb7cc0cd12737bdbd266e56144709
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 15:06:24 2022 +0200

    now apparently almost supported nested attention

commit da76fd1ddf22a3888cdc08fd4fed38d8b178e524
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 11 13:23:37 2022 +0200

    wip prompt parsing

commit dbe0da4572c2ac22f26a7afd722349a5680a9e47
Author: Kyle Schouviller <kyle0654@hotmail.com>
Date:   Mon Oct 10 22:32:35 2022 -0700

    Adding node-based invocation apps

commit 8f2a2ffc083366de74d7dae471b50b6f98a7c5f8
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Mon Oct 10 19:03:18 2022 +0200

    fix merge issues

commit 73118dee2a8f4891700756e014caf1c9ca629267
Merge: fd00844 12413b0
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Mon Oct 10 12:42:48 2022 +0200

    Merge remote-tracking branch 'upstream/development' into fix-prompts

commit fd0084413541013c2cf71e006af0392719bef53d
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Mon Oct 10 12:39:38 2022 +0200

    wip prompt parsing

commit 0be9363db9307859d2b65cffc6af01f57d7873a4
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Mon Oct 10 03:20:06 2022 +0200

    better +/- attention parsing

commit 5383f691874a58ab01cda1e4fac6cf330146526a
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Mon Oct 10 02:27:47 2022 +0200

    prompt parser seems to work

commit 591d098a33ce35462428d8c169501d8ed73615ab
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Sun Oct 9 20:25:37 2022 +0200

    supports weighting unconditioning, cross-attention with |

commit 7a7220563aa05a2980235b5b908362f66b728309
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Sun Oct 9 18:15:56 2022 +0200

    i think cross attention might be working?

commit 951ed391e7126bff228c18b2db304ad28d59644a
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Sun Oct 9 16:04:54 2022 +0200

    weighted CFG denoiser working with a single item

commit ee532a0c2827368c9e45a6a5f3975666402873da
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Sun Oct 9 06:33:40 2022 +0200

    wip probably doesn't work or compile

commit 14654bcbd207b9ca28a6cbd37dbd967d699b062d
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Fri Oct 7 18:11:48 2022 +0200

    use tan() to calculate embedding weight for <1 attentions

commit 1a8e76b31aa5abf5150419ebf3b29d4658d07f2b
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Fri Oct 7 16:14:54 2022 +0200

    fix bad math.max reference

commit f697ff896875876ccaa1e5527405bdaa7ed27cde
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Fri Oct 7 15:55:57 2022 +0200

    respect http[s]x protocol when making socket.io middleware

commit 41d3dd4eeae8d4efb05dfb44fc6d8aac5dc468ab
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Fri Oct 7 13:29:54 2022 +0200

    fractional weighting works, by blending with prompts excluding the word

commit 087fb6dfb3e8f5e84de8c911f75faa3e3fa3553c
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Fri Oct 7 10:52:03 2022 +0200

    wip doing weights <1 by averaging with conditioning absent the lower-weighted fragment

commit 3c49e3f3ec7c18dc60f3e18ed2f7f0d97aad3a47
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Fri Oct 7 10:36:15 2022 +0200

    notate CFGDenoiser, perhaps

commit d2bcf1bb522026ebf209ad0103f6b370383e5070
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Thu Oct 6 05:04:47 2022 +0200

    hack blending syntax to test attention weighting more extensively

commit 94904ef2cf917f74ec23ef7a570e12ff8255b048
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Thu Oct 6 04:56:37 2022 +0200

    conditioning works, apparently

commit 7c6663ddd70f665fd1308b6dd74f92ca393a8df5
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Thu Oct 6 02:20:24 2022 +0200

    attention weighting, definitely works in positive direction

commit 5856d453a9b020bc1a28ff643ae1f58c12c9be73
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 4 19:02:14 2022 +0200

    wip bubbling weights down

commit a2ed14fd9b7d3cb36b6c5348018b364c76d1e892
Author: Damian at mba <damian@frey.NOSPAMco.nz>
Date:   Tue Oct 4 17:35:39 2022 +0200

    bring in changes from PC
2022-10-19 21:12:07 +02:00
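One detail from the squashed log above is worth spelling out: fragment weights below 1 are applied by blending conditioning for the full prompt with conditioning built from the prompt minus the down-weighted fragment, with tan() shaping the blend factor. The log does not give the exact mapping, so the one below is an assumption:

    import math

    import torch

    def apply_fractional_weight(cond_with_fragment, cond_without_fragment, weight):
        # weight is the fragment's weight in (0, 1]; the tan()-based mapping is
        # an assumed placeholder, not the project's actual formula
        blend = min(1.0, max(0.0, math.tan(weight * math.pi / 4.0)))
        # torch.lerp(a, b, t): t=1 keeps the fragment fully present
        return torch.lerp(cond_without_fragment, cond_with_fragment, blend)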
Damian at mba
1ffd4a9e06 refactored single diffusion path seems to be working for all samplers 2022-10-19 21:08:03 +02:00
Damian at mba
147d39cb7c wip refactoring shared InvokeAI diffuser mixin to component 2022-10-19 21:08:03 +02:00
Damian at mba
824cb201b1 pass img2img ddim/plms edited conditioning through kwargs 2022-10-19 21:08:03 +02:00
Damian at mba
582880b314 add cross-attention support to img2img; prevent inpainting from crashing 2022-10-19 21:08:03 +02:00
Damian at mba
2b79a716aa wip hi-res fix 2022-10-19 21:08:03 +02:00
Damian at mba
d572af2acf fix cross-attention on k* samplers 2022-10-19 21:08:03 +02:00
Damian at mba
54e6a68acb wip bringing cross-attention to PLMS and DDIM 2022-10-19 21:08:03 +02:00
Damian at mba
09f62032ec cleanup and clarify comments 2022-10-19 21:08:03 +02:00
Damian at mba
711ffd238f cleanup 2022-10-19 21:08:03 +02:00