dunkeroni
b2b7aed030
feat(nodes): added gradient mask node
2024-02-22 10:04:33 -05:00
Jonathan
83a9e26cd8
Respect torch-sdp in config.yaml (#5353)
...
If the user specifies `torch-sdp` as the attention type in `config.yaml`, we can go ahead and use it (if available) rather than always throwing an exception.
2023-12-28 05:46:28 +00:00
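The behavior described above can be sketched as follows; the helper name and config values are illustrative, not InvokeAI's actual API:

```python
def resolve_attention_type(configured: str, torch_sdp_available: bool) -> str:
    # Sketch of the described behavior: honor a user-configured
    # `torch-sdp` when the installed torch supports it, instead of
    # unconditionally raising for that config value.
    if configured == "torch-sdp":
        if torch_sdp_available:
            return "torch-sdp"
        raise ValueError("torch-sdp requested but not available in this torch build")
    return configured
```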
Ryan Dick
5127e9df2d
Fix error caused by bump to diffusers 0.24. We pass kwargs now instead of positional args so that we are more robust to future changes.
2023-12-13 09:17:30 -05:00
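The rationale for keyword arguments can be illustrated with a toy scheduler API (hypothetical signatures, not diffusers' real ones): when a library inserts a new parameter, keyword calls keep binding correctly while positional calls silently shift.

```python
def step_v023(model_output, timestep, sample):
    # old signature
    return {"sample": sample}

def step_v024(model_output, timestep, generator=None, sample=None):
    # a hypothetical later release inserts `generator` before `sample`
    return {"sample": sample}

call = dict(model_output="eps", timestep=10, sample="x_t")
# The keyword call survives the signature change:
assert step_v023(**call)["sample"] == step_v024(**call)["sample"] == "x_t"
# A positional call step_v024("eps", 10, "x_t") would silently bind
# "x_t" to `generator` and leave `sample` as None.
```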
Damian Stewart
0beb08686c
Add CFG Rescale option for supporting zero-terminal SNR models (#4335)
...
* add support for CFG rescale
* fix typo
* move rescale position and tweak docs
* move input position
* implement suggestions from github and discord
* cleanup unused code
* add back dropped FieldDescription
* fix(ui): revert unrelated UI changes
* chore(nodes): bump denoise_latents version 1.4.0 -> 1.5.0
* feat(nodes): add cfg_rescale_multiplier to metadata node
* feat(ui): add cfg rescale multiplier to linear UI
- add param to state
- update graph builders
- add UI under advanced
- add metadata handling & recall
- regen types
* chore: black
* fix(backend): make `StableDiffusionGeneratorPipeline._rescale_cfg()` staticmethod
This doesn't need access to class.
* feat(backend): add docstring for `_rescale_cfg()` method
* feat(ui): update cfg rescale mult translation string
---------
Co-authored-by: psychedelicious <4822129+psychedelicious@users.noreply.github.com>
2023-11-30 20:55:20 +11:00
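The CFG-rescale technique referenced above (from "Common Diffusion Noise Schedules and Sample Steps Are Flawed", Lin et al.) can be sketched with plain lists; the function and argument names are illustrative, not InvokeAI's actual `_rescale_cfg()`:

```python
from statistics import pstdev

def rescale_cfg(cfg_pred, cond_pred, multiplier=0.7):
    # Rescale the guided prediction so its standard deviation matches
    # the conditional prediction's, then blend back toward the
    # unrescaled guidance (multiplier 0 = no rescale, 1 = full rescale).
    # Real implementations take the std per sample over channel/spatial
    # dims of the noise prediction; here both are flat lists.
    ratio = pstdev(cond_pred) / pstdev(cfg_pred)
    return [multiplier * (x * ratio) + (1.0 - multiplier) * x for x in cfg_pred]
```

With `multiplier=1.0` the output's std exactly matches the conditional prediction's, which is what tames the over-saturation seen with zero-terminal-SNR models at high guidance scales.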
psychedelicious
6494e8e551
chore: ruff format
2023-11-11 10:55:40 +11:00
Kent Keirsey
b7555ddae8
Revert "Revert "chore: lint""
...
This reverts commit 38e7eb8878.
2023-10-17 11:59:19 -04:00
Kent Keirsey
282d36b640
Revert "Revert "Fixing some var and arg names.""
...
This reverts commit 58a0709c1e.
2023-10-17 11:59:19 -04:00
psychedelicious
58a0709c1e
Revert "Fixing some var and arg names."
...
This reverts commit f11ba81a8d.
2023-10-17 11:59:11 -04:00
psychedelicious
38e7eb8878
Revert "chore: lint"
...
This reverts commit fff29d663d.
2023-10-17 11:59:11 -04:00
psychedelicious
fff29d663d
chore: lint
2023-10-17 19:42:06 +11:00
user1
f11ba81a8d
Fixing some var and arg names.
2023-10-17 19:42:06 +11:00
Ryan Dick
971ccfb081
Refactor multi-IP-Adapter to clean up the interface around changing scales.
2023-10-06 20:43:43 -04:00
Ryan Dick
43a3c3c7ea
Fix typo in setting IP-Adapter scales.
2023-10-06 20:43:43 -04:00
Ryan Dick
d8d0c9af09
Fix handling of scales with multiple IP-Adapters.
2023-10-06 20:43:43 -04:00
Ryan Dick
7ca456d674
Update IP-Adapter model to enable running multiple IP-Adapters at once. (Not tested yet.)
2023-10-06 20:43:43 -04:00
Ryan Dick
78828b6b9c
WIP - Accept a list of IPAdapterFields in DenoiseLatents.
2023-10-06 20:43:43 -04:00
Ryan Dick
78377469db
Add support for T2I-Adapter in node workflows (#4612)
...
* Bump diffusers to 0.21.2.
* Add T2IAdapterInvocation boilerplate.
* Add T2I-Adapter model to model-management.
* (minor) Tidy prepare_control_image(...).
* Add logic to run the T2I-Adapter models at the start of the DenoiseLatentsInvocation.
* Add logic for applying T2I-Adapter weights and accumulating.
* Add T2IAdapter to MODEL_CLASSES map.
* yarn typegen
* Add model probes for T2I-Adapter models.
* Add all of the frontend boilerplate required to use T2I-Adapter in the nodes editor.
* Add T2IAdapterModel.convert_if_required(...).
* Fix errors in T2I-Adapter input image sizing logic.
* Fix bug with handling of multiple T2I-Adapters.
* black / flake8
* Fix typo
* yarn build
* Add num_channels param to prepare_control_image(...).
* Link to upstream diffusers bugfix PR that currently requires a workaround.
* feat: Add Color Map Preprocessor
Needed for the color T2I Adapter
* feat: Add Color Map Preprocessor to Linear UI
* Revert "feat: Add Color Map Preprocessor"
This reverts commit a1119a00bf.
* Revert "feat: Add Color Map Preprocessor to Linear UI"
This reverts commit bd8a9b82d8.
* Fix T2I-Adapter field rendering in workflow editor.
* yarn build, yarn typegen
---------
Co-authored-by: blessedcoolant <54517381+blessedcoolant@users.noreply.github.com>
Co-authored-by: psychedelicious <4822129+psychedelicious@users.noreply.github.com>
2023-10-05 16:29:16 +11:00
blessedcoolant
e0dddbd38e
chore: fix isort issues
2023-09-17 12:13:03 +12:00
blessedcoolant
b7773c9962
chore: black & lint fixes
2023-09-17 12:00:21 +12:00
user1
c48e648cbb
Added per-step setting of IP-Adapter weights (for param easing, etc.)
2023-09-16 12:36:16 -07:00
user1
7ee13879e3
Added a check in IP-Adapter to skip begin/end step-percent handling when IP-Adapter is already turned off, avoiding a potential clash with other cross-attention control.
2023-09-16 09:29:50 -07:00
user1
ced297ed21
Initial implementation of IP-Adapter "begin_step_percent" and "end_step_percent" for controlling on which denoising steps IP-Adapter is applied.
2023-09-16 08:24:12 -07:00
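The begin/end step-percent gating described in the commits above amounts to a fractional-progress check; names here are illustrative:

```python
def ip_adapter_active(step_index, total_steps, begin_step_percent, end_step_percent):
    # IP-Adapter conditioning is applied only while denoising progress
    # falls inside the [begin, end] window, expressed as fractions of
    # the total step count.
    progress = step_index / total_steps
    return begin_step_percent <= progress <= end_step_percent
```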
Ryan Dick
b57acb7353
Merge branch 'main' into feat/ip-adapter
2023-09-15 13:15:25 -04:00
Ryan Dick
c2f074dc2f
Fix python static checks.
2023-09-14 16:48:47 -04:00
Ryan Dick
781e8521d5
Eliminate the need for IPAdapter.initialize().
2023-09-14 15:02:59 -04:00
Ryan Dick
1c8991a3df
Use CLIPVisionModel under model management for IP-Adapter.
2023-09-13 19:10:02 -04:00
Ryan Dick
3ee9a21647
Initial (barely) working version of IP-Adapter model management.
2023-09-13 08:27:24 -04:00
Martin Kristiansen
e467ca7f1b
Apply black, isort, flake8
2023-09-12 13:01:58 -04:00
Martin Kristiansen
caea6d11c6
isort wip 2
2023-09-12 13:01:58 -04:00
Ryan Dick
50a0691514
flake8
2023-09-08 18:05:31 -04:00
Ryan Dick
3f7d5b4e0f
Remove redundant IPAdapterXL class.
2023-09-08 15:46:10 -04:00
Ryan Dick
91596d9527
Re-factor IPAdapter to patch UNet in a context manager.
2023-09-08 15:39:22 -04:00
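The context-manager pattern referenced above can be sketched generically (toy UNet stand-in, not InvokeAI's actual classes): the patch is installed on entry and the original attention processors are restored on exit, even if denoising raises.

```python
from contextlib import contextmanager

class ToyUNet:
    def __init__(self):
        self.attn_processors = {"down.0": "default", "up.0": "default"}

@contextmanager
def apply_ip_adapter(unet):
    # Swap in IP-Adapter attention processors, restoring the originals
    # in `finally` so the UNet is never left in a patched state.
    original = dict(unet.attn_processors)
    unet.attn_processors = {name: f"ip::{name}" for name in original}
    try:
        yield unet
    finally:
        unet.attn_processors = original

unet = ToyUNet()
with apply_ip_adapter(unet):
    patched = dict(unet.attn_processors)
# after the block, unet.attn_processors is back to the defaults
```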
Ryan Dick
b2d5b53b5f
Pass IP-Adapter conditioning via cross_attention_kwargs instead of concatenating to the text embedding. This avoids interference with other features that manipulate the text embedding (e.g. long prompts).
2023-09-08 11:47:36 -04:00
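The design change described above, carrying extra conditioning in a kwargs dict rather than concatenating it to the text embedding, can be sketched with a toy forward pass (hypothetical names):

```python
def toy_unet_forward(sample, encoder_hidden_states, cross_attention_kwargs=None):
    # Extra conditioning rides along in `cross_attention_kwargs`, where
    # attention processors can pick it up; the text embedding itself is
    # left untouched, so features that manipulate it (e.g. long
    # prompts) keep working.
    kwargs = cross_attention_kwargs or {}
    ip_embeds = kwargs.get("ip_adapter_embeds")
    return {
        "text_tokens": len(encoder_hidden_states),
        "ip_adapter_used": ip_embeds is not None,
    }

result = toy_unet_forward(
    "x_t",
    ["a", "photo", "of", "a", "cat"],
    cross_attention_kwargs={"ip_adapter_embeds": [0.1, 0.2]},
)
```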
Ryan Dick
ddc148b70b
Move ConditioningData and its field classes to their own file. This will allow new conditioning types to be added more cleanly without introducing circular dependencies.
2023-09-08 11:00:11 -04:00
Ryan Dick
23fdf0156f
Clean up IP-Adapter in diffusers_pipeline.py - WIP
2023-09-06 20:42:20 -04:00
Ryan Dick
d776e0a0a9
Split ControlField and IpAdapterField.
2023-09-06 17:03:37 -04:00
blessedcoolant
3ac68cde66
chore: flake8 cleanup
2023-09-05 12:07:12 +12:00
blessedcoolant
07381e5a26
cleanup: merge conflicts
2023-09-05 11:37:12 +12:00
user1
fb1b03960e
Added IP-Adapter SDXL support. Added IP-Adapter "Plus" (more detail) model support.
2023-09-01 04:40:30 -07:00
user1
74bfb5e1f9
First commit of separate node for IP-Adapter.
...
And its own dataclasses for passing info.
2023-08-31 23:07:15 -07:00
user1
5a9993772d
Added ip_adapter_strength parameter to adjust weighting of IP-Adapter's added cross-attention layers
2023-08-30 17:28:30 -07:00
user1
9f86cfa471
Working POC of IP-Adapters. Not fully nodified yet.
2023-08-30 17:28:30 -07:00
Sergey Borisov
8562dbaaa8
Hotfix to make second order schedulers work with mask
2023-08-30 02:18:08 +03:00
StAlKeR7779
3e6c49001c
Change antialias to True for the input image
...
Co-authored-by: Lincoln Stein <lincoln.stein@gmail.com>
2023-08-28 02:54:39 +03:00
Sergey Borisov
526c7e7737
Provide antialias argument, as the behaviour will change in a future release (deprecation warning)
2023-08-27 20:04:55 +03:00
blessedcoolant
e9633a3adb
Merge branch 'main' into fix/inpaint_gen
2023-08-27 02:54:19 +12:00
psychedelicious
3c43594c26
Merge branch 'main' into fix/inpaint_gen
2023-08-18 15:57:48 +10:00
Martin Kristiansen
537ae2f901
Resolving merge conflicts for flake8
2023-08-18 15:52:04 +10:00
Sergey Borisov
cfd827cfad
Added node for creating inpaint mask
2023-08-18 04:07:40 +03:00
Lincoln Stein
b69f26c85c
add support for "balanced" attention slice size
2023-08-17 16:11:09 -04:00