fix(nodes): do not set seed on output latents from denoise latents

`LatentsField` objects have an optional `seed` field. This should only be populated when the latents are noise, generated from a seed.

`DenoiseLatentsInvocation` needs a seed value for scheduler initialization. It's used in a few places, and there is some logic for determining the seed to use with a series of fallbacks:
- Use the seed from the noise (a `LatentsField` object)
- Use the seed from the latents (a `LatentsField` object - normally it won't have a seed)
- Use `0` as a final fallback
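The fallback chain above can be sketched as follows. This is an illustrative sketch, not the actual InvokeAI implementation; `resolve_seed` and the simplified `LatentsField` dataclass are hypothetical names for this example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LatentsField:
    # Simplified stand-in for the real LatentsField; seed is only
    # meaningful when the latents are noise generated from that seed.
    latents_name: str
    seed: Optional[int] = None

def resolve_seed(noise: Optional[LatentsField], latents: Optional[LatentsField]) -> int:
    """Pick the scheduler seed: noise.seed, else latents.seed, else 0."""
    if noise is not None and noise.seed is not None:
        return noise.seed
    if latents is not None and latents.seed is not None:
        return latents.seed
    return 0
```

With the fix in this commit, the `latents` branch effectively never fires for latents produced by a denoise pass, since their `seed` is now `None`.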

In `DenoiseLatentsInvocation`, we set the seed in the `LatentsOutput`, even though the output latents are not noise.

This is normally fine, but when we use the refiner, we re-use those same latents for the refiner denoise. This causes the characteristic same-seed-fried look on the refiner pass.

Simple fix - do not set the seed field in the output latents.
psychedelicious 2024-04-11 14:50:22 +10:00 committed by Kent Keirsey
parent 7e2ade50e1
commit 026d095afe


@@ -964,7 +964,7 @@ class DenoiseLatentsInvocation(BaseInvocation):
             mps.empty_cache()
         name = context.tensors.save(tensor=result_latents)
-        return LatentsOutput.build(latents_name=name, latents=result_latents, seed=seed)
+        return LatentsOutput.build(latents_name=name, latents=result_latents, seed=None)
 @invocation(