fix(backend): use random seed for SDE & Ancestral schedulers

SDE and Ancestral schedulers draw random noise at each step when applying conditioning. We were seeding that randomness with a static seed of `0`, regardless of the initial noise used, so results could look noticeably similar across generations.

Unfortunately, we do not have easy access to the seed used to create the initial noise at this time.

This change uses a random seed value instead of always `0`.
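To illustrate the difference in isolation, here is a minimal standalone sketch (not the invocation code itself; the `unet.device` wiring is replaced with a plain device string, and the final wiring into the conditioning data is shown only as a hypothetical comment):

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Before: a fixed seed of 0, so the per-step noise drawn by SDE/Ancestral
# schedulers was identical on every invocation, whatever the initial latents.
fixed_generator = torch.Generator(device=device).manual_seed(0)

# After: Generator.seed() pulls a non-deterministic seed (from the OS / clock),
# so each invocation gets different per-step noise.
random_generator = torch.Generator(device=device)
random_generator.seed()

# The chosen generator is then handed to the scheduler via the conditioning
# data, e.g. (hypothetical wiring):
#   conditioning_data.add_scheduler_args_if_applicable(scheduler, generator=random_generator)
```

Note that `.seed()` makes the per-step noise non-reproducible from run to run; once the initial-noise seed becomes accessible, it could be passed to `manual_seed()` instead to restore full determinism.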
commit 096c17465b
parent d09dfc3e9b
Author: psychedelicious
Date:   2023-08-06 18:19:54 +10:00


```diff
@@ -180,6 +180,10 @@ class TextToLatentsInvocation(BaseInvocation):
         negative_cond_data = context.services.latents.get(self.negative_conditioning.conditioning_name)
         uc = negative_cond_data.conditionings[0].embeds.to(device=unet.device, dtype=unet.dtype)
 
+        # for ancestral and sde schedulers
+        generator = torch.Generator(device=unet.device)
+        generator.seed()
+
         conditioning_data = ConditioningData(
             unconditioned_embeddings=uc,
             text_embeddings=c,
@@ -198,7 +202,7 @@ class TextToLatentsInvocation(BaseInvocation):
             # for ddim scheduler
             eta=0.0,  # ddim_eta
             # for ancestral and sde schedulers
-            generator=torch.Generator(device=unet.device).manual_seed(0),
+            generator=generator,
         )
         return conditioning_data
```