Fix wrong conditioning used (#3595)

As the comment in this branch says, we want to run the conditional pass:
```python
if cfg_injection:  # only applying ControlNet to conditional instead of in unconditioned
```
But the code actually used the unconditioned embeddings
(`conditioning_data.unconditioned_embeddings`).
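
For context, `torch.cat` over a single-element list is effectively a copy, so the old line just selected the unconditioned embeddings, the opposite of what the comment describes. A minimal sketch (the shapes below are placeholders, not the pipeline's real embedding shapes):
```python
import torch

uc = torch.zeros(1, 77, 768)  # stand-in for conditioning_data.unconditioned_embeddings
c = torch.ones(1, 77, 768)    # stand-in for conditioning_data.text_embeddings

# torch.cat of a single tensor returns an equal tensor, so the buggy line
# was equivalent to `encoder_hidden_states = uc`; tensor `c` was never used.
assert torch.equal(torch.cat([uc]), uc)
```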

Code further down confirms that the conditional pass is intended, both through
its comment and through the tensor concatenation order (all the surrounding
code expects a [uc, c] tensor):
```python
if cfg_injection:
    # Inferred ControlNet only for the conditional batch.
    # To apply the output of ControlNet to both the unconditional and conditional batches,
    #   add 0 to the unconditional batch to keep it unchanged.
    down_samples = [torch.cat([torch.zeros_like(d), d]) for d in down_samples]
    mid_sample = torch.cat([torch.zeros_like(mid_sample), mid_sample])
```
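
To see why the zero-padding preserves the unconditional half, recall that these ControlNet residuals are added to the UNet's features, and that classifier-free guidance later splits the batch in the same [uc, c] order. A minimal sketch (the shapes, `guidance_scale`, and the random stand-in tensors are illustrative assumptions, not the pipeline's actual values):
```python
import torch

guidance_scale = 7.5
d = torch.randn(1, 320, 64, 64)  # stand-in ControlNet residual for the conditional half

# Zero-padding: adding 0 to the unconditional half leaves it unchanged,
# because the residual is summed into the UNet's down-block features.
down_sample = torch.cat([torch.zeros_like(d), d])  # batch order: [uc, c]

# Downstream, the UNet output is split in the same [uc, c] order for CFG:
noise_pred = torch.randn(2, 4, 64, 64)   # stand-in UNet output
noise_uc, noise_c = noise_pred.chunk(2)  # relies on the [uc, c] ordering
guided = noise_uc + guidance_scale * (noise_c - noise_uc)
```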
Commit 8fc0ce7e38, Lincoln Stein, 2023-07-13 09:19:13 -04:00, committed by GitHub

```diff
@@ -631,7 +631,7 @@ class StableDiffusionGeneratorPipeline(StableDiffusionPipeline):
     control_latent_input = torch.cat([unet_latent_input] * 2)
     if cfg_injection:  # only applying ControlNet to conditional instead of in unconditioned
-        encoder_hidden_states = torch.cat([conditioning_data.unconditioned_embeddings])
+        encoder_hidden_states = conditioning_data.text_embeddings
     else:
         encoder_hidden_states = torch.cat([conditioning_data.unconditioned_embeddings,
                                            conditioning_data.text_embeddings])
```
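
With this fix, the `cfg_injection` path hands ControlNet only `conditioning_data.text_embeddings`, and the zero-padding shown above re-expands its residuals to the full [uc, c] batch, so the unconditional half is left untouched as the comment intended.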