b301785dc8  2025-01-16 08:33:58 +11:00
    Normalize the T5 model identifiers so that a FLUX T5 or an SD3 T5 model can be used interchangeably.

bcd29c5d74  2025-01-07 00:31:00 +00:00
    Remove all cases where we check 'model.device'. This is no longer trustworthy now that partial loading is permitted.

046d19446c  2024-12-17 07:28:45 -05:00
    Rename Structural Lora to Control Lora.

f53da60b84  2024-12-17 07:28:45 -05:00
    Lots of updates centered around using the lora patcher rather than changing the modules in the transformer model.
5a035dd19f  2024-12-17 07:28:45 -05:00
    Support bnb-quantized nf4 FLUX models, use the ControlNet VAE, and support only one structural LoRA per transformer. Various other refactors and bugfixes.

f3b253987f  2024-12-17 07:28:45 -05:00
    Initial setup for FLUX tools control LoRAs.
db9c0cad7c  2024-11-29 12:32:50 -05:00
    Replace custom RMSNorm implementation with torch.nn.functional.rms_norm(...) for improved speed.
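The RMSNorm swap above is a drop-in replacement: `torch.nn.functional.rms_norm` (added in PyTorch 2.4) computes the same normalization that a hand-rolled RMSNorm module typically implements, just with a fused kernel. A minimal pure-Python sketch of that math (illustrative only; the actual repository code operates on torch tensors):

```python
import math

def rms_norm(x, weight=None, eps=1e-6):
    # y_i = x_i / sqrt(mean(x^2) + eps) * w_i -- the normalization that
    # torch.nn.functional.rms_norm performs, and that custom RMSNorm
    # modules typically computed by hand before PyTorch 2.4.
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    y = [v / rms for v in x]
    if weight is not None:
        y = [v * w for v, w in zip(y, weight)]
    return y

# RMSNorm is scale-invariant (up to eps): scaling the input by a
# constant leaves the normalized output essentially unchanged.
a = rms_norm([1.0, 2.0, 3.0])
b = rms_norm([10.0, 20.0, 30.0])
print(max(abs(u - v) for u, v in zip(a, b)))  # tiny eps-induced difference, ~0
```

The speedup cited in the commit comes from the fused torch kernel, not from any change in the math itself.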
e85c3bc465  2024-09-02 09:38:17 -04:00
    Add FLUX VAE support to ImageToLatentsInvocation.
a808ce81fd  2024-08-26 20:17:50 -04:00
    Replace swish() with torch.nn.functional.silu(h). They are functionally equivalent, but in my tests VAE decoding was ~8% faster after the change.
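The equivalence claimed in this commit is exact: SiLU is defined as x * sigmoid(x), which is precisely what a hand-written swish() computes, so only the kernel changes. A plain-Python sketch of the two definitions (the real code operates on torch tensors, where `torch.nn.functional.silu` fuses the sigmoid and multiply):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def swish(x):
    # Hand-rolled activation of the kind the commit removes.
    return x * sigmoid(x)

def silu(x):
    # SiLU is defined as x * sigmoid(x), so the two are pointwise
    # identical; the torch implementation is simply a fused kernel.
    return x * sigmoid(x)

print(all(abs(swish(x) - silu(x)) < 1e-12
          for x in [-3.0, -0.5, 0.0, 0.5, 3.0]))  # True
```

Because the functions agree pointwise, swapping them changes performance (the reported ~8% in VAE decoding) without changing model outputs.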
94aba5892a  2024-08-26 20:17:50 -04:00
    Attribute black-forest-labs/flux for much of the FLUX code.

57168d719b  2024-08-26 20:17:50 -04:00
    Fix styling/lint.

1bd90e0fd4  2024-08-26 20:17:50 -04:00
    Run ruff; set up initial text-to-image node.

436f18ff55  2024-08-26 20:17:50 -04:00
    Add backend functions and classes for the FLUX implementation; update how the FLUX encoders/tokenizers are loaded for prompt encoding; update how the FLUX VAE is loaded.