e98f7eda2e | Fix total_steps in generation event, order field added | 2023-08-09 03:34:25 +03:00
f7aec3b934 | Move conditioning class to backend | 2023-08-08 23:33:52 +03:00
a7e44678fb | Remove legacy/unused code | 2023-08-08 20:49:01 +03:00
96b7248051 | Add mask to l2l | 2023-08-08 18:50:36 +03:00
1db2c93f75 | Fix preview, inpaint | 2023-08-07 21:27:32 +03:00
2539e26c18 | Apply denoising_start/end, add torch-sdp to memory-efficient attention func | 2023-08-07 19:57:11 +03:00
b0738b7f70 | Fixes, zero tensor for empty negative prompt, remove raw prompt node | 2023-08-07 18:37:06 +03:00
9aaf67c5b4 | wip | 2023-08-06 05:05:25 +03:00
b6e369c745 | chore: black | 2023-08-05 12:28:35 +10:00
ecabfc252b | devices.py - Update MPS FP16 check to account for upcoming MacOS Sonoma (float16 doesn't seem to work on MacOS Sonoma due to further changes with Metal; this will default back to float32 for Sonoma users) | 2023-08-05 12:28:35 +10:00
eb6c317f04 | chore: black | 2023-08-05 12:05:24 +10:00
6d7223238f | fix: fix typo in message | 2023-08-05 12:05:24 +10:00
8607d124c5 | improve message about the consequences of the --ignore_missing_core_models flag | 2023-08-05 12:05:24 +10:00
23497bf759 | add --ignore_missing_core_models CLI flag to bypass checking for missing core models | 2023-08-05 12:05:24 +10:00
9bacd77a79 | Merge branch 'main' into bugfix/fp16-models | 2023-08-05 01:42:43 +03:00
1b158f62c4 | resolve vae overrides correctly | 2023-08-04 18:24:47 -04:00
6ad565d84c | folded in changes from 4099 | 2023-08-04 18:24:47 -04:00
04229082d6 | Provide ti name from model manager, not from ti itself | 2023-08-04 18:24:47 -04:00
f0613bb0ef | Fix merge conflict resolve - restore full/diff layer support | 2023-08-04 19:53:27 +03:00
0e9f92b868 | Merge branch 'main' into feat/sdxl_lora | 2023-08-04 19:22:13 +03:00
7d0cc6ec3f | chore: black | 2023-08-05 02:04:22 +10:00
2f8b928486 | Add support for diff/full lora layers | 2023-08-05 02:04:22 +10:00
1d5d187ba1 | model probe detects sdxl lora models | 2023-08-04 11:44:56 -04:00
1ac14a1e43 | add sdxl lora support | 2023-08-04 11:44:56 -04:00
446fb4a438 | blackify | 2023-08-03 19:24:23 -04:00
ab5d938a1d | use variant instead of revision | 2023-08-03 19:23:52 -04:00
9942af756a | Merge branch 'main' into remove-onnx-model-check-from-pipeline-download | 2023-08-03 10:10:51 -04:00
e080fd1e08 | blackify | 2023-08-03 11:25:20 +10:00
eeef1e08f8 | restore ability to convert merged inpaint .safetensors files | 2023-08-03 11:25:20 +10:00
4e0949fa55 | fix .swap() by reverting improperly merged @classmethod change | 2023-08-03 10:00:43 +10:00
ed76250dba | Stop checking for unet/model.onnx when a model_index.json is detected | 2023-08-02 07:21:21 -04:00
4d22cafdad | Installer should download fp16 models if user has specified 'auto' in config (Closes #4127) | 2023-08-01 22:06:27 -04:00
746afcd235 | Merge branch 'main' into feat/onnx | 2023-07-31 16:56:34 -04:00
aeac557c41 | Run python black, point out that onnx is an alpha feature in the installer | 2023-07-31 16:47:48 -04:00
f5ac73b091 | Merge branch 'main' into feat/onnx | 2023-07-31 10:58:40 -04:00
2c07f54b6e | Merge branch 'main' into fix-optional | 2023-07-31 16:31:01 +10:00
83d3f2347e | fix "unrecognized arguments: --yes" bug on unattended upgrade | 2023-07-30 11:07:06 -04:00
50e00feceb | Add missing Optional on a few nullable fields | 2023-07-30 16:25:12 +02:00
2537ff0280 | Merge branch 'main' into bugfix/model-manager-rel-paths | 2023-07-30 08:17:36 -04:00
e20c4dc1e8 | blackified | 2023-07-30 08:17:10 -04:00
ac84a9f915 | reenable display of autoloaded models | 2023-07-30 08:05:05 -04:00
844578ab88 | fix lora loading crash | 2023-07-30 07:57:10 -04:00
73f3b7f84b | remove dangling comment | 2023-07-29 17:32:33 -04:00
348bee8981 | blackified | 2023-07-29 17:30:54 -04:00
e82eb0b9fc | add correct optional annotation to precision arg | 2023-07-29 17:30:21 -04:00
1de783b1ce | fix mistake in indexing flat_ema_key | 2023-07-29 17:20:26 -04:00
3f9105be50 | make convert script respect setting of use_ema in config file | 2023-07-29 17:17:45 -04:00
2a2d988928 | convert script handles more ckpt variants | 2023-07-29 15:28:39 -04:00
72c519c6ad | fix incorrect key construction | 2023-07-29 13:51:47 -04:00
24b19166dd | further refactoring | 2023-07-29 13:13:22 -04:00