psychedelicious
e76cc71e81
fix(config): edge cases in models.yaml migration
...
When running the configurator, the `legacy_models_conf_path` was stripped when saving the config file. Then the migration logic didn't fire correctly, and the custom models.yaml paths weren't migrated into the db.
- Rework the logic to migrate this path by adding it to the config object as a normal field that is not excluded from serialization.
- Rearrange the models.yaml migration logic to remove the legacy path after migrating, then write the config file (see the sketch after this list). This way, the legacy path doesn't stick around.
- Move the schema version into the config object.
- Back up the config file before attempting migration.
- Add tests to cover this edge case
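A minimal sketch of the revised ordering described above. The config class, field names (`legacy_models_yaml_path`, `init_file_path`), and `write_file` body here are illustrative stand-ins, not the actual config API:

```python
# Hypothetical sketch of the migration order described above; names are illustrative.
import shutil
from dataclasses import dataclass
from pathlib import Path
from typing import Optional


@dataclass
class AppConfig:
    init_file_path: Path
    legacy_models_yaml_path: Optional[Path] = None  # normal field, not excluded from serialization

    def write_file(self, dest: Path) -> None:
        dest.write_text(f"legacy_models_yaml_path: {self.legacy_models_yaml_path}\n")


def migrate_models_yaml(config: AppConfig) -> None:
    legacy_yaml = config.legacy_models_yaml_path
    if legacy_yaml is None or not legacy_yaml.exists():
        return
    # Back up the config file before attempting migration.
    backup = config.init_file_path.with_name(config.init_file_path.name + ".bak")
    shutil.copy(config.init_file_path, backup)
    # ... migrate each models.yaml entry into the models db here ...
    # Remove the legacy path only after migrating, then write the config file,
    # so the path does not stick around in the saved config.
    config.legacy_models_yaml_path = None
    config.write_file(config.init_file_path)
```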
2024-03-19 09:24:28 +11:00
psychedelicious
5179587b5a
feat(config): restore ignore_missing_core_models arg
2024-03-19 09:24:28 +11:00
psychedelicious
cb180909f7
fix(install): resolve config-related issues with configurator
...
- Do not use the singleton app config, this causes a lot of weirdness
- Update logic to use new config object
- Remove unused code
2024-03-19 09:24:28 +11:00
psychedelicious
60492500db
chore: ruff
2024-03-19 09:24:28 +11:00
psychedelicious
f69938c6a8
fix(config): revised config methods
...
- `write_file` requires a destination file path
- `read_config` -> `merge_from_file`; if no path is provided, it reads from `self.init_file_path` (see the sketch after this list)
- update app, tests to use new methods
- fix configurator, was overwriting config file data unexpectedly
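A self-contained sketch of the revised method semantics. The class shape and YAML handling are assumptions for illustration, not the real config object:

```python
# Illustrative only: approximate semantics of merge_from_file / write_file as described above.
from pathlib import Path
from typing import Any, Dict, Optional

import yaml


class SketchAppConfig:
    def __init__(self, init_file_path: Path) -> None:
        self.init_file_path = init_file_path
        self.settings: Dict[str, Any] = {}

    def merge_from_file(self, source_path: Optional[Path] = None) -> None:
        """Merge settings from a YAML file, defaulting to self.init_file_path when no path is given."""
        path = source_path or self.init_file_path
        self.settings.update(yaml.safe_load(path.read_text()) or {})

    def write_file(self, dest_path: Path) -> None:
        """Write settings to an explicit destination path; there is no implicit default."""
        dest_path.write_text(yaml.safe_dump(self.settings))
```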
2024-03-19 09:24:28 +11:00
psychedelicious
77b86e9ad5
fix(install): remove broken v2.3 -> v3 migration logic from configurator
2024-03-19 09:24:28 +11:00
psychedelicious
a6181b5759
fix(install): update configurator to use new config system
2024-03-19 09:24:28 +11:00
psychedelicious
b4b0af7c60
fix(install): do not use deprecated pydantic methods
2024-03-19 09:24:28 +11:00
psychedelicious
b8c46fb15b
fix(config): split check_invokeai_root into separate function to validate, use this in model_install to determine if need to run configurator
2024-03-19 09:24:28 +11:00
psychedelicious
9539ecce79
fix(config): use correct config in textual_inversion_training
2024-03-19 09:24:28 +11:00
psychedelicious
7716a4a8c7
fix(config): use correct config in install_helper
2024-03-19 09:24:28 +11:00
psychedelicious
897fe497dc
fix(config): use new get_config across the app, use correct settings
2024-03-19 09:24:28 +11:00
psychedelicious
7b1f9409bc
fix(config): drop nonexistent config.use_cpu setting
2024-03-19 09:24:28 +11:00
psychedelicious
a72cea014c
fix(config): drop usage of deprecated config.xformers, just use the existing utility function
2024-03-19 09:24:28 +11:00
psychedelicious
b4182b190f
fix(config): use new config.patchmatch
2024-03-19 09:24:28 +11:00
psychedelicious
22ac204678
fix(config): fix invisible_watermark handling
...
This setting was hardcoded to True. Simplified logic around it to not have a conditional that does nothing.
2024-03-19 09:24:28 +11:00
psychedelicious
fbe3afa5e1
fix(config): fix nsfw_checker handling
...
This setting was hardcoded to True. Rework logic around it to not conditionally check the setting.
2024-03-19 09:24:28 +11:00
Lincoln Stein
71a1740740
Remove core safetensors->diffusers conversion models
...
- No longer install core conversion models. Use the HuggingFace cache to load them if and when needed.
- Call directly into the diffusers library to perform conversions, with only shallow wrappers around them to massage arguments, etc. (see the sketch after this list).
- At root configuration time, do not create all the possible model subdirectories, but let them be created and populated at model install time.
- Remove checks for missing core conversion files, since they are no longer installed.
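A hedged sketch of one way to lean on diffusers directly for a safetensors -> diffusers conversion, pulling any missing component weights from the HuggingFace cache on demand. The actual wrappers in the codebase may call different diffusers entry points; the checkpoint filename is just an example:

```python
# Sketch only: convert a single-file checkpoint to a diffusers folder via the diffusers library.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_single_file("v1-5-pruned-emaonly.safetensors")
pipe.save_pretrained("converted/v1-5-diffusers")
```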
2024-03-17 19:13:18 -04:00
psychedelicious
fed1f983db
fix(nodes): depth anything processor ( #5956 )
...
We were passing a PIL image when we needed to pass the np image.
Closes #5956
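For context, the shape of the fix (illustrative; the actual node code differs):

```python
# The processor expects a numpy array, not a PIL image.
import numpy as np
from PIL import Image

pil_image = Image.new("RGB", (64, 64))
np_image = np.array(pil_image)  # pass this to the depth-anything processor instead of pil_image
```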
2024-03-14 20:14:53 +11:00
Lincoln Stein
1bd8e33f8c
Work around missing core conversion model issue
...
- This adds additional logic to the safetensors->diffusers conversion script to check for and install missing core conversion models at runtime.
- Fixes #5934
2024-03-14 16:52:01 +11:00
psychedelicious
3fd824306c
feat(mm): probe for main model default settings
...
Currently, this is just the width and height, derived from the model base.
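A hypothetical sketch of deriving width/height defaults from the model base. The mapping values follow common Stable Diffusion conventions and are not taken from the codebase:

```python
# Illustrative mapping only; the real probe derives defaults from the model's base type.
DEFAULT_SIZE_BY_BASE = {
    "sd-1": (512, 512),
    "sd-2": (768, 768),
    "sdxl": (1024, 1024),
}


def default_settings_for_base(base: str) -> dict:
    width, height = DEFAULT_SIZE_BY_BASE.get(base, (512, 512))
    return {"width": width, "height": height}
```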
2024-03-14 16:03:37 +11:00
psychedelicious
b9f1a4bd65
feat(nodes): add w/h defaults for models
2024-03-14 16:03:37 +11:00
psychedelicious
731942dbed
feat(nodes): add constraints & descriptions to default settings
2024-03-14 16:03:37 +11:00
psychedelicious
4117cea5bf
tidy(mm): remove misplaced comment
2024-03-14 15:54:42 +11:00
psychedelicious
9fcd67b5c0
feat(mm): add algorithm prefix to hashes
...
For example:
- md5:a0cd925fc063f98dbf029eee315060c3
- sha1:9e362940e5603fdc60566ea100a288ba2fe48b8c
- blake3:ce3f0c5f3c05d119f4a5dcaf209b50d3149046a0d3a9adee9fed4c83cad6b4d0
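A small illustration of the prefix format above, using hashlib (the BLAKE3 case uses the separate `blake3` package in practice):

```python
# Illustrative only: prepend the algorithm name to the digest, e.g. "sha1:9e36...".
import hashlib
from pathlib import Path


def prefixed_hash(path: Path, algorithm: str = "sha1") -> str:
    digest = hashlib.new(algorithm, path.read_bytes()).hexdigest()
    return f"{algorithm}:{digest}"
```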
2024-03-14 15:54:42 +11:00
psychedelicious
eb6e6548ed
feat(mm): faster hashing for spinning disk HDDs
...
BLAKE3 has poor performance on spinning disks when parallelized. See https://github.com/BLAKE3-team/BLAKE3/issues/31
- Replace `skip_model_hash` setting with `hashing_algorithm`. Any algorithm we support is accepted.
- Add `random` algorithm: hashes a UUID with BLAKE3 to create a random "hash". Equivalent to the previous skip functionality.
- Add `blake3_single` algorithm: hashes on a single thread using BLAKE3, fixing the aforementioned performance issue (see the sketch after this list)
- Update model probe to accept the algorithm to hash with as an optional arg, defaulting to `blake3`
- Update all calls of the probe to use the app's configured hashing algorithm
- Update an external script that probes models
- Update tests
- Move ModelHash into its own module to avoid circular import issues
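A rough sketch of the dispatch described above. The real ModelHash class differs, and hashlib stands in for the `blake3` package calls:

```python
# Sketch only: choose a hashing strategy from the configured hashing_algorithm.
import hashlib
import uuid
from pathlib import Path


def hash_model(path: Path, hashing_algorithm: str = "blake3") -> str:
    if hashing_algorithm == "random":
        # Equivalent to the old skip_model_hash: a random "hash" derived from a UUID
        # (the commit hashes the UUID with BLAKE3; sha256 stands in here).
        return f"random:{hashlib.sha256(uuid.uuid4().bytes).hexdigest()}"
    if hashing_algorithm in ("blake3", "blake3_single"):
        # blake3_single hashes on one thread to avoid the spinning-disk slowdown linked above;
        # both use the blake3 package in the real code. sha256 stands in for the sketch.
        return f"{hashing_algorithm}:{hashlib.sha256(path.read_bytes()).hexdigest()}"
    return f"{hashing_algorithm}:{hashlib.new(hashing_algorithm, path.read_bytes()).hexdigest()}"
```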
2024-03-14 15:54:42 +11:00
blessedcoolant
af660163ca
chore: cleanup DepthAnything code
2024-03-13 20:35:52 +05:30
psychedelicious
01207a2fa5
fix(mm): config.json to indicate diffusers model
2024-03-13 21:02:29 +11:00
Jennifer Player
d0800c4888
ui consistency, moved is_diffusers logic to backend, extended HuggingFaceMetadata, removed logic from service
2024-03-13 21:02:29 +11:00
Jennifer Player
3a5314f1ca
install model if diffusers or single file, cleaned up backend logic to not mess with existing model install
2024-03-13 21:02:29 +11:00
Jennifer Player
f7cd3cf1f4
added hf models import tab and route for getting available hf models
2024-03-13 21:02:29 +11:00
Brandon Rising
7b393656de
Update l2i invoke and seamless to support AutoencoderTiny, remove attention processors if no mid_block is detected
2024-03-12 12:00:24 -04:00
Ryan Dick
9ee2e7ff25
Do not override log_memory_usage when debug logs are enabled. The speed cost of log_memory_usage=True is large. It is common to want debug logs without enabling log_memory_usage.
2024-03-12 09:48:50 +11:00
Brandon Rising
149ff758b9
Run ruff
2024-03-11 15:53:00 -04:00
Brandon Rising
65d415d5aa
Remove redundant with_suffix call
2024-03-11 15:53:00 -04:00
Brandon Rising
c74c1927ec
Gracefully error without deleting invokeai.yaml
2024-03-11 15:53:00 -04:00
Ryan Dick
145bb45858
Remove dead code related to an old symmetry feature.
2024-03-10 00:13:18 -06:00
psychedelicious
c47dbf7258
docs(mm): format docstrings for ModelSearch
2024-03-10 12:09:47 +11:00
psychedelicious
92b2e8186a
tidy(mm): simplify types for ModelSearch
...
- Use `set` instead of `Set`
- Methods accept only `Path`s
2024-03-10 12:09:47 +11:00
psychedelicious
70a88c6b99
docs(mm): update docstrings for ModelSearch
2024-03-10 12:09:47 +11:00
psychedelicious
56e7c04475
tidy(mm): remove extraneous dependencies in model search
...
- `config` is unused
- `stats` is created on instantiation
- `logger` uses the app logger
2024-03-10 12:09:47 +11:00
psychedelicious
bd5b43c00d
tidy(mm): ModelSearch cleanup
...
- No need for it to be a pydantic model. Just a class now.
- Remove the ABC; it made it hard to understand what was going on, as attributes were spread across the ABC and the implementation. Also, there is no other implementation.
- Add tests
2024-03-10 12:09:47 +11:00
Brandon Rising
8ba4b2a150
Run ruff
2024-03-08 15:36:14 -05:00
Brandon Rising
df12e12e09
Run ruff
2024-03-08 15:36:14 -05:00
Brandon Rising
e52274ecac
Experiment with using absolute paths within model management
2024-03-08 15:36:14 -05:00
psychedelicious
a10dccdd43
fix(mm): fix bug in control adapter probe default settings
...
Wasn't checking for matches correctly.
2024-03-08 12:44:58 -05:00
psychedelicious
50bb9a6b41
fix(mm): remove default settings from IP adapter config
2024-03-08 12:44:58 -05:00
psychedelicious
13bb3c5e15
feat(mm): add control adapter default settings while probing
2024-03-08 12:44:58 -05:00
psychedelicious
80c2a4b925
feat(mm): add AnyDefaultSettings union
2024-03-08 12:44:58 -05:00
psychedelicious
8ce485b036
feat(mm): add default settings for control adapters
...
Only includes `preprocessor` at this time.
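A hypothetical shape for these defaults; the field name comes from the commit, while the class name and example value are illustrative:

```python
# Illustrative only; mirrors "only includes preprocessor at this time".
from typing import Optional

from pydantic import BaseModel


class ControlAdapterDefaultSettings(BaseModel):
    preprocessor: Optional[str] = None  # e.g. "canny_image_processor"
```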
2024-03-08 12:44:58 -05:00
psychedelicious
6fc3e86061
tidy(mm): only main models get the main default settings
2024-03-08 12:44:58 -05:00
Brandon Rising
b6065d6328
Run ruff with newest version of ruff
2024-03-08 13:59:59 +11:00
Brandon Rising
04229f4a21
Run ruff
2024-03-08 13:59:59 +11:00
Brandon Rising
952d97741e
Remove CivitAI from tests and documentation
2024-03-08 13:59:59 +11:00
Brandon Rising
d1f859a446
Remove CivitAI model install resources
2024-03-08 13:59:59 +11:00
psychedelicious
bbcbcd9b63
fix(mm): only loras and main models get trigger_phrases
2024-03-08 12:26:35 +11:00
Jennifer Player
4af5a09a68
cleanup
2024-03-06 21:57:41 -05:00
Jennifer Player
aa88fadc30
use webp images
2024-03-06 21:57:41 -05:00
Jennifer Player
8411029d93
get model image url from model config, added thumbnail formatting for images
2024-03-06 21:57:41 -05:00
Jennifer Player
8a68355926
got model images displaying, still need to clean up types and unused code
2024-03-06 21:57:41 -05:00
Jennifer Player
2f6964bfa5
fetching model image, still not working
2024-03-06 21:57:41 -05:00
psychedelicious
132790eebe
tidy(nodes): use canonical capitalizations
2024-03-07 10:56:59 +11:00
psychedelicious
afd9ae7712
tidy(mm): remove convenience methods from high level model manager service
...
These were added as a hold-me-over for the nodes API changes and are no longer needed. A followup commit will fix the nodes API to not rely on them.
2024-03-07 10:56:59 +11:00
Brandon Rising
46f32c5e3c
Remove references to the no longer existing invokeai.app.services.model_metadata package
2024-03-05 19:58:25 -05:00
psychedelicious
e866d90ab2
tidy(mm): remove unused method on probe
2024-03-05 23:50:19 +11:00
psychedelicious
e8797787cf
fix(mm): fix incorrect calls to update_model
2024-03-05 23:50:19 +11:00
psychedelicious
7c9128b253
tidy(mm): use canonical capitalization for all model-related enums, classes
...
For example, "Lora" -> "LoRA", "Vae" -> "VAE".
2024-03-05 23:50:19 +11:00
psychedelicious
3030a34b88
fix(mm): make type and format required in openapi schema for model config
2024-03-05 23:50:19 +11:00
psychedelicious
a8b6635050
fix(mm): make key required in openapi schema for model config
2024-03-05 23:50:19 +11:00
psychedelicious
5551cf8ac4
feat(mm): revise update_model to use ModelRecordChanges
2024-03-05 23:50:19 +11:00
psychedelicious
37b969d339
tidy(mm): add default_settings to model config
2024-03-05 23:50:19 +11:00
psychedelicious
c953e61294
tidy(mm): "trigger_words" -> "trigger_phrases"
2024-03-05 23:50:19 +11:00
psychedelicious
93dd3c848e
tidy(mm): remove unused code in select_hf_files.py
2024-03-05 23:50:19 +11:00
psychedelicious
3391c19926
chore: ruff
2024-03-05 23:50:19 +11:00
psychedelicious
0f60b1ced4
fix(mm): use .value for model config discriminators
...
There is a breaking change in Python 3.11 related to how enums with `str` as a mixin are formatted. This appears to have not caused any grief for us until now.
Re-jigger the discriminator setup to use `.value` so everything works on both Python 3.10 and 3.11.
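The behavior change in question, reduced to a minimal example (the enum here is illustrative):

```python
# On Python 3.10, formatting a str-mixin enum yields its value; on 3.11 it yields "ClassName.MEMBER".
from enum import Enum


class ModelType(str, Enum):
    Main = "main"


# Python 3.10: f"{ModelType.Main}" == "main"
# Python 3.11: f"{ModelType.Main}" == "ModelType.Main"
# Using .value explicitly gives the same result on both versions:
assert f"{ModelType.Main.value}" == "main"
```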
2024-03-05 23:50:19 +11:00
psychedelicious
44c40d7d1a
refactor(mm): remove unused metadata logic, fix tests
...
- Metadata is merged with the config. We can simplify the MM substantially and remove the handling for metadata.
- Per discussion, we don't have an ETA for frontend implementation of tags, and with the realization that the tags from CivitAI are largely useless, there's no reason to keep tags in the MM right now. When we are ready to implement tags on the frontend, we can refer back to the implementation here and use it if it supports the design.
- Fix all tests.
2024-03-05 23:50:19 +11:00
psychedelicious
c3aa985c93
refactor(mm): get metadata working
2024-03-05 23:50:19 +11:00
psychedelicious
7cb0da1f66
refactor(mm): wip schema changes
2024-03-05 23:50:19 +11:00
psychedelicious
3534366146
fix(mm): fix extraneous downloaded files in diffusers
...
Sometimes, diffusers model components (tokenizer, unet, etc.) have multiple weights files in the same directory. In this situation, we assume the files are different versions of the same weights. For example, we may have multiple formats (`.bin`, `.safetensors`) with different precisions. When downloading model files, we want to select only the best of these files for the requested format and precision/variant.
The previous logic assumed that each model weights file would have the same base filename, but this assumption was not always true. The logic is revised to score each file and choose the best-scoring file, resulting in only a single file being downloaded for each submodel/subdirectory.
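A simplified sketch of the scoring idea; the criteria and weights here are illustrative, not the actual values used:

```python
# Sketch only: score each candidate weights file and keep the best one per submodel folder.
from pathlib import Path


def score_weights_file(path: Path, variant: str = "fp16") -> int:
    score = 0
    if path.suffix == ".safetensors":  # prefer safetensors over .bin
        score += 2
    if variant and variant in path.name:  # prefer the requested precision/variant
        score += 1
    return score


def best_weights_file(candidates: list[Path]) -> Path:
    return max(candidates, key=score_weights_file)
```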
2024-03-05 23:50:19 +11:00
psychedelicious
f2b5f8753f
tidy(mm): remove json_schema_extra from config - not needed
2024-03-05 23:50:19 +11:00
psychedelicious
94e1e64296
chore: ruff
2024-03-05 23:50:19 +11:00
psychedelicious
2411bf53c0
tidy(mm): better descriptions for model configs
2024-03-05 23:50:19 +11:00
psychedelicious
9378e47a06
feat(mm): add source_type to model configs
2024-03-05 23:50:19 +11:00
psychedelicious
4471ea8ad1
refactor(mm): simplify model metadata schemas
2024-03-05 23:50:19 +11:00
psychedelicious
61b737bb9f
tidy(mm): remove update method from ModelConfigBase
...
It's only used in the soon-to-be-removed model merge logic
2024-03-05 23:50:19 +11:00
psychedelicious
a8cd3dfc99
refactor(mm): add models table (schema WIP), rename "original_hash" -> "hash"
2024-03-05 23:50:19 +11:00
psychedelicious
0cce582f2f
tidy(mm): remove current_hash
2024-03-05 23:50:19 +11:00
psychedelicious
bd4fd9693d
tidy(mm): rename ckpt "last_modified" -> "converted_at"
...
Clarify what this timestamp means
2024-03-05 23:50:19 +11:00
psychedelicious
9b40c28144
tidy(mm): rename ckpt "config" -> "config_path"
2024-03-05 23:50:19 +11:00
psychedelicious
16a5d718bf
fix(mm): add config field to ckpt vaes
2024-03-05 23:50:19 +11:00
psychedelicious
76cbc745e1
refactor(mm): add CheckpointConfigBase for all ckpt models
2024-03-05 23:50:19 +11:00
psychedelicious
0a614943f6
fix(mm): fix broken get_model_discriminator_value
2024-03-05 23:50:19 +11:00
psychedelicious
e426096d32
fix(mm): misc typing fixes for model loaders
2024-03-05 23:50:19 +11:00
psychedelicious
c561cd751f
fix(mm): use correct import path for ConfigMixin, ModelMixin
2024-03-05 23:50:19 +11:00
psychedelicious
af9298f0ef
tidy(mm): tidy class names in config.py
2024-03-05 23:50:19 +11:00
psychedelicious
5b74117836
fix(mm): use generic for model loader registry
...
This preserves the typing for classes using the decorator
2024-03-05 23:50:19 +11:00
psychedelicious
38474c9797
fix(mm): use correct import path for ModelMixin
2024-03-05 23:50:19 +11:00
psychedelicious
b880a31039
refactor(mm): remove ztsnr_training field on _MainConfig
...
This is used to determine the CFG Rescale Multiplier setting. We'll handle this in the UI as a default setting.
2024-03-05 23:50:19 +11:00
psychedelicious
dd31bc4586
refactor(mm): remove vae field on _MainConfig
...
We will handle default VAE selection in the UI.
2024-03-05 23:50:19 +11:00
psychedelicious
316573df2d
feat(mm): use callable discriminator for AnyModelConfig union
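A sketch of the pydantic v2 pattern this refers to. The two config classes and their fields are illustrative; only the callable-discriminator mechanism is the point:

```python
# Sketch only: pydantic v2 supports a callable discriminator on a tagged union.
from typing import Annotated, Union

from pydantic import BaseModel, Discriminator, Tag, TypeAdapter


class MainConfig(BaseModel):
    type: str = "main"


class LoRAConfig(BaseModel):
    type: str = "lora"


def get_model_discriminator_value(v) -> str:
    # Works for both dicts (validation input) and model instances.
    return v.get("type") if isinstance(v, dict) else v.type


AnyModelConfig = Annotated[
    Union[Annotated[MainConfig, Tag("main")], Annotated[LoRAConfig, Tag("lora")]],
    Discriminator(get_model_discriminator_value),
]

adapter = TypeAdapter(AnyModelConfig)
config = adapter.validate_python({"type": "lora"})  # -> LoRAConfig instance
```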
2024-03-05 23:50:19 +11:00