Commit Graph

10288 Commits

Author SHA1 Message Date
psychedelicious
437a413ca3 chore(ui): typegen 2024-03-14 10:53:57 +11:00
psychedelicious
4492bedd19 tidy(nodes): use ModelIdentifierField for model metadata
Until recently, the ModelIdentifierField had a different shape than the ModelMetadataField. They are now the same, so we can re-use the ModelIdentifierField.
2024-03-14 10:53:57 +11:00
psychedelicious
db12ce95a8 fix(ui): invalid collect node error w/ control adapters
The graph builders used awaited functions within `Array.prototype.forEach` loops. `forEach` does not await async callbacks, so the loop returns before the awaited work completes. This caused graphs to be enqueued before they were fully constructed.

Changed to `for..of` loops to fix this.
2024-03-14 10:53:57 +11:00
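For context on the `forEach` pitfall described in the fix above: `Array.prototype.forEach` ignores the promises returned by async callbacks, while a `for..of` loop lets `await` actually pause iteration. A minimal sketch of the difference, using hypothetical `addNodeToGraph`/`enqueueGraph` helpers rather than the real graph-builder API:

```ts
// Hypothetical sketch - `addNodeToGraph` and `enqueueGraph` stand in for the
// real async graph-building and enqueue steps.
async function addNodeToGraph(graph: string[], node: string): Promise<void> {
  await Promise.resolve(); // simulate async work (e.g. fetching node config)
  graph.push(node);
}

function enqueueGraph(graph: string[]): void {
  console.log(`enqueueing graph with ${graph.length} nodes`);
}

async function buildGraphBroken(nodes: string[]): Promise<void> {
  const graph: string[] = [];
  nodes.forEach(async (node) => {
    // forEach discards this promise, so nothing waits for it
    await addNodeToGraph(graph, node);
  });
  enqueueGraph(graph); // runs immediately - the graph may still be empty here
}

async function buildGraphFixed(nodes: string[]): Promise<void> {
  const graph: string[] = [];
  for (const node of nodes) {
    // for..of respects await, so each node is added before the next iteration
    await addNodeToGraph(graph, node);
  }
  enqueueGraph(graph); // runs only after every node has been added
}
```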
psychedelicious
ee3a1a95ef fix(ui): control adapters require control images
There wasn't enough validation of control adapters during graph building. It was possible for a graph to be built with an empty collect node, causing an error. Addressed with an extra check.

This should never happen in practice, because the invoke button should be disabled if an invalid CA is active.
2024-03-14 10:53:57 +11:00
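A rough sketch of the kind of extra validation described in the fix above, assuming a simplified `ControlAdapter` shape; the type and field names are illustrative, not the actual graph-builder types:

```ts
// Simplified, hypothetical control adapter shape.
interface ControlAdapter {
  id: string;
  isEnabled: boolean;
  controlImage: string | null;
}

// Only adapters that are enabled AND have a control image make it into the
// collect node, so the graph is never built with an empty collector.
function getValidControlAdapters(adapters: ControlAdapter[]): ControlAdapter[] {
  return adapters.filter((ca) => ca.isEnabled && ca.controlImage !== null);
}
```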
psychedelicious
4bb5aba70e feat(ui): only fetch TIs on first load, add comment 2024-03-14 07:38:09 +11:00
Mary Hipp
cd55c23713 initiate TI model query when socket connects so the user doesn't have to wait when opening prompt trigger phrases 2024-03-14 07:38:09 +11:00
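One common way to implement this kind of prefetch with Redux Toolkit (which the UI uses) is to dispatch the query's `initiate` thunk from a listener on the socket-connected action. A hedged sketch; the api slice, endpoint, and action names below are assumptions, not the actual InvokeAI code:

```ts
import { createAction, createListenerMiddleware } from '@reduxjs/toolkit';
// `modelsApi` stands in for the app's RTK Query api slice; the endpoint name
// `getTextualInversionModels` is an assumption, not the real one.
import { modelsApi } from './services/modelsApi';

// Hypothetical action dispatched when the socket connects.
export const socketConnected = createAction('socket/connected');

export const listenerMiddleware = createListenerMiddleware();

listenerMiddleware.startListening({
  actionCreator: socketConnected,
  effect: async (_action, { dispatch }) => {
    // Kick off the textual inversion (TI) model query as soon as the socket
    // connects, so the data is already cached when the user opens the
    // prompt trigger phrases UI.
    dispatch(modelsApi.endpoints.getTextualInversionModels.initiate());
  },
});
```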
Mary Hipp
1d2743af1b remove log 2024-03-14 07:25:48 +11:00
Mary Hipp
99d2099ccd add key for controladapter CustomSelect too 2024-03-14 07:25:48 +11:00
Mary Hipp
b64a693f16 try adding a key to force rerender when items load 2024-03-14 07:25:48 +11:00
blessedcoolant
9d523a3094
chore: cleanup DepthAnything code (#5945)
## What type of PR is this? (check all applicable)

- [x] Optimization

## Description

This was merged into `next` but never carried over to `main`, so cleaning up again.
2024-03-13 20:46:54 +05:30
blessedcoolant
af660163ca chore: cleanup DepthAnything code 2024-03-13 20:35:52 +05:30
psychedelicious
7e4b462fca docs: OVERVIEW.md typo 2024-03-13 22:43:20 +11:00
psychedelicious
4468dd6948 docs: update OVERVIEW.md
Update pkg scripts.
2024-03-13 22:43:20 +11:00
psychedelicious
4f39e248dd docs: update OVERVIEW.md
Fix links
2024-03-13 22:43:20 +11:00
psychedelicious
44b3e5d43f docs: update INVOCATION_API.md
Add blurb about `WithMetadata` and `WithBoard` mixins.
2024-03-13 22:43:20 +11:00
psychedelicious
8894a9e48a docs: update WORKFLOWS.md 2024-03-13 22:43:20 +11:00
psychedelicious
c73f58e486 docs: move frontend docs to mkdocs 2024-03-13 22:43:20 +11:00
psychedelicious
614fece147 chore(ui): prettier 2024-03-13 21:02:29 +11:00
psychedelicious
8ef8082d65 feat(ui): style add model panel 2024-03-13 21:02:29 +11:00
psychedelicious
d93d4afbb7 feat(ui): style HF scan tab 2024-03-13 21:02:29 +11:00
psychedelicious
01207a2fa5 fix(mm): config.json indicates diffusers model 2024-03-13 21:02:29 +11:00
Jennifer Player
d0800c4888 ui consistency, moved is_diffusers logic to backend, extended HuggingFaceMetadata, removed logic from service 2024-03-13 21:02:29 +11:00
Jennifer Player
2a300ecada updated add model copy, added search to hugging face results 2024-03-13 21:02:29 +11:00
Jennifer Player
90340a39c7 clean up python errors 2024-03-13 21:02:29 +11:00
Jennifer Player
ee77abb4fe updated simple install button to match other tabs 2024-03-13 21:02:29 +11:00
Jennifer Player
004bca5c42 updated endpoint types 2024-03-13 21:02:29 +11:00
Jennifer Player
5ad048a161 fixed error handling 2024-03-13 21:02:29 +11:00
Jennifer Player
6369ccd05e added placeholders, updated some copy 2024-03-13 21:02:29 +11:00
Jennifer Player
3a5314f1ca install model if diffusers or single file, cleaned up backend logic to not mess with existing model install 2024-03-13 21:02:29 +11:00
Jennifer Player
4c0896e436 removed log 2024-03-13 21:02:29 +11:00
Jennifer Player
f7cd3cf1f4 added hf models import tab and route for getting available hf models 2024-03-13 21:02:29 +11:00
psychedelicious
efea1a8a7d ci: add always_run input to checks & tests, use this on release workflow
This bypasses the `changed-files` check, and forces the checks to run. The release workflow sets this flag to ensure that the checks and tests are always run for a release.
2024-03-13 15:32:42 +11:00
Mary Hipp
d0d695c020 disable trigger phrase form if empty 2024-03-12 21:08:15 -04:00
Jennifer Player
2a648da557 updated model manager to display when import item is cancelled 2024-03-13 09:18:05 +11:00
blessedcoolant
54f1a1f952
Update l2i invoke and seamless to support AutoencoderTiny, remove attention processors if no mid_block is detected (#5936)

## What type of PR is this? (check all applicable)

- [x] Bug Fix

## Have you discussed this change with the InvokeAI team?

- [x] Yes

## Have you updated all relevant documentation?

- [x] No


## Description
The `l2i` invocation throws an assertion error when run with `madebyollin/taesdxl` because that VAE requires a different diffusers class (AutoencoderTiny) to load. This is a small PR updating seamless and `l2i` to accept AutoencoderTiny models and not throw exceptions while processing them.

## QA Instructions, Screenshots, Recordings

![Screenshot 2024-03-12 at 12 04 29 PM](https://github.com/invoke-ai/InvokeAI/assets/58442074/34a17e44-d911-4fef-8fc1-71f7b688688c)

Run an SDXL pipeline using a VAE that requires AutoencoderTiny and validate that the image successfully encodes and decodes.

## Merge Plan

This PR can be merged when approved
2024-03-12 21:52:32 +05:30
Brandon Rising
8d2a4db902 Found another instance of expecting a mid_block on the decoder in a vae 2024-03-12 12:11:38 -04:00
Brandon Rising
7b393656de Update l2i invoke and seamless to support AutoencoderTiny, remove attention processors if no mid_block is detected 2024-03-12 12:00:24 -04:00
psychedelicious
43948e0758 feat(ui): add setting for always show image size badge 2024-03-12 18:52:23 +11:00
psychedelicious
cc03fcbcb6 style(ui): tweak image dimension badge overlay styles 2024-03-12 18:52:23 +11:00
Rohinish
d1e445fa49 fix(ui): changed to theme tokens 2024-03-12 18:52:23 +11:00
Rohinish
adba8489f2 fix(ui): made changes to avoid overlapping 2024-03-12 18:52:23 +11:00
Rohinish
d919022ba5 fix(ui): fixed requested changes and made the badge display on hover 2024-03-12 18:52:23 +11:00
Rohinish
e076898798 fix(ui): logic to remove badge for small image size 2024-03-12 18:52:23 +11:00
Rohinish
9f19b766a4 feat(ui): Add image size badge to gallery images 2024-03-12 18:52:23 +11:00
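Taken together, the badge commits above describe a gallery thumbnail overlay that shows the image dimensions on hover and is hidden for small images where it would cover the thumbnail. A rough, hypothetical sketch of that idea (component names, styling, and the size threshold are illustrative, not the actual InvokeAI component):

```tsx
import { useState } from 'react';

type Props = { width: number; height: number; thumbnailUrl: string };

function GalleryImage({ width, height, thumbnailUrl }: Props) {
  const [isHovered, setIsHovered] = useState(false);
  // Suppress the badge for small images, where it would cover most of the thumbnail.
  const showBadge = isHovered && width >= 64 && height >= 64;
  return (
    <div
      onMouseEnter={() => setIsHovered(true)}
      onMouseLeave={() => setIsHovered(false)}
      style={{ position: 'relative', display: 'inline-block' }}
    >
      <img src={thumbnailUrl} alt="" />
      {showBadge && (
        <span style={{ position: 'absolute', bottom: 2, right: 2 }}>
          {width}×{height}
        </span>
      )}
    </div>
  );
}

export default GalleryImage;
```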
psychedelicious
4688623711 ci: add missing permission to release workflow 2024-03-12 10:16:38 +11:00
Brandon Rising
be951da99d {release} 4.0.0rc1 2024-03-12 10:05:03 +11:00
Ryan Dick
9ee2e7ff25 Do not override log_memory_usage when debug logs are enabled. The speed cost of log_memory_usage=True is large, and it is common to want debug logs without enabling log_memory_usage. 2024-03-12 09:48:50 +11:00
Brandon Rising
149ff758b9 Run ruff 2024-03-11 15:53:00 -04:00
Brandon Rising
65d415d5aa Remove redundant with_suffix call 2024-03-11 15:53:00 -04:00
Brandon Rising
c74c1927ec Gracefully error without deleting invokeai.yaml 2024-03-11 15:53:00 -04:00