blessedcoolant
337399ff7c
fix: Add API tags for Scanned Models
2023-07-18 11:57:45 +12:00
Lincoln Stein
e03e43281b
Merge branch 'mm-ui' of github.com:blessedcoolant/InvokeAI into mm-ui
2023-07-17 10:00:36 -04:00
Lincoln Stein
08854b6d68
keep model path consistent with model manager key in model update api
2023-07-17 10:00:28 -04:00
blessedcoolant
0712294c17
fix: Model Manager light mode color fixes
2023-07-18 00:29:20 +12:00
Lincoln Stein
0ea8d3c30c
prevent crash on rename operation on models in models directory
2023-07-17 07:50:06 -04:00
Lincoln Stein
84a13ff8e1
Merge branch 'mm-ui' of github.com:blessedcoolant/InvokeAI into mm-ui
2023-07-17 07:29:35 -04:00
Lincoln Stein
3fba262c94
expose paths as absolute to web api
2023-07-17 07:29:26 -04:00
Lincoln Stein
107ca6bf47
expose model paths as absolute to web models API
2023-07-17 07:26:05 -04:00
blessedcoolant
cbfd1d1b27
Merge branch 'main' into feat/standalone_diffusers_ti
2023-07-17 22:01:52 +12:00
skunkworxdark
f767bf2330
use FileNotFoundError instead of "File path not found"
2023-07-17 05:49:09 -04:00
skunkworxdark
b1008af696
apply changes as suggested by @psychedelicious in PR comments.
...
- filename -> file_path
- pre and post prompt changed to optional
- clearer pre and post prompt descriptions
- handle pre and post prompt passed as None
- max_prompts defaults to 1 instead of 0 to avoid accidentally processing large prompt files with it set to 0 when adding a new node.
2023-07-17 05:49:09 -04:00
skunkworxdark
956011066d
Added class PromptsFromFileInvocation to prompt.py: a new PromptFromFile custom node that reads prompts from a file (one prompt per line) and outputs them as a prompt collection, with inputs for filename, pre_prompt, post_prompt, start line number, and max_prompts.
2023-07-17 05:49:09 -04:00
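The two skunkworxdark entries above describe the PromptsFromFile node and its refined inputs (file_path, optional pre/post prompts, start line, max_prompts defaulting to 1). As a rough illustration only, here is a minimal standalone sketch of that line-reading behaviour, assuming a plain helper function rather than InvokeAI's actual invocation class; the name prompts_from_file and its defaults are taken from the commit messages, not from the repository.

```python
from typing import Optional


def prompts_from_file(
    file_path: str,
    pre_prompt: Optional[str] = None,
    post_prompt: Optional[str] = None,
    start_line: int = 1,
    max_prompts: int = 1,
) -> list[str]:
    """Read prompts from a text file, one prompt per line.

    Each line is wrapped with the optional pre_prompt / post_prompt,
    starting at start_line (1-based). max_prompts limits how many
    prompts are collected; 0 means "no limit".
    """
    prompts: list[str] = []
    # A bad path raises FileNotFoundError, matching the earlier fix.
    with open(file_path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            if line_no < start_line:
                continue
            if max_prompts > 0 and len(prompts) >= max_prompts:
                break
            prompts.append(f"{pre_prompt or ''}{line.strip()}{post_prompt or ''}")
    return prompts
```

Defaulting max_prompts to 1 means a freshly added node only processes a single line until the limit is raised deliberately, which is the rationale given in the PR comments above.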
blessedcoolant
cfdaa30d44
feat: Scan Models add now differentiates between ckpt and diffusers models
2023-07-17 19:40:08 +12:00
blessedcoolant
f398fe4136
fix: Merge models not respecting save directory
2023-07-17 17:59:05 +12:00
blessedcoolant
41e7b008fb
feat: Add search to Scanned Models
2023-07-17 17:32:34 +12:00
blessedcoolant
98e6a56714
fix: Model Manager jank / bugs / refinement
2023-07-17 17:09:41 +12:00
blessedcoolant
cbd5be73d2
feat: Add Scan Models Advanced Add
2023-07-17 16:44:01 +12:00
blessedcoolant
38e6e3b36b
feat: Add Quick Add To Scan Model
2023-07-17 16:07:38 +12:00
blessedcoolant
540f40c293
fix: Better file and component naming for Add Models
2023-07-17 13:58:11 +12:00
blessedcoolant
641b90cc3f
chore: regen types
2023-07-17 13:50:35 +12:00
blessedcoolant
aebd595607
Merge branch 'main' into mm-ui
2023-07-17 13:49:25 +12:00
Lincoln Stein
ed88e72412
correct "cannot assign to field 'unconditioned_embeddings'" error
2023-07-16 21:06:40 -04:00
Sergey Borisov
6aefd8600a
Fix error with long prompts when ControlNet is used
2023-07-16 21:06:40 -04:00
Kent Keirsey
675a92401c
Merge branch 'main' into lstein/default-model-install
2023-07-16 19:32:59 -04:00
Sergey Borisov
b61c83e836
Allow the .bin extension to detect a diffusers-TI provided as a file
2023-07-17 00:32:17 +03:00
Lincoln Stein
2bc3e36bc0
add missing exception name
2023-07-16 16:14:28 -04:00
Lincoln Stein
6fbb5ce780
add renaming capabilities to model update API route
2023-07-16 14:17:05 -04:00
blessedcoolant
dabd2bf301
fix: Re-add model name to edit forms
...
Will be needed when we implement changing name and base model type.
2023-07-16 16:15:53 +12:00
blessedcoolant
92029e69c6
feat: Update Checkpoint Model Edit to use config picker
2023-07-16 15:48:44 +12:00
blessedcoolant
5351171d0e
cleanup: Scan Models component (to begin anew)
2023-07-16 15:29:25 +12:00
blessedcoolant
5b047baeb0
fix: Mantine Required icon being on a new line
2023-07-16 15:29:01 +12:00
blessedcoolant
d93d42af4a
feat: Add Manual Checkpoint / Safetensor Models
2023-07-16 15:21:49 +12:00
blessedcoolant
421fcb761b
feat: Manual Add Diffusers Model
2023-07-16 14:20:27 +12:00
blessedcoolant
2e0370d845
feat: Extract BaseModel and ModelVariant Selects
...
For reusability
2023-07-16 14:07:26 +12:00
psychedelicious
5d59dd4b97
feat(nodes): use correctly-typed configuration service in upscale node
2023-07-16 10:54:52 +10:00
psychedelicious
48a031dbaf
fix(nodes): fix typing of configuration service
2023-07-16 10:52:18 +10:00
Lincoln Stein
7fa394912d
Merge branch 'main' into lstein/default-model-install
2023-07-15 18:26:35 -04:00
Lincoln Stein
373beefd13
remove restoration option from invokeai.yaml
2023-07-15 18:26:19 -04:00
Lincoln Stein
6b0a158ffa
Merge branch 'main' into lstein/default-model-install
2023-07-15 18:23:34 -04:00
Lincoln Stein
c90345d6a3
deprecate the face restoration option
2023-07-15 18:23:32 -04:00
Lincoln Stein
70b12d9693
Merge branch 'main' into update-textual-inversion-training
2023-07-15 18:16:20 -04:00
Lincoln Stein
9faffa2245
revert inadvertent breaking change to config causing test failures (override)
2023-07-15 18:15:59 -04:00
Lincoln Stein
f66ead0819
Merge branch 'main' into update-textual-inversion-training
2023-07-15 17:44:45 -04:00
Lincoln Stein
6073cb8020
add documentation on the configuration system
2023-07-15 16:14:47 -04:00
psychedelicious
c7b547ea3e
feat(nodes): remove references to restoration services
...
- remove restoration services
- remove the restore faces nodes
- update tests
2023-07-16 01:12:39 +10:00
psychedelicious
8a1b9d1001
chore(ui): regen types
2023-07-16 01:06:57 +10:00
psychedelicious
74ca87ac9e
feat(nodes): add realesrgan node
2023-07-16 01:06:50 +10:00
Kent Keirsey
77b0129b4c
Merge branch 'main' into lstein/migrate-fix
2023-07-15 10:37:56 -04:00
Lincoln Stein
e01706f5f5
add fp16 support to controlnet models
2023-07-15 10:37:11 -04:00
Lincoln Stein
f504c7ebbd
Merge branch 'main' into lstein/migrate-fix
2023-07-15 10:13:44 -04:00