StAlKeR7779
889b77d3d6
Merge branch 'main' into save_vram
2023-07-18 16:55:48 +03:00
Sergey Borisov
fbbc4b3f69
Fixes
2023-07-18 16:51:16 +03:00
Sergey Borisov
bc11296a5e
Disable lazy offloading when the VRAM cache is disabled, move resulting tensors to the CPU (to avoid stacking VRAM tensors in the cache), fix text encoder not being freed (detach)
2023-07-18 16:20:25 +03:00
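A minimal sketch of the idea in the commit above, assuming torch tensors and an illustrative helper name rather than the actual model-manager API: when the VRAM cache is disabled, detach results and move them to the CPU so they do not accumulate in GPU memory.

```python
# Hedged sketch only; the function name and flag are illustrative.
import torch

def offload_result(tensor: torch.Tensor, vram_cache_enabled: bool) -> torch.Tensor:
    # With the VRAM cache disabled, keep results out of GPU memory by
    # detaching them from the autograd graph (frees encoder activations)
    # and moving them to the CPU.
    if not vram_cache_enabled and tensor.is_cuda:
        tensor = tensor.detach().to("cpu")
    return tensor
```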
psychedelicious
c19d48abd0
fix(nodes): fix inpaint cond logic for new compel version
...
thanks @StAlKeR7779
2023-07-18 22:39:34 +10:00
Lincoln Stein
9c3c393b84
merge with main
2023-07-18 07:00:55 -04:00
Lincoln Stein
c955c13b6f
Merge branch 'sdxl-support' of github.com:invoke-ai/InvokeAI into sdxl-support
2023-07-17 23:49:48 -04:00
Lincoln Stein
ef31837167
fix caption on sdxl raw prompt
2023-07-17 23:49:23 -04:00
blessedcoolant
b08ad28daa
fix: typo in logger statement (import_model)
2023-07-18 15:17:52 +12:00
Lincoln Stein
1353bf98b3
add specific exception for model probe failures
2023-07-17 23:08:39 -04:00
Lincoln Stein
af1c1ab51f
importing an unrecognized model now gives "Unsupported Media Type" error
2023-07-17 22:33:05 -04:00
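A hedged sketch of how a model-probe failure on import can surface as an HTTP 415 "Unsupported Media Type" response in FastAPI; the exception class and probe helper below are illustrative, not the actual InvokeAI route or probe code.

```python
# Illustrative only; names other than HTTPException are assumptions.
from fastapi import APIRouter, HTTPException

router = APIRouter()

class InvalidModelException(Exception):
    """Hypothetical stand-in for the model-probe failure exception."""

def probe_model(path: str) -> dict:
    # Hypothetical probe that cannot identify the file's model type.
    raise InvalidModelException(f"Could not determine model type of {path}")

@router.post("/models/import")
async def import_model(path: str) -> dict:
    try:
        return probe_model(path)
    except InvalidModelException as e:
        # Unrecognized model files map to 415 Unsupported Media Type.
        raise HTTPException(status_code=415, detail=str(e))
```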
blessedcoolant
13da881953
Merge branch 'main' into sdxl-support
2023-07-18 13:34:07 +12:00
blessedcoolant
ec3c15ead0
Merge branch 'main' into mm-ui
2023-07-18 12:58:57 +12:00
Sergey Borisov
ada9b06e48
Implement compel prompt nodes for sdxl
2023-07-18 01:49:45 +03:00
Lincoln Stein
6ae10798b0
Merge branch 'main' into feat/model-events
2023-07-17 17:15:12 -04:00
Lincoln Stein
08854b6d68
keep model path consistent with model manager key in model update api
2023-07-17 10:00:28 -04:00
skunkworxdark
f767bf2330
use FileNotFoundError instead of "File path not found"
2023-07-17 05:49:09 -04:00
skunkworxdark
b1008af696
apply changes as suggested by @psychedelicious in PR comments.
...
- filename -> file_path
- pre and post prompt changed to optional
- clearer pre and post prompt descriptions
- handle pre and post prompt passed as None
- max_prompts defaults to 1 instead of 0 to avoid accidentally processing large prompt files with it set to 0 when adding a new node.
2023-07-17 05:49:09 -04:00
skunkworxdark
956011066d
Added class PromptsFromFileInvocation to prompt.py: a new PromptFromFile custom node that reads prompts from a file, one line per prompt, and outputs them as a prompt collection, with inputs for filename, pre_prompt, post_prompt, start line number, and max_prompts
2023-07-17 05:49:09 -04:00
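The two entries above describe the node's behavior and the follow-up parameter changes; a plain-Python sketch of that behavior (illustrative only, not the actual PromptsFromFileInvocation class) might look like this:

```python
# Hedged illustration of the described behavior, not the node's real code.
import os
from typing import Optional

def prompts_from_file(
    file_path: str,
    pre_prompt: Optional[str] = None,
    post_prompt: Optional[str] = None,
    start_line: int = 1,
    max_prompts: int = 1,  # default 1 per the PR feedback; 0 means "no limit"
) -> list[str]:
    if not os.path.exists(file_path):
        raise FileNotFoundError(f"{file_path} not found")
    prompts: list[str] = []
    with open(file_path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            if line_no < start_line:
                continue
            if max_prompts > 0 and len(prompts) >= max_prompts:
                break
            prompts.append((pre_prompt or "") + line.strip() + (post_prompt or ""))
    return prompts
```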
blessedcoolant
e039771d07
fix: Incorrect type on SDXL Model Loader
2023-07-17 21:47:41 +12:00
psychedelicious
3e2a948007
Merge branch 'main' into feat/model-events
2023-07-17 17:36:20 +10:00
psychedelicious
ba12849685
fix(nodes): fix some model load events not emitting
...
Missed adding the `context` arg to them initially
2023-07-17 17:16:55 +10:00
Sergey Borisov
6aefd8600a
Fix error with long prompts when controlnet is used
2023-07-16 21:06:40 -04:00
Kent Keirsey
675a92401c
Merge branch 'main' into lstein/default-model-install
2023-07-16 19:32:59 -04:00
Lincoln Stein
6fbb5ce780
add renaming capabilities to model update API route
2023-07-16 14:17:05 -04:00
Lincoln Stein
cad3f96831
add model input to refiner
2023-07-16 12:38:04 -04:00
Lincoln Stein
6534288b75
refiner only has clip2, not clip
2023-07-16 12:36:38 -04:00
Lincoln Stein
0a2964d8c0
add differentiated sdxl and sdxl_refiner model loaders
2023-07-16 12:17:56 -04:00
Lincoln Stein
5206ddf9b2
truncate long prompts to avoid a crash with controlnet
2023-07-15 23:49:25 -04:00
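A hedged sketch of the truncation idea, assuming CLIP's 77-token context length and a torch-style conditioning tensor; the function name and shapes are illustrative, not the actual fix.

```python
# Illustrative only; the real change lives in the conditioning/controlnet code.
import torch

MAX_TOKENS = 77  # CLIP text-encoder context length

def truncate_conditioning(cond: torch.Tensor) -> torch.Tensor:
    """Clip a (batch, tokens, channels) conditioning tensor to MAX_TOKENS."""
    if cond.shape[1] > MAX_TOKENS:
        cond = cond[:, :MAX_TOKENS, :]
    return cond
```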
Sergey Borisov
fe78a08e37
Fix sd1/2 model conditionings
2023-07-16 06:24:24 +03:00
Sergey Borisov
c9c2229917
Separate prompts for sdxl and sdxl-refiner, add denoising start/end fields, add l2l node (supports both sdxl and sdxl-refiner), add fp32 to vae encode
2023-07-16 06:00:37 +03:00
psychedelicious
5d59dd4b97
feat(nodes): use correctly-typed configuration service in upscale node
2023-07-16 10:54:52 +10:00
psychedelicious
48a031dbaf
fix(nodes): fix typing of configuration service
2023-07-16 10:52:18 +10:00
Lincoln Stein
ccbfa5d862
resolve conflicts
2023-07-15 19:47:50 -04:00
Lincoln Stein
373beefd13
remove restoration option from invokeai.yaml
2023-07-15 18:26:19 -04:00
Lincoln Stein
6b0a158ffa
Merge branch 'main' into lstein/default-model-install
2023-07-15 18:23:34 -04:00
Lincoln Stein
c90345d6a3
deprecate the face restoration option
2023-07-15 18:23:32 -04:00
Lincoln Stein
9faffa2245
revert inadvertent breaking change to config causing test failures (override)
2023-07-15 18:15:59 -04:00
Lincoln Stein
6073cb8020
add documentation on the configuration system
2023-07-15 16:14:47 -04:00
psychedelicious
7b6159f8d6
feat(nodes): emit model loading events
...
- remove dependency on having access to a `node` during emits; it would need additional args passed through the system and I don't think it's necessary at this point. This also allowed us to drop an extraneous fetching/parsing of the session from the db.
- provide the invocation context to all `get_model()` calls, so the events are able to be emitted
- test all model loading events in the app and confirm socket events are received
2023-07-16 02:12:01 +10:00
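A generic, hedged sketch of the pattern described above: the invocation context is passed into get_model() so load events can be emitted around the load. Every class and method name here is an assumption, not the real InvokeAI services API.

```python
# Illustrative pattern only; all names below are assumptions.
class EventService:
    def emit_model_load_started(self, model_name: str) -> None:
        print(f"model_load_started: {model_name}")

    def emit_model_load_completed(self, model_name: str) -> None:
        print(f"model_load_completed: {model_name}")

class InvocationContext:
    def __init__(self, events: EventService) -> None:
        self.events = events

def get_model(model_name: str, context: InvocationContext) -> object:
    # With the context in hand, the loader can emit socket events
    # before and after the (placeholder) load.
    context.events.emit_model_load_started(model_name)
    model = object()  # stand-in for the actual model load
    context.events.emit_model_load_completed(model_name)
    return model
```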
psychedelicious
c7b547ea3e
feat(nodes): remove references to restoration services
...
- remove restoration services
- remove the restore faces nodes
- update tests
2023-07-16 01:12:39 +10:00
psychedelicious
74ca87ac9e
feat(nodes): add realesrgan node
2023-07-16 01:06:50 +10:00
psychedelicious
d270f21c85
feat(nodes): valid controlnet weights are -1 to 2
2023-07-15 19:56:44 +10:00
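A hedged sketch of enforcing that range with a pydantic field constraint, since the nodes are pydantic-based; the class and field names are illustrative, not the actual ControlNet node definition.

```python
# Illustrative only; not the real node schema.
from pydantic import BaseModel, Field

class ControlNetSettings(BaseModel):
    # Valid ControlNet weights span -1.0 to 2.0, per the commit above.
    control_weight: float = Field(default=1.0, ge=-1.0, le=2.0)
```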
psychedelicious
ae72f372be
fix(nodes): do not use hardcoded controlnet model
2023-07-15 19:56:44 +10:00
psychedelicious
29b2e59e65
fix(nodes): fix ref to ctx mgr service, missing import
2023-07-15 19:56:44 +10:00
psychedelicious
82fa39b531
feat(nodes): add controlnet nodes type hint
2023-07-15 19:56:44 +10:00
psychedelicious
788dcbde70
fix(nodes): add missing import
2023-07-15 19:56:44 +10:00
Sergey Borisov
6ab9a5e108
Draft
2023-07-15 19:56:44 +10:00
blessedcoolant
808b2de709
Merge branch 'main' into lstein/model-manager-route-enhancements
2023-07-15 16:56:54 +12:00
Lincoln Stein
2faa7cee37
add rename_model route
2023-07-14 23:03:18 -04:00
Sergey Borisov
7093e5d033
Pad conditionings using zeros and encoder_attention_mask
2023-07-15 00:52:54 +03:00
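A hedged sketch of the padding approach named in the commit above: extend the shorter conditioning tensor with zeros and record real vs. padded positions in an encoder_attention_mask. Shapes and the helper name are illustrative, not the actual conditioning code.

```python
# Illustrative only; assumes (batch, tokens, channels) conditioning tensors.
import torch

def pad_conditioning(cond: torch.Tensor, target_len: int):
    """Return (zero-padded conditioning, boolean encoder attention mask)."""
    batch, tokens, channels = cond.shape
    mask = torch.ones(batch, target_len, dtype=torch.bool, device=cond.device)
    if tokens < target_len:
        pad = torch.zeros(
            batch, target_len - tokens, channels,
            dtype=cond.dtype, device=cond.device,
        )
        cond = torch.cat([cond, pad], dim=1)
        mask[:, tokens:] = False  # zero-padded positions are masked out
    return cond, mask
```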