Commit Graph

121 Commits

Author SHA1 Message Date
Sergey Borisov
2539e26c18 Apply denoising_start/end, add torch-sdp to memory-efficient attention func 2023-08-07 19:57:11 +03:00
Sergey Borisov
b0738b7f70 Fixes, zero tensor for empty negative prompt, remove raw prompt node 2023-08-07 18:37:06 +03:00
Sergey Borisov
9aaf67c5b4 wip 2023-08-06 05:05:25 +03:00
Lincoln Stein
6ad565d84c folded in changes from 4099 2023-08-04 18:24:47 -04:00
Sergey Borisov
04229082d6 Provide ti name from model manager, not from ti itself 2023-08-04 18:24:47 -04:00
Sergey Borisov
1ac14a1e43 add sdxl lora support 2023-08-04 11:44:56 -04:00
Lincoln Stein
e7d9e552a7 Merge branch 'main' into feat_compel_and 2023-08-01 07:20:25 -04:00
Damian Stewart
d2c55dc011 enable .and() syntax and long prompts 2023-07-30 14:20:59 +02:00
Brandon Rising
2b7b3dd4ba Run python black 2023-07-28 09:46:44 -04:00
Martin Kristiansen
218b6d0546 Apply black 2023-07-27 10:54:01 -04:00
Brandon Rising
c16da75ac7 Merge branch 'main' into feat/onnx 2023-07-26 10:42:31 -04:00
psychedelicious
840205496a feat(nodes): fix model load events on sdxl nodes (they need the `context` to be provided to emit socket events) 2023-07-26 14:46:38 +10:00
Brandon Rising
ee7b36cea5 Merge branch 'main' into onnx-testing 2023-07-18 22:56:41 -04:00
Sergey Borisov
3240f98f4e Rename clip1 to clip 2023-07-18 18:58:17 +03:00
blessedcoolant
3f1d5000c0 Merge branch 'main' into nodes-stuff 2023-07-19 02:37:50 +12:00
blessedcoolant
0c18c5d603 feat: Add titles and tags to all Nodes 2023-07-19 02:26:45 +12:00
StAlKeR7779
889b77d3d6 Merge branch 'main' into save_vram 2023-07-18 16:55:48 +03:00
Sergey Borisov
bc11296a5e Disable lazy offloading when the VRAM cache is disabled, move resulting tensors to CPU (to avoid stacking VRAM tensors in the cache), fix text encoder not being freed (detach) 2023-07-18 16:20:25 +03:00
Lincoln Stein
9c3c393b84 merge with main 2023-07-18 07:00:55 -04:00
Lincoln Stein
ef31837167 fix caption on sdxl raw prompt 2023-07-17 23:49:23 -04:00
blessedcoolant
13da881953 Merge branch 'main' into sdxl-support 2023-07-18 13:34:07 +12:00
Sergey Borisov
ada9b06e48 Implement compel prompt nodes for sdxl 2023-07-18 01:49:45 +03:00
psychedelicious
ba12849685 fix(nodes): fix some model load events not emitting (missed adding the `context` arg to them initially) 2023-07-17 17:16:55 +10:00
Lincoln Stein
5206ddf9b2 truncate long prompts to avoid a crash with controlnet 2023-07-15 23:49:25 -04:00
Sergey Borisov
fe78a08e37 Fix sd1/2 models conditionings 2023-07-16 06:24:24 +03:00
Sergey Borisov
c9c2229917 Separate prompt into sdxl and sdxl-refiner, add denoising start/end fields, add l2l node (supports both sdxl and sdxl-refiner), add fp32 to vae encode 2023-07-16 06:00:37 +03:00
Lincoln Stein
ccbfa5d862 resolve conflicts 2023-07-15 19:47:50 -04:00
Sergey Borisov
7093e5d033 Pad conditionings using zeros and encoder_attention_mask 2023-07-15 00:52:54 +03:00
Brandon Rising
524888bf3b Merge branch 'main' into feat/onnx 2023-07-13 14:23:57 -04:00
Sergey Borisov
358ced6bab SDXL Prompt and t2l nodes draft, add fp32 to vae decode 2023-07-11 18:19:36 +03:00
Eugene Brodsky
97b2ec58e2 Merge branch 'main' into release/invokeai-3-0-alpha 2023-07-07 14:18:12 -04:00
Sergey Borisov
a9e77675a8 Move clip skip to separate node 2023-07-06 17:39:49 +03:00
Sergey Borisov
04b57c408f Add clip skip option to prompt node 2023-07-06 16:09:40 +03:00
Lincoln Stein
83d3a043da merge latest changes from main 2023-07-05 19:15:53 -04:00
Sergey Borisov
0ac9dca926 Fix loading diffusers ti 2023-07-05 19:46:00 +03:00
Lincoln Stein
9edf78dd2e merge with main 2023-07-05 09:12:54 -04:00
blessedcoolant
639d88afd6 revert: inference_mode to no_grad 2023-07-05 16:39:15 +12:00
blessedcoolant
c0501ed5c2 fix: Slow loading of Loras (Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>) 2023-07-05 12:47:34 +10:00
Lincoln Stein
4d2c7806fc quash memory leak when compel invocation called 2023-07-03 14:12:35 -04:00
Lincoln Stein
3937428563 Merge branch 'release/invokeai-3-0-alpha' of github.com:invoke-ai/InvokeAI into release/invokeai-3-0-alpha 2023-07-03 14:11:28 -04:00
Lincoln Stein
ed86d0b708 Union[foo, None]=>Optional[foo] 2023-07-03 12:17:45 -04:00
Lincoln Stein
aae60b6142 quash memory leak when compel invocation called 2023-07-03 10:08:10 -04:00
Lincoln Stein
b79740d61d back out torch.no_grad() 2023-07-02 23:03:24 -04:00
Lincoln Stein
fa8ccd2a94 add no_grad() to compel node invoke() method 2023-07-02 18:20:16 -04:00
Sergey Borisov
5cebf67ee4 Apply lora by patching lora instead of hooks 2023-06-26 03:57:33 +03:00
Sergey Borisov
7759b3f75a Small refactor 2023-06-21 04:24:25 +03:00
Sergey Borisov
738ba40f51 Fixes 2023-06-11 06:12:21 +03:00
Lincoln Stein
1f9e1eb964 merge with main 2023-06-06 22:18:41 -04:00
Damian Stewart
cdcfda164d enable long prompts, upgrade compel to enable .and() (concatenating prompts) 2023-06-04 15:30:54 +02:00
Sergey Borisov
b47786e846 First working TI draft 2023-05-31 02:12:27 +03:00
Sergey Borisov
79de9047b5 First working lora implementation 2023-05-30 01:11:00 +03:00
Lincoln Stein
5f8f51436a merge with main; fix conflicts 2023-05-25 22:40:45 -04:00
psychedelicious
d4aa79acd7 fix(nodes): use save instead of set (`set` is a python builtin) 2023-05-24 11:30:47 -04:00
Lincoln Stein
bdf33f13b3 fix bad merge in compel 2023-05-18 18:08:45 -04:00
Lincoln Stein
d96175d127 resolve some undefined symbols in model_cache 2023-05-18 14:31:47 -04:00
Lincoln Stein
8adff96e29 Merge branch 'main' into lstein/global-configuration 2023-05-17 14:37:09 -04:00
Sergey Borisov
039fa73269 Change SDModelType enum to string; fixes (model unload negative locks count, scheduler load error, safetensors convert, wrong logic in del_model, wrong metadata parsing in web) 2023-05-14 03:06:26 +03:00
Sergey Borisov
3b2a054f7a Add model loader node; unet, clip, vae fields; change compel node to clip field 2023-05-13 04:37:20 +03:00
Sergey Borisov
4492044d29 Redo compel node to separate model loading 2023-05-12 23:09:33 +03:00
Lincoln Stein
df5b968954 model manager now running as a service 2023-05-11 21:24:29 -04:00
Lincoln Stein
4627910c5d added a wrapper model_manager_service and model events 2023-05-11 00:09:19 -04:00
Lincoln Stein
aca4770481 fixed compel.py as requested 2023-05-10 21:40:44 -04:00
Lincoln Stein
42d938fda5 remove debugging statement 2023-05-06 23:54:11 -04:00
Lincoln Stein
25ce47c44f remove reference to globals in compel.py 2023-05-06 22:49:35 -04:00
StAlKeR7779
a80fe05e23 Rename compel node 2023-05-05 21:30:16 +03:00
StAlKeR7779
58d7833c5c Review changes 2023-05-05 21:09:29 +03:00
StAlKeR7779
5012f61599 Separate conditionings back to positive and negative 2023-05-05 15:47:51 +03:00
StAlKeR7779
7d221e2518 Combine conditioning into one field (better fit for multiple conditioning types like perp-neg) 2023-05-04 20:14:22 +03:00
StAlKeR7779
37916a22ad Use textual inversion manager from pipeline, remove extra conditioning info for uc 2023-04-25 12:53:13 +03:00
StAlKeR7779
8cb2fa8600 Restore log_tokenization check 2023-04-25 04:29:17 +03:00
StAlKeR7779
d99a08a441 Add compel node and conditioning field type 2023-04-25 03:48:44 +03:00