1181 Commits

blessedcoolant
9a383e456d Code-split SCHEDULER_MAP for reuse 2023-05-12 00:40:03 +12:00
blessedcoolant
3ffff023b2 Add missing key to scheduler_map
It was breaking because the sampler was not being reset, so each entry needs its own key. Will simplify this later.
2023-05-12 00:08:50 +12:00
blessedcoolant
d1029138d2 Default to DDIM if scheduler is missing 2023-05-11 22:54:35 +12:00
blessedcoolant
06b5800d28 Add UniPC Scheduler 2023-05-11 22:43:18 +12:00
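The four scheduler commits above revolve around a name-to-scheduler lookup table with a DDIM fallback. A minimal sketch of that idea, assuming diffusers scheduler classes and hypothetical key names (not InvokeAI's actual SCHEDULER_MAP):

```python
# Hypothetical sketch of a scheduler lookup table with a DDIM fallback.
# Key names and entries are assumptions, not InvokeAI's actual SCHEDULER_MAP.
from diffusers import DDIMScheduler, DDPMScheduler, UniPCMultistepScheduler

SCHEDULER_MAP = {
    "ddim": DDIMScheduler,
    "ddpm": DDPMScheduler,
    "unipc": UniPCMultistepScheduler,  # available from diffusers 0.15 onward
}

def get_scheduler_class(name):
    # Fall back to DDIM if the requested scheduler is missing from the map.
    return SCHEDULER_MAP.get(name, DDIMScheduler)
```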
Eugene Brodsky
3baa230077 Merge branch 'main' into lstein/bugfix/compel 2023-05-11 00:50:45 -04:00
Eugene
9e594f9018 pad conditioning tensors to same length
fixes crash when prompt length is greater than 75 tokens
2023-05-11 00:34:15 -04:00
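A minimal sketch of the padding fix described above, assuming conditioning tensors of shape [batch, tokens, dim]; the helper name and the zero-padding strategy are assumptions, not the actual compel/InvokeAI code:

```python
import torch

def pad_to_same_length(cond, uncond):
    """Pad the shorter conditioning tensor along the token axis (dim 1) so the
    two can be combined. Hypothetical helper; zero-padding is an assumption."""
    max_len = max(cond.shape[1], uncond.shape[1])

    def pad(t):
        if t.shape[1] == max_len:
            return t
        padding = torch.zeros(
            t.shape[0], max_len - t.shape[1], t.shape[2],
            dtype=t.dtype, device=t.device,
        )
        return torch.cat([t, padding], dim=1)

    return pad(cond), pad(uncond)
```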
Lincoln Stein
8ad8c5c67a resolve conflicts with main 2023-05-11 00:19:20 -04:00
Lincoln Stein
4627910c5d added a wrapper model_manager_service and model events 2023-05-11 00:09:19 -04:00
psychedelicious
49db6f4fac fix(nodes): fix trivial typing issues 2023-05-11 11:55:51 +10:00
psychedelicious
206e6b1730 feat(nodes): wip inpaint node 2023-05-11 11:55:51 +10:00
Lincoln Stein
5d5157fc65 make conditioning.py work with compel 1.1.5 2023-05-10 18:08:33 -04:00
Lincoln Stein
99c692f397 check that model name matches format 2023-05-09 23:46:59 -04:00
Lincoln Stein
3d85e769ce clean up ckpt handling
- remove legacy ckpt loading code from model_cache
- add placeholders for lora and textual inversion model loading
2023-05-09 22:44:58 -04:00
Lincoln Stein
9cb962cad7 ckpt model conversion now done in ModelCache 2023-05-08 23:39:44 -04:00
Lincoln Stein
a108155544 added StALKeR779's great model size calculation routine 2023-05-08 21:47:03 -04:00
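A minimal sketch of a model-size calculation of the kind credited above, summing parameter and buffer bytes of a torch module (the actual routine may differ):

```python
import torch

def calc_model_size(model: torch.nn.Module) -> int:
    """Approximate in-memory size of a model in bytes: parameters plus buffers."""
    param_bytes = sum(p.numel() * p.element_size() for p in model.parameters())
    buffer_bytes = sum(b.numel() * b.element_size() for b in model.buffers())
    return param_bytes + buffer_bytes
```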
Lincoln Stein
c15b49c805 implement StALKeR7779's requested API for fetching submodels 2023-05-07 23:18:17 -04:00
Lincoln Stein
fd63e36822 optimize subfolder so that it returns the submodel if the parent is in RAM 2023-05-07 21:39:11 -04:00
Lincoln Stein
4649920074 adjust t2i to work with new model structure 2023-05-07 19:06:49 -04:00
Lincoln Stein
667171ed90 cap model cache size using bytes, not # models 2023-05-07 18:07:28 -04:00
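A toy sketch of the byte-capped cache policy described above: evict least-recently-used models until the total size fits under a byte limit, rather than counting models. Class and method names here are hypothetical, not InvokeAI's actual model_cache:

```python
from collections import OrderedDict

class ByteCappedCache:
    """Toy LRU cache capped by total size in bytes rather than by model count."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self._entries = OrderedDict()   # key -> (model, size_bytes)

    def put(self, key, model, size_bytes):
        self._entries[key] = (model, size_bytes)
        self._entries.move_to_end(key)
        self._evict()

    def get(self, key):
        self._entries.move_to_end(key)  # mark as most recently used
        return self._entries[key][0]

    def _evict(self):
        # Drop least-recently-used entries until the total fits under the cap.
        total = sum(size for _, size in self._entries.values())
        while total > self.max_bytes and len(self._entries) > 1:
            _, (_, size) = self._entries.popitem(last=False)
            total -= size
```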
Lincoln Stein
647ffb2a0f defined abstract base class for model manager service 2023-05-06 22:41:19 -04:00
Lincoln Stein
afd2e32092 Merge branch 'main' into lstein/global-configuration 2023-05-06 21:20:25 -04:00
Lincoln Stein
05a27bda5e generalize model loading support, including loras/embeds 2023-05-06 15:58:44 -04:00
Lincoln Stein
e0214a32bc mostly ported to new manager API; needs testing 2023-05-06 00:44:12 -04:00
Lincoln Stein
af8c7c7d29 model manager rewritten to use model_cache; API changed! 2023-05-05 19:32:28 -04:00
Lincoln Stein
a4e36bc02a when a model is forcibly moved into RAM, update the loaded_models set 2023-05-04 23:28:03 -04:00
Lincoln Stein
68bc0112fa implement lazy GPU offloading and ref counting 2023-05-04 23:15:32 -04:00
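A toy sketch of ref counting with lazy offloading as described above: each active user bumps a reference count, and a model only moves back to the CPU when nothing holds it and room is needed. Names are hypothetical, not InvokeAI's actual cache:

```python
import torch

class CachedModel:
    """Toy ref-counted cache entry: lock()/unlock() track active users, and
    offload_if_unused() lazily moves the model back to the CPU only when
    nobody holds it. Illustrative sketch, not InvokeAI's actual model_cache."""

    def __init__(self, model: torch.nn.Module):
        self.model = model
        self.refcount = 0

    def lock(self, device="cuda"):
        if self.refcount == 0:
            self.model.to(device)      # move onto the GPU on first use
        self.refcount += 1
        return self.model

    def unlock(self):
        self.refcount = max(0, self.refcount - 1)

    def offload_if_unused(self):
        # Called lazily, e.g. when VRAM is needed; only offload idle entries.
        if self.refcount == 0:
            self.model.to("cpu")
```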
Lincoln Stein
d866dcb3d2 close #3343 2023-05-04 20:30:59 -04:00
Lincoln Stein
e4196bbe5b adjust non-app modules to use new config system 2023-05-04 00:43:51 -04:00
Lincoln Stein
15ffb53e59 remove globals, args, generate and the legacy CLI 2023-05-03 23:36:51 -04:00
Lincoln Stein
90054ddf0d use InvokeAISettings for app-wide configuration 2023-05-03 22:30:30 -04:00
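A minimal sketch of app-wide configuration via pydantic's BaseSettings (the pydantic v1 API in use at the time); the field names below are hypothetical and not the real InvokeAISettings options:

```python
from pydantic import BaseSettings  # pydantic v1 API

class AppSettings(BaseSettings):
    # Hypothetical fields; the real InvokeAISettings defines its own options.
    host: str = "127.0.0.1"
    port: int = 9090
    max_cache_size_gb: float = 6.0

    class Config:
        env_prefix = "INVOKEAI_"   # e.g. INVOKEAI_PORT=8080 overrides the default

settings = AppSettings()  # reads environment variables, falls back to defaults
```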
Lincoln Stein
a273bdbdc1 Merge branch 'main' into lstein/new-model-manager 2023-05-03 18:09:29 -04:00
Lincoln Stein
e1fed52c66 finished work on model cache and its regression test 2023-05-03 12:38:18 -04:00
Lincoln Stein
bb959448c1 implement hashing for local & remote models 2023-05-02 16:52:27 -04:00
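A minimal sketch of hashing a local model directory by streaming every file through SHA-256; the function name and exact hashing scheme are assumptions, not the commit's implementation:

```python
import hashlib
from pathlib import Path

def hash_model_dir(model_dir: str) -> str:
    """Compute a stable SHA-256 over every file in a local model directory."""
    digest = hashlib.sha256()
    for path in sorted(Path(model_dir).rglob("*")):
        if path.is_file():
            digest.update(path.name.encode("utf-8"))
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)
    return digest.hexdigest()
```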
Lincoln Stein
2e2abf6ea6 caching of subparts working 2023-05-01 22:57:30 -04:00
Lincoln Stein
974841926d logger is an interchangeable service 2023-04-29 10:48:50 -04:00
Lincoln Stein
8db20e0d95 rename log to logger throughout 2023-04-29 09:43:40 -04:00
Lincoln Stein
6b79e2b407 Merge branch 'main' into enhance/invokeai-logs
- resolve conflicts
- remove unused code identified by pyflakes
2023-04-28 10:09:46 -04:00
Lincoln Stein
956ad6bcf5 add redesigned model cache for diffusers & transformers 2023-04-28 00:41:52 -04:00
Lincoln Stein
31a904b903 Merge branch 'main' into bugfix/prevent-cli-crash 2023-04-25 03:28:45 +01:00
Lincoln Stein
4fa5c963a1 Merge branch 'main' into bugfix/prevent-cli-crash 2023-04-25 03:10:51 +01:00
Lincoln Stein
b164330e3c replaced remaining print statements with log.*() 2023-04-18 20:49:00 -04:00
Lincoln Stein
69433c9f68 Merge branch 'main' into lstein/enhance/diffusers-0.15 2023-04-18 19:21:53 -04:00
Lincoln Stein
bd8ffd36bf bump to diffusers 0.15.1, remove dangling module 2023-04-18 19:20:38 -04:00
Tim Cabbage
f6cdff2c5b [bug] #3218 HuggingFace API off when --no-internet set
https://github.com/invoke-ai/InvokeAI/issues/3218

The HuggingFace API will not be queried if the --no-internet flag is set
2023-04-17 16:53:31 +02:00
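A minimal sketch of honoring a --no-internet style flag when loading a model: pass local_files_only so the HuggingFace Hub is never queried. Function and parameter names here are assumptions, not InvokeAI's actual code:

```python
from diffusers import StableDiffusionPipeline

def load_pipeline(model_name: str, no_internet: bool):
    # When offline, use only the local HF cache and never hit the Hub API.
    return StableDiffusionPipeline.from_pretrained(
        model_name,
        local_files_only=no_internet,
    )
```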
Lincoln Stein
aab262d991 Merge branch 'main' into bugfix/prevent-cli-crash 2023-04-14 20:12:38 -04:00
Lincoln Stein
47b9910b48 update to diffusers 0.15 and fix code for name changes
- This is a port of #3184 to the main branch
2023-04-14 15:35:03 -04:00
Lincoln Stein
0b0e6fe448 convert remainder of print() to log.info() 2023-04-14 15:15:14 -04:00
Lincoln Stein
c132dbdefa change "ialog" to "log" 2023-04-11 18:48:20 -04:00
Lincoln Stein
f3081e7013 add module-level getLogger() method 2023-04-11 12:23:13 -04:00
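A minimal sketch of a module-level getLogger() accessor like the one this commit describes; InvokeAI's real logger configures its own handlers and formatting:

```python
import logging

def getLogger(name="InvokeAI"):
    """Module-level accessor returning a configured logger, so callers can
    write `log = getLogger(__name__)` instead of building one themselves."""
    logger = logging.getLogger(name)
    if not logger.handlers:
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter(">> %(levelname)s: %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger
```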
Lincoln Stein
f904f14f9e add missing module-level methods 2023-04-11 11:10:43 -04:00