Commit Graph

5795 Commits

Author SHA1 Message Date
Lincoln Stein
a7cbcae176 expose max_cache_size to invokeai-configure interface 2023-07-05 20:59:57 -04:00
Lincoln Stein
0a6dccd607 expose max_cache_size to invokeai-configure interface 2023-07-05 20:59:14 -04:00
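The two commits above expose max_cache_size, which bounds the in-RAM model cache, to the invokeai-configure interface. As a rough illustration of what such a setting governs (class and method names below are hypothetical, not InvokeAI's actual cache implementation), a size-capped cache evicts the least-recently-used model once the configured budget is exceeded:

```python
from collections import OrderedDict

class SizeCappedModelCache:
    """Hypothetical illustration only: hold loaded models up to a total
    size budget (max_cache_size, e.g. in GB) and evict the least-recently-
    used entry when that budget is exceeded."""

    def __init__(self, max_cache_size: float) -> None:
        self.max_cache_size = max_cache_size
        self._entries = OrderedDict()  # key -> (model, size)
        self._total_size = 0.0

    def put(self, key, model, size: float) -> None:
        self._entries[key] = (model, size)
        self._entries.move_to_end(key)
        self._total_size += size
        # Evict oldest entries until we are back under the cap.
        while self._total_size > self.max_cache_size and len(self._entries) > 1:
            _, (_, evicted_size) = self._entries.popitem(last=False)
            self._total_size -= evicted_size

    def get(self, key):
        model, _ = self._entries[key]
        self._entries.move_to_end(key)  # mark as most recently used
        return model
```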
Lincoln Stein
43c51ff157 Merge branch 'main' into lstein/more-model-loading-fixes 2023-07-05 20:48:15 -04:00
Lincoln Stein
bf25818d76 rebuild front end; bump version 2023-07-05 20:33:28 -04:00
Lincoln Stein
cfa3b2419c partial implementation of merge 2023-07-05 20:25:47 -04:00
Lincoln Stein
d4550b3059 clean up lint errors in lora.py 2023-07-05 19:18:25 -04:00
Lincoln Stein
83d3a043da merge latest changes from main 2023-07-05 19:15:53 -04:00
gogurtenjoyer
169ff6368b Update mps_fixes.py - additional torch op for nodes
This fixes scaling in the nodes UI.
2023-07-05 17:47:23 -04:00
Lincoln Stein
71dad6d404 Merge branch 'main' into ti-ui 2023-07-05 16:57:31 -04:00
Lincoln Stein
c21bd806f0 default LoRA weight to 0.75 2023-07-05 16:54:23 -04:00
Kent Keirsey
007d125e40 Update README.md 2023-07-05 16:53:37 -04:00
Kent Keirsey
716d154957 Update LICENSE 2023-07-05 16:41:28 -04:00
Lincoln Stein
685a47cc7d fix crash during lora application 2023-07-05 16:40:47 -04:00
Lincoln Stein
52498cc0b9 Put tokenizer and text encoder in same clip-vit-large-patch14 (#3662)
This PR fixes the migrate script so that it uses the same directory for
both the tokenizer and text encoder CLIP models. This will fix a crash
that occurred during checkpoint->diffusers conversions.

This PR also removes the check for an existing models directory in the
target root directory when `invokeai-migrate3` is run.
2023-07-05 16:29:33 -04:00
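A minimal sketch of the arrangement this PR relies on, assuming the transformers library and an illustrative local path: a single clip-vit-large-patch14 directory can hold both the tokenizer files and the text-encoder weights, so the same path is handed to both loaders.

```python
from transformers import CLIPTextModel, CLIPTokenizer

# Illustrative path; the directory actually used by the migrate script may differ.
clip_dir = "models/core/convert/clip-vit-large-patch14"

tokenizer = CLIPTokenizer.from_pretrained(clip_dir)     # reads the tokenizer files
text_encoder = CLIPTextModel.from_pretrained(clip_dir)  # reads the model config/weights
```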
Lincoln Stein
cb947bcbf0 Merge branch 'main' into lstein/fix-migrate3-textencoder 2023-07-05 16:23:00 -04:00
Lincoln Stein
bbfb5bb1d4 Remove hardcoded cuda device in model manager init (#3624)
There was a line in model_manager.py in which the GPU device was
hardcoded to "cuda". This has now been removed.
2023-07-05 16:22:45 -04:00
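For illustration only (this is not the model_manager.py code itself), a common way to pick the torch device at runtime instead of hardcoding "cuda" is to probe what is available:

```python
import torch

def choose_torch_device() -> torch.device:
    """Pick an available accelerator rather than assuming CUDA."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():  # Apple Silicon
        return torch.device("mps")
    return torch.device("cpu")
```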
Lincoln Stein
f8bbec8572 Recognize and load diffusers-style LoRAs (.bin)
Prevent double-reporting of autoimported models
- closes #3636

Allow autoimport of diffusers-style LoRA models
- closes #3637
2023-07-05 16:21:23 -04:00
Lincoln Stein
863336acbb Recognize and load diffusers-style LoRAs (.bin)
Prevent double-reporting of autoimported models
- closes #3636

Allow autoimport of diffusers-style LoRA models
- closes #3637
2023-07-05 16:19:16 -04:00
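The two commits above add loading of diffusers-style .bin LoRAs alongside .safetensors files. As a rough sketch (not InvokeAI's actual LoRA loader), both formats reduce to a flat state dict of tensors; only the deserializer differs:

```python
from pathlib import Path

import torch

def load_lora_weights(path: str) -> dict:
    """Return the LoRA state dict from either a .safetensors or a .bin file."""
    p = Path(path)
    if p.suffix == ".safetensors":
        from safetensors.torch import load_file  # requires the safetensors package
        return load_file(str(p))
    # Diffusers-style LoRAs are ordinary torch pickles (.bin).
    return torch.load(str(p), map_location="cpu")
```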
Lincoln Stein
90ae8ce26a prevent model install crash "torch needs to be restarted with spawn" 2023-07-05 16:18:20 -04:00
Lincoln Stein
ad5d90aca8 prevent model install crash "torch needs to be restarted with spawn" 2023-07-05 15:38:07 -04:00
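The "torch needs to be restarted with spawn" crash named in the two commits above typically comes from initializing CUDA inside a forked subprocess. A minimal sketch of the usual remedy (illustrative, not the installer's actual code) is to request the spawn start method before launching worker processes:

```python
import torch.multiprocessing as mp

def install_worker(model_name: str) -> None:
    print(f"installing {model_name}")  # placeholder for the real download/convert work

if __name__ == "__main__":
    # CUDA cannot be re-initialized in a forked child; spawn avoids that.
    mp.set_start_method("spawn", force=True)
    proc = mp.Process(target=install_worker, args=("example-model",))
    proc.start()
    proc.join()
```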
Lincoln Stein
5b6dd47b9f add API for model convert 2023-07-05 15:13:21 -04:00
Lincoln Stein
5027d0a603 accept @psychedelicious suggestions above 2023-07-05 14:50:57 -04:00
Lincoln Stein
9f9ce08e44 Merge branch 'main' into lstein/remove-hardcoded-cuda-device 2023-07-05 13:38:33 -04:00
Lincoln Stein
17c5568661 build: remove web ui dist from gitignore (#3650)
The web UI should manage its own .gitignore

I think this would explain why certain files were not making it into the PyPI
release.
2023-07-05 13:36:16 -04:00
Lincoln Stein
94740e440d Merge branch 'main' into build/gitignore 2023-07-05 13:35:54 -04:00
Lincoln Stein
021e1eca8e Merge branch 'main' into mps-fp16-fixes 2023-07-05 13:19:52 -04:00
Lincoln Stein
5fe722900d allow clip-vit-large-patch14 text encoder to coexist with tokenizer in same directory 2023-07-05 13:15:08 -04:00
Lincoln Stein
cf173b522b allow clip-vit-large-patch14 text encoder to coexist with tokenizer in same directory 2023-07-05 13:14:41 -04:00
Mary Hipp Rogers
ea81ce9489 close modal when user clicks cancel (#3656)
* close modal when user clicks cancel

* close modal when delete image context cleared

---------

Co-authored-by: Mary Hipp <maryhipp@Marys-MacBook-Air.local>
2023-07-05 17:12:27 +00:00
blessedcoolant
8283b80b58 Fix ckpt scanning on conversion (#3653) 2023-07-06 05:09:13 +12:00
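The scan step exists because legacy .ckpt files are Python pickles and can execute arbitrary code when loaded. One way to illustrate the kind of guard involved (not the project's actual scanner, which may use a dedicated pickle-scanning tool) is torch's restricted unpickler:

```python
import torch

def load_checkpoint_guarded(path: str) -> dict:
    # weights_only=True (available in recent torch releases) refuses to
    # unpickle arbitrary Python objects, loading only tensors and containers.
    return torch.load(path, map_location="cpu", weights_only=True)
```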
blessedcoolant
9e2d63ef97 Merge branch 'main' into fix/ckpt_convert_scan 2023-07-06 05:01:34 +12:00
blessedcoolant
dd946790ec Fix loading diffusers ti (#3661) 2023-07-06 05:01:11 +12:00
Sergey Borisov
0ac9dca926 Fix loading diffusers ti 2023-07-05 19:46:00 +03:00
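For reference on the fix above: a diffusers-style textual inversion usually ships as a small state dict (e.g. learned_embeds.bin) mapping the trigger token to its embedding tensor. A hedged sketch of reading one (file layout assumed, not InvokeAI's loader):

```python
import torch

def read_textual_inversion(path: str):
    state = torch.load(path, map_location="cpu")
    token, embedding = next(iter(state.items()))  # e.g. {"<my-token>": tensor}
    return token, embedding
```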
psychedelicious
acd3b1a512 build: remove web ui dist from gitignore
The web UI should manage its own .gitignore
2023-07-06 00:39:36 +10:00
Lincoln Stein
bd82c4ace0 model installer confirms deletion of models 2023-07-05 09:57:23 -04:00
blessedcoolant
e4d92da3a9 fix: Make space for icons in prompt box 2023-07-06 01:48:50 +12:00
blessedcoolant
9204b72383 feat: Make Embedding Picker a mini toggle 2023-07-06 01:45:00 +12:00
Lincoln Stein
9edf78dd2e merge with main 2023-07-05 09:12:54 -04:00
Lincoln Stein
5d31703224 Merge branch 'release/invokeai-3-0-alpha' of github.com:invoke-ai/InvokeAI into release/invokeai-3-0-alpha 2023-07-05 09:05:59 -04:00
Lincoln Stein
6112197edf convert implemented; need router 2023-07-05 09:05:05 -04:00
Lincoln Stein
44d5bef7e4 bump version number 2023-07-05 09:02:35 -04:00
blessedcoolant
a556bf45bb Merge branch 'main' into ti-ui 2023-07-05 23:42:48 +12:00
blessedcoolant
818616a0c5 fix(ui): fix prompt resize & style resizer (#3652) 2023-07-05 23:42:23 +12:00
blessedcoolant
8c9266359d feat: Add Embedding Select To Linear UI 2023-07-05 23:41:15 +12:00
blessedcoolant
3b324a7d0a Merge branch 'main' into fix/ui/fix-prompt-resize 2023-07-05 23:40:47 +12:00
blessedcoolant
c8cb43ff2d Fix clip path in migrate script (#3651)
Update path for clip model according to path used in ckpt conversion and
invokeai-configure
2023-07-05 23:38:45 +12:00
gogurtenjoyer
ba7345deb4 Merge branch 'main' into mps-fp16-fixes 2023-07-05 07:38:41 -04:00
Sergey Borisov
ee042ab76d Fix ckpt scanning on conversion 2023-07-05 14:18:30 +03:00
psychedelicious
596c791844 fix(ui): fix prompt resize & style resizer 2023-07-05 21:02:31 +10:00
blessedcoolant
780e77d2ae Merge branch 'main' into fix/clip_path 2023-07-05 22:45:52 +12:00