Commit Graph

312 Commits

Author SHA1 Message Date
Lincoln Stein
d37b08a7dd
Merge branch 'main' into release/make-web-dist-startable 2023-05-28 19:46:09 -04:00
user1
f2b41c60ff Cleaning up prior to submitting ControlNet PR. Mostly turning off diagnostic printing. Also fixed an error when there is no ControlNet input. 2023-05-26 21:44:00 -04:00
user1
11fc7e40a5 Refactored ControlNet support to consolidate multiple parameters into a data struct. Also redid how multiple ControlNets are handled. 2023-05-26 21:44:00 -04:00
user1
e2a94be336 Added resizing of controlnet image based on noise latent. Fixes a tensor mismatch issue. 2023-05-26 21:44:00 -04:00
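A minimal sketch of the kind of resize this commit describes, assuming a torch control image tensor and an SD-style latent whose spatial size is 1/8 of the image; the function and names here are illustrative, not the actual InvokeAI code.

    import torch
    import torch.nn.functional as F

    def resize_control_image(control: torch.Tensor, latent: torch.Tensor) -> torch.Tensor:
        # control: (batch, channels, H, W) image-space tensor
        # latent:  (batch, 4, H/8, W/8) noise latent from the diffusion pipeline
        # Resize the control image so its spatial size matches latent size * 8,
        # avoiding the tensor-shape mismatch mentioned in the commit message.
        target_h, target_w = latent.shape[2] * 8, latent.shape[3] * 8
        if control.shape[2:] != (target_h, target_w):
            control = F.interpolate(control, size=(target_h, target_w),
                                    mode="bilinear", align_corners=False)
        return control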
user1
901a277959 Core implementation of ControlNet and MultiControlNet. 2023-05-26 21:44:00 -04:00
user1
a2a2cfa765 Added resizing of controlnet image based on noise latent. Fixes a tensor mismatch issue. 2023-05-26 21:44:00 -04:00
user1
768cfe3aab Core implementation of ControlNet and MultiControlNet. 2023-05-26 21:44:00 -04:00
user1
297931f5d9 Cleaning up prior to submitting ControlNet PR. Mostly turning off diagnostic printing. Also fixed an error when there is no ControlNet input. 2023-05-26 21:44:00 -04:00
user1
f613c073c1 Added support for specifying the step iteration at which to start using
each ControlNet, and the step at which to stop using it (specified as a fraction of total steps)
2023-05-26 21:44:00 -04:00
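A hedged sketch of how a per-ControlNet start/end fraction could gate application during denoising; begin_step_percent and end_step_percent are assumed parameter names, not necessarily the ones used in the PR.

    def controlnet_is_active(step: int, total_steps: int,
                             begin_step_percent: float = 0.0,
                             end_step_percent: float = 1.0) -> bool:
        # Convert the current step index into a fraction of the total run and
        # apply this ControlNet only inside its [begin, end] window.
        fraction = step / max(total_steps - 1, 1)
        return begin_step_percent <= fraction <= end_step_percent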
user1
63d248622c Refactored ControlNet support to consolidate multiple parameters into a data struct. Also redid how multiple ControlNets are handled. 2023-05-26 21:44:00 -04:00
user1
0096fb2790 Added resizing of controlnet image based on noise latent. Fixes a tensor mismatch issue. 2023-05-26 21:44:00 -04:00
user1
940e3b6635 Core implementation of ControlNet and MultiControlNet. 2023-05-26 21:44:00 -04:00
user1
714ad6dbb8 Fixed use of ControlNet control_weight parameter 2023-05-26 21:44:00 -04:00
user1
5d5cdc7716 Added resizing of controlnet image based on noise latent. Fixes a tensor mismatch issue. 2023-05-26 21:44:00 -04:00
user1
5e4c0217c7 Switching to ControlField for output from controlnet nodes. 2023-05-26 21:44:00 -04:00
user1
a91dee87d0 Added support for ControlNet and MultiControlNet to legacy non-nodal Txt2Img in backend/generator. Although backend/generator will likely disappear by v3.x, right now it is very useful for testing core ControlNet and MultiControlNet functionality while the node codebase is rapidly evolving. 2023-05-26 21:44:00 -04:00
user1
5ff98a4179 Core implementation of ControlNet and MultiControlNet. 2023-05-26 21:44:00 -04:00
Lincoln Stein
5f8f51436a merge with main; fix conflicts 2023-05-25 22:40:45 -04:00
Lincoln Stein
5659d10778 remove unused function get_root() 2023-05-25 22:06:37 -04:00
Lincoln Stein
e56965ad76 documentation tweaks; fixed initialization in a couple more places 2023-05-25 21:10:00 -04:00
Lincoln Stein
2273b3a8c8 fix potential race condition in config system 2023-05-25 20:41:26 -04:00
Lincoln Stein
9110838fe4
Merge branch 'main' into release/make-web-dist-startable 2023-05-25 19:06:09 -04:00
Lincoln Stein
ca7b267326 raise error if syslogging requested and syslog lib not available 2023-05-25 10:10:46 -04:00
Lincoln Stein
88776fb2de get invokeai_configure working again 2023-05-25 09:39:45 -04:00
Lincoln Stein
b87f3043ae add logging configuration 2023-05-24 23:57:15 -04:00
psychedelicious
d14b02e93f feat(logger): fix logger type issues 2023-05-24 11:30:47 -04:00
psychedelicious
fb0b63c580 fix(nodes): fix seam painting
The problem was that the same seed was being used for the seam painting pass, causing the fried look.

Same issue as if you do img2img on a txt2img with the same seed/prompt.

Thanks to @hipsterusername for teaming up to debug this. We got pretty deep into the weeds.
2023-05-25 00:58:03 +10:00
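An illustrative sketch of the fix's idea, assuming a torch Generator is used per pass: derive a different seed for the seam-painting pass so it does not replay the noise of the first pass. The helper name is hypothetical.

    import torch

    def seam_pass_generator(base_seed: int, device: str = "cpu") -> torch.Generator:
        # Offsetting the seed (any deterministic change works) keeps results
        # reproducible while avoiding the "fried" look caused by reusing the
        # exact same noise for the seam-painting pass.
        return torch.Generator(device=device).manual_seed(base_seed + 1)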
Sergey Borisov
8e419a4f97 Revert weak references as can be done without it 2023-05-23 04:29:40 +03:00
Sergey Borisov
2533209326 Rewrite cache to weak references 2023-05-23 03:48:22 +03:00
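A minimal sketch of a weak-reference model cache of the sort this commit describes (and the following commit reverts); illustrative only, not the InvokeAI implementation.

    import weakref

    class WeakModelCache:
        def __init__(self):
            # Values are held weakly: once no caller keeps a strong reference
            # to a model, it can be garbage-collected and drops out of the cache.
            self._models = weakref.WeakValueDictionary()

        def get(self, key):
            return self._models.get(key)

        def put(self, key, model):
            self._models[key] = model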
Lincoln Stein
d2dc1ed26f make InvokeAI package installable
This commit makes InvokeAI 3.0 installable via PyPI.org and the
installer script.

Main changes:

1. Move static web pages into `invokeai/frontend/web` and modify the
API to look for them there. This allows pip to copy the files into the
distribution directory, so the user no longer has to be in the repo
root to launch.

2. Update invoke.sh and invoke.bat to launch the new web application
properly. This also changes the wording for launching the CLI from
"generate images" to "explore the InvokeAI node system," since I would
not recommend using the CLI to generate images routinely.

3. Fix a bug in the checkpoint converter script that was identified
during testing.

4. Better error reporting when checkpoint converter fails.

5. Rebuild front end.
2023-05-22 17:51:47 -04:00
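A hedged sketch of how an API can locate bundled web assets relative to the installed package rather than the repo root, which is the effect change 1 describes; the function and the "dist" subfolder name are assumptions for illustration.

    from pathlib import Path

    def web_dist_dir(package_file: str) -> Path:
        # Resolve the static web assets relative to the package itself, so the
        # web server starts correctly from a pip-installed copy and not only
        # from a source checkout in the repository root.
        return Path(package_file).resolve().parent / "dist"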
Lincoln Stein
27241cdde1 port more globals changes over 2023-05-18 17:17:45 -04:00
Lincoln Stein
259d6ec90d fixup cachedir call 2023-05-18 14:52:16 -04:00
Lincoln Stein
a77c4c87b2 fixed logic error in resolution of model path 2023-05-18 14:35:34 -04:00
Lincoln Stein
d96175d127 resolve some undefined symbols in model_cache 2023-05-18 14:31:47 -04:00
Lincoln Stein
b1a99d772c added method to convert vaes 2023-05-18 13:31:11 -04:00
Lincoln Stein
7ea995149e fixes to env parsing, textual inversion & help text
- Make environment variable settings case InSenSiTive:
  INVOKEAI_MAX_LOADED_MODELS and InvokeAI_Max_Loaded_Models
  environment variables will both set `max_loaded_models`

- Updated realesrgan to use new config system.

- Updated textual_inversion_training to use new config system.

- Discovered a race condition when InvokeAIAppConfig is created
  at module load time, which makes it impossible to customize
  or replace the help message produced with --help on the command
  line. To fix this, moved all instances of get_invokeai_config()
  from module load time to object initialization time. Makes code
  cleaner, too.

- Added `--from_file` argument to `invokeai-node-cli` and changed
  github action to match. CI tests will hopefully work now.
2023-05-18 10:48:23 -04:00
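A sketch of case-insensitive environment parsing with pydantic v1-style settings, one common way to get the behavior described above; the field, prefix, and default are assumptions, not the exact InvokeAIAppConfig definition.

    from pydantic import BaseSettings  # pydantic v1

    class AppSettings(BaseSettings):
        max_loaded_models: int = 2

        class Config:
            env_prefix = "INVOKEAI_"
            # INVOKEAI_MAX_LOADED_MODELS and InvokeAI_Max_Loaded_Models
            # both populate max_loaded_models.
            case_sensitive = False

    settings = AppSettings()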
Sergey Borisov
fd82763412 Model manager draft 2023-05-18 03:56:52 +03:00
Eugene
f9710dd6ed remove reference to legacy opt.hf_token, clean up whitespace in invokeai_configure 2023-05-17 20:39:00 -04:00
Lincoln Stein
8adff96e29
Merge branch 'main' into lstein/global-configuration 2023-05-17 14:37:09 -04:00
Lincoln Stein
7593dc19d6 complete several steps needed to make 3.0 installable
- invokeai-configure updated to work with new config system
- migrate invokeai.init to invokeai.yaml during configure
- replace legacy invokeai with invokeai-node-cli
- add ability to run an invocation directly from invokeai-node-cli command line
- update CI tests to work with new invokeai syntax
2023-05-17 14:13:27 -04:00
Lincoln Stein
b7c5a39685 make invokeai.yaml more hierarchical; fix list configuration bug 2023-05-17 12:19:19 -04:00
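A sketch of what a more hierarchical invokeai.yaml might look like and how it would be read; the group names and values below are hypothetical, not the actual schema.

    import yaml  # PyYAML

    # Hypothetical hierarchical layout: settings grouped under sections
    # instead of a single flat list of keys.
    EXAMPLE_YAML = """\
    InvokeAI:
      Features:
        nsfw_checker: false
      Memory/Performance:
        max_loaded_models: 2
        precision: float16
    """

    config = yaml.safe_load(EXAMPLE_YAML)
    max_models = config["InvokeAI"]["Memory/Performance"]["max_loaded_models"]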
Lincoln Stein
eadfd239a8 update config script to work with new config system 2023-05-17 00:18:19 -04:00
Lincoln Stein
e971a7f35c when migrating models.yaml, rename the original to models.yaml.orig 2023-05-16 22:37:53 -04:00
Lincoln Stein
8d75e50435 partial port of invokeai-configure 2023-05-16 01:50:01 -04:00
Lincoln Stein
cd16857f38 fix None in model_type 2023-05-16 00:13:44 -04:00
Lincoln Stein
1442f1cb8d change model filter to None in second place 2023-05-16 00:03:57 -04:00
Lincoln Stein
4fe94a9315 list_models() now returns a dict of {type,{name: info}} 2023-05-15 23:44:08 -04:00
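A tiny sketch of consuming the new return shape described here, {model_type: {model_name: info}}; the manager argument and info contents are illustrative.

    def print_models(manager):
        # list_models() now returns a dict keyed by model type, each value
        # being a dict of {model_name: info}.
        for model_type, models in manager.list_models().items():
            for name, info in models.items():
                print(f"{model_type}/{name}: {info}")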
Lincoln Stein
7ef0d2aa35 merge with main 2023-05-15 09:07:17 -04:00
Lincoln Stein
c8f765cc06 improve debugging messages 2023-05-14 18:29:55 -04:00
Lincoln Stein
b9e9087dbe do not manage GPU for pipelines if sequential_offloading is True 2023-05-14 18:09:38 -04:00
Lincoln Stein
63e465eb5c tweaks to get_model() behavior
1. If an external VAE is specified in the config file, then
   get_model(submodel=vae) will return the external VAE, not the one
   burnt into the parent diffusers pipeline.

2. The mechanism in (1) is generalized such that you can now have
   "unet:", "text_encoder:" and similar stanzas in the config file.
   Valid formats of these subsections:

       unet:
          repo_id: foo/bar

       unet:
          path: /path/to/local/folder

       unet:
          repo_id: foo/bar
          subfolder: unet

    In the near future, these will also be used to attach external
    parts to the pipeline, generalizing VAE behavior.

3. Accommodate callers (i.e. the WebUI) that are passing the
   model key ("diffusers/stable-diffusion-1.5") to get_model()
   instead of the tuple of model_name and model_type.

4. Fixed bug in VAE model attaching code.

5. Rebuilt web front end.
2023-05-14 16:50:59 -04:00
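A hedged usage sketch of the behavior described in points 1-3 above, assuming a model manager with the get_model() signature implied by the commit message; exact argument names may differ.

    def load_with_overrides(manager, model_name: str = "stable-diffusion-1.5"):
        # Points 1/2: if the config file carries a "vae:" (or "unet:", etc.)
        # stanza with repo_id / path / subfolder, get_model(submodel=...)
        # returns that external part instead of the one burnt into the
        # parent diffusers pipeline.
        vae = manager.get_model(model_name, submodel="vae")

        # Point 3: callers such as the WebUI may pass the combined model key
        # ("diffusers/stable-diffusion-1.5") rather than a (name, type) tuple.
        pipeline = manager.get_model(f"diffusers/{model_name}")
        return pipeline, vae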
Eugene Brodsky
c8a98a9a22
Merge branch 'main' into lstein/bugfix/compel 2023-05-14 14:43:18 -04:00
blessedcoolant
c4681774a5
Merge branch 'main' into logging-facelift 2023-05-15 02:08:29 +12:00
Damian Stewart
050add58d2
fix getting conditionings 2023-05-14 12:20:54 +02:00
Eugene
b72c9787a9 Revert "comment out customer_attention_context"
This reverts commit 8f8cd90787.

Due to NameError: name 'options' is not defined
2023-05-14 00:37:55 -04:00
Eugene Brodsky
2623941d91
Merge branch 'main' into lstein/bugfix/compel 2023-05-13 22:23:59 -04:00
Lincoln Stein
baf5451fa0
Merge branch 'main' into lstein/new-model-manager 2023-05-13 22:01:34 -04:00
blessedcoolant
026d3260b4 Add Heun Karras Scheduler 2023-05-14 11:45:08 +10:00
Lincoln Stein
1103ab2844 merge with main 2023-05-13 21:35:19 -04:00
Lincoln Stein
b31a6ff605 fix reversed args in _model_key() call 2023-05-13 21:11:06 -04:00
Sergey Borisov
1f602e6143 Fix - apply precision to text_encoder 2023-05-14 03:46:13 +03:00
Sergey Borisov
039fa73269 Change SDModelType enum to string; fixes model unload negative lock count, scheduler load error, safetensors conversion, wrong logic in del_model, and wrong metadata parsing in the web UI 2023-05-14 03:06:26 +03:00
blessedcoolant
691e1bf829 Make debug messages cyan/blue 2023-05-14 09:06:57 +12:00
Lincoln Stein
2204e47596 allow submodels to be fetched independent of parent pipeline 2023-05-13 16:54:47 -04:00
Lincoln Stein
d8b1f29066 proxy SDModelInfo so that it can be used directly as context 2023-05-13 16:29:18 -04:00
Lincoln Stein
b23c9f1da5 get Tuple type hint syntax right 2023-05-13 14:59:21 -04:00
Lincoln Stein
72967bf118 convert add_model(), del_model(), list_models() etc to use bifurcated names 2023-05-13 14:44:44 -04:00
Sergey Borisov
3b2a054f7a Add model loader node; unet, clip, vae fields; change compel node to clip field 2023-05-13 04:37:20 +03:00
Sergey Borisov
131145eab1 A big refactor of the model manager (IMHO) 2023-05-12 23:13:34 +03:00
Kent Keirsey
8f8cd90787 comment out customer_attention_context 2023-05-12 13:59:00 -04:00
blessedcoolant
d796ea7bec feat: Logging Improvements 2023-05-13 02:13:49 +12:00
Eugene Brodsky
af060188bd
Merge branch 'main' into lstein/bugfix/compel 2023-05-12 08:22:18 -04:00
Kevin Turner
4caa1f19b2 fix(model manager): fix string formatting error on model checksum timer 2023-05-11 19:06:02 -07:00
Lincoln Stein
df5b968954 model manager now running as a service 2023-05-11 21:24:29 -04:00
Lincoln Stein
95d4bd3012 Merge branch 'lstein/bugfix/compel' of github.com:invoke-ai/InvokeAI into lstein/bugfix/compel 2023-05-11 21:13:29 -04:00
Lincoln Stein
037078c8ad make InvokeAIDiffuserComponent.custom_attention_control a classmethod 2023-05-11 21:13:18 -04:00
blessedcoolant
f7dc171c4f Rename default schedulers across the app 2023-05-12 03:44:20 +12:00
blessedcoolant
4b957edfec Add DDPM Scheduler 2023-05-12 03:18:34 +12:00
blessedcoolant
46ca7718d9 Add DEIS Scheduler 2023-05-12 03:10:30 +12:00
blessedcoolant
b928d7a6e6 Change scheduler names to be accurate
_a = Ancestral
_k = Karras
2023-05-12 02:59:43 +12:00
blessedcoolant
8a836247c8 Add DPMPP Single, Euler Karras and DPMPP2 Multi Karras Schedulers 2023-05-12 02:23:33 +12:00
blessedcoolant
9a383e456d Codesplit SCHEDULER_MAP for reuse 2023-05-12 00:40:03 +12:00
blessedcoolant
3ffff023b2 Add missing key to scheduler_map
It was breaking because the sampler was not being reset, so each entry needs its own key. Will simplify this later.
2023-05-12 00:08:50 +12:00
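A sketch of the kind of shared scheduler map these commits describe, using diffusers scheduler classes; the exact entries and extra kwargs in InvokeAI's SCHEDULER_MAP may differ, and the DDIM fallback mirrors the "default to DDIM" commit below.

    from diffusers import (
        DDIMScheduler,
        DDPMScheduler,
        EulerAncestralDiscreteScheduler,
        EulerDiscreteScheduler,
        UniPCMultistepScheduler,
    )

    # One shared map, each entry carrying its own (class, init-kwargs) pair so
    # the scheduler is rebuilt (and therefore reset) every time it is selected.
    SCHEDULER_MAP = {
        "ddim": (DDIMScheduler, {}),
        "ddpm": (DDPMScheduler, {}),
        "euler": (EulerDiscreteScheduler, {}),
        "euler_a": (EulerAncestralDiscreteScheduler, {}),
        "euler_k": (EulerDiscreteScheduler, {"use_karras_sigmas": True}),
        "unipc": (UniPCMultistepScheduler, {}),
    }

    def get_scheduler(name: str, pipeline_scheduler):
        scheduler_class, extra_kwargs = SCHEDULER_MAP.get(name, SCHEDULER_MAP["ddim"])
        return scheduler_class.from_config(pipeline_scheduler.config, **extra_kwargs)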
blessedcoolant
d1029138d2 Default to DDIM if scheduler is missing 2023-05-11 22:54:35 +12:00
blessedcoolant
06b5800d28 Add UniPC Scheduler 2023-05-11 22:43:18 +12:00
Eugene Brodsky
3baa230077
Merge branch 'main' into lstein/bugfix/compel 2023-05-11 00:50:45 -04:00
Eugene
9e594f9018 pad conditioning tensors to same length
fixes crash when prompt length is greater than 75 tokens
2023-05-11 00:34:15 -04:00
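An illustrative sketch of padding two conditioning tensors to the same token length, which is the fix described above; repeating the last token embedding is a simplification of whatever padding the actual code uses.

    import torch

    def pad_to_same_length(cond: torch.Tensor, uncond: torch.Tensor):
        # cond/uncond: (batch, tokens, embedding_dim). When the prompt exceeds
        # 75 tokens the two tensors can differ along the token axis, which
        # crashes classifier-free guidance; pad the shorter one to match.
        max_len = max(cond.shape[1], uncond.shape[1])

        def pad(t: torch.Tensor) -> torch.Tensor:
            if t.shape[1] < max_len:
                padding = t[:, -1:, :].expand(-1, max_len - t.shape[1], -1)
                return torch.cat([t, padding], dim=1)
            return t

        return pad(cond), pad(uncond)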
Lincoln Stein
8ad8c5c67a resolve conflicts with main 2023-05-11 00:19:20 -04:00
Lincoln Stein
4627910c5d added a wrapper model_manager_service and model events 2023-05-11 00:09:19 -04:00
psychedelicious
49db6f4fac fix(nodes): fix trivial typing issues 2023-05-11 11:55:51 +10:00
psychedelicious
206e6b1730 feat(nodes): wip inpaint node 2023-05-11 11:55:51 +10:00
Lincoln Stein
5d5157fc65 make conditioning.py work with compel 1.1.5 2023-05-10 18:08:33 -04:00
Lincoln Stein
99c692f397 check that model name matches format 2023-05-09 23:46:59 -04:00
Lincoln Stein
3d85e769ce clean up ckpt handling
- remove legacy ckpt loading code from model_cache
- added placeholders for lora and textual inversion model loading
2023-05-09 22:44:58 -04:00
Lincoln Stein
9cb962cad7 ckpt model conversion now done in ModelCache 2023-05-08 23:39:44 -04:00
Lincoln Stein
a108155544 added StALKeR779's great model size calculating routine 2023-05-08 21:47:03 -04:00
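A sketch of the usual way to compute a torch model's in-memory size (parameters plus buffers); this illustrates the idea, not necessarily the exact routine the commit adds.

    import torch

    def model_size_bytes(model: torch.nn.Module) -> int:
        # Sum the storage of all parameters and buffers to estimate how much
        # room the model occupies in the cache.
        size = sum(p.numel() * p.element_size() for p in model.parameters())
        size += sum(b.numel() * b.element_size() for b in model.buffers())
        return size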
Lincoln Stein
c15b49c805 implement StALKeR7779 requested API for fetching submodels 2023-05-07 23:18:17 -04:00
Lincoln Stein
fd63e36822 optimize subfolder so that it returns submodel if parent is in RAM 2023-05-07 21:39:11 -04:00
Lincoln Stein
4649920074 adjust t2i to work with new model structure 2023-05-07 19:06:49 -04:00
Lincoln Stein
667171ed90 cap model cache size using bytes, not # models 2023-05-07 18:07:28 -04:00
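A minimal sketch of byte-based cache capping as described in this commit: evict least-recently-used models until the running total fits under the byte limit, instead of counting models. Class and method names are illustrative.

    from collections import OrderedDict

    class ByteCappedCache:
        def __init__(self, max_bytes: int):
            self.max_bytes = max_bytes
            self._entries = OrderedDict()  # key -> (model, size_bytes), oldest first
            self._total = 0

        def get(self, key):
            if key in self._entries:
                self._entries.move_to_end(key)  # mark as recently used
                return self._entries[key][0]
            return None

        def put(self, key, model, size_bytes: int):
            if key in self._entries:
                self._total -= self._entries.pop(key)[1]
            self._entries[key] = (model, size_bytes)
            self._total += size_bytes
            # Evict least-recently-used entries until under the byte cap.
            while self._total > self.max_bytes and len(self._entries) > 1:
                _, (_, evicted_size) = self._entries.popitem(last=False)
                self._total -= evicted_size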