InvokeAI/invokeai/backend/model_management
Lincoln Stein 7ea995149e fixes to env parsing, textual inversion & help text
- Make environment variable settings case InSenSiTive:
  INVOKEAI_MAX_LOADED_MODELS and InvokeAI_Max_Loaded_Models
  environment variables will both set `max_loaded_models`

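  A minimal sketch of the intended behaviour, assuming the config is a
  pydantic (v1-style) BaseSettings subclass; the class and field layout
  below is illustrative, not the actual InvokeAIAppConfig definition:

      import os
      from pydantic import BaseSettings  # pydantic v1-style settings

      class ConfigSketch(BaseSettings):
          # hypothetical stand-in for InvokeAIAppConfig
          max_loaded_models: int = 2

          class Config:
              env_prefix = "INVOKEAI_"
              case_sensitive = False  # env var names matched case-insensitively

      os.environ["InvokeAI_Max_Loaded_Models"] = "4"
      print(ConfigSketch().max_loaded_models)  # 4; INVOKEAI_MAX_LOADED_MODELS also works
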
- Updated realesrgan to use the new config system.

- Updated textual_inversion_training to use the new config system.

- Discovered a race condition when InvokeAIAppConfig is created
  at module load time, which makes it impossible to customize
  or replace the help message produced by --help on the command
  line. To fix this, moved all calls to get_invokeai_config()
  from module load time to object initialization time. This also
  makes the code cleaner.

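  A schematic before/after for that change; the module path, class and
  attribute names below are placeholders, only get_invokeai_config()
  itself comes from this commit:

      # Before: the config was captured at import time, so anything that
      # customizes it later (such as the --help text) never took effect.
      #
      #   config = get_invokeai_config()
      #
      #   class Trainer:
      #       root = config.root

      # After: look the config up when the object is constructed.
      from invokeai.app.services.config import get_invokeai_config  # import path assumed

      class Trainer:
          def __init__(self):
              self.config = get_invokeai_config()
              self.root = self.config.root  # attribute chosen for illustration
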
- Added a `--from_file` argument to `invokeai-node-cli` and changed
  the GitHub Action to match. CI tests will hopefully work now.
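
  Roughly what the new CLI argument looks like; only the flag name comes
  from this commit, the help text and file semantics below are assumed:

      import argparse

      parser = argparse.ArgumentParser(prog="invokeai-node-cli")
      parser.add_argument(
          "--from_file",
          type=str,
          default=None,
          help="read commands from this file instead of stdin (semantics assumed)",
      )
      args = parser.parse_args(["--from_file", "commands.txt"])
      print(args.from_file)  # -> commands.txt
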
2023-05-18 10:48:23 -04:00
__init__.py Merge branch 'main' into bugfix/prevent-cli-crash 2023-04-07 18:55:54 -04:00
convert_ckpt_to_diffusers.py fixes to env parsing, textual inversion & help text 2023-05-18 10:48:23 -04:00
model_manager.py fixes to env parsing, textual inversion & help text 2023-05-18 10:48:23 -04:00