InvokeAI/ldm/invoke
Latest commit 3929bd3e13 by Lincoln Stein: Lstein release candidate 2.2.5 (#2137)
* installer tweaks in preparation for v2.2.5

- pin numpy to 1.23.* to avoid requirements conflict with numba
- update.sh and update.bat now accept a tag or branch string, not a URL
- update scripts download latest requirements-base before updating.
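
The update scripts themselves are shell and batch files; as a rough, hedged illustration of the idea only, here is a Python sketch of how a tag or branch string (rather than a full URL) can be resolved into the GitHub archive and requirements-base URLs an updater would fetch. The exact file layout and names below are assumptions, not the scripts' actual code.

```python
# Illustration only: the real logic lives in update.sh / update.bat.
REPO = "https://github.com/invoke-ai/InvokeAI"
RAW = "https://raw.githubusercontent.com/invoke-ai/InvokeAI"

def update_urls(ref: str) -> dict:
    """Map a tag or branch string (e.g. 'v2.2.5' or 'main') to download URLs."""
    return {
        # GitHub serves a source archive for any tag or branch name
        "archive": f"{REPO}/archive/{ref}.zip",
        # fetch the latest base requirements for that ref before updating
        # (exact path and file name of the requirements file is an assumption)
        "requirements": f"{RAW}/{ref}/requirements-base.txt",
    }

if __name__ == "__main__":
    for name, url in update_urls("v2.2.5").items():
        print(f"{name}: {url}")
```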

* update.bat.in debugged and working

* update pulls from "latest" now

* bump version number

* fix permissions on create_installer.sh

* give Linux user option of installing ROCm or CUDA
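
The actual prompt lives in the shell installer; the sketch below only illustrates the underlying mechanism, which is choosing a PyTorch wheel index for the selected backend. The index URLs shown are illustrative and may not match the exact versions the installer pins.

```python
# Sketch of the ROCm-vs-CUDA choice the Linux installer offers.
# The real prompt is in the shell installer; index URLs are illustrative.
import subprocess
import sys

PYTORCH_INDEX = {
    "cuda": "https://download.pytorch.org/whl/cu117",    # NVIDIA GPUs
    "rocm": "https://download.pytorch.org/whl/rocm5.2",  # AMD GPUs
}

def install_torch(backend: str) -> None:
    """Install torch/torchvision from the wheel index matching the chosen backend."""
    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "torch", "torchvision",
        "--extra-index-url", PYTORCH_INDEX[backend],
    ])

if __name__ == "__main__":
    choice = input("GPU backend [cuda/rocm]: ").strip().lower() or "cuda"
    install_torch(choice)
```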

* rc2.2.5 (install.sh) relative path fixes (#2155)

* (installer) fix a bug in the resolution of relative paths in the Linux install script

point installer at 2.2.5-rc1

selecting ~/Data/myapps/ as the location would create a literal
./~/Data/myapps instead of expanding ~/ to the value of ${HOME}

also, squash the trailing slash in the path if the user entered one
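
The fix itself is in the shell installer; the Python sketch below only illustrates the equivalent path normalization: expand a leading ~ to ${HOME} and drop any trailing slash before using the path.

```python
import os

def normalize_install_path(raw: str) -> str:
    """Expand a leading '~' to $HOME and strip any trailing slash.

    Without the expansion, a path like '~/Data/myapps/' would be treated
    literally and create './~/Data/myapps' relative to the current directory.
    """
    path = os.path.expanduser(raw.strip())  # '~/Data/myapps/' -> '/home/user/Data/myapps/'
    path = path.rstrip("/") or "/"          # squash the trailing slash, keep a bare '/'
    return os.path.abspath(path)            # resolve any remaining relative components

print(normalize_install_path("~/Data/myapps/"))  # e.g. /home/user/Data/myapps
```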

* (installer) add option to automatically start the app after install

also: when exiting, print the command to get back into the app

* remove extraneous whitespace

* model_cache applies rootdir to config path
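
A hedged sketch of what "applies rootdir to config path" means: a relative model-config path is resolved against the InvokeAI root directory (the value normally held in Globals.root) instead of the current working directory. The helper name and example path below are illustrative, not the actual model_cache code.

```python
import os

def resolve_config_path(config_path: str, root_dir: str) -> str:
    """Resolve a model config path against the InvokeAI root directory.

    Hypothetical helper: relative entries such as
    'configs/stable-diffusion/v1-inference.yaml' are joined onto root_dir,
    while absolute paths pass through unchanged.
    """
    if os.path.isabs(config_path):
        return config_path
    return os.path.join(root_dir, config_path)

print(resolve_config_path("configs/stable-diffusion/v1-inference.yaml", "/home/user/invokeai"))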

* bring installers up to date with 2.2.5-rc2

* bump rc version

* create_installer now adds version number

* rebuild frontend

* bump rc#

* add locales to frontend dist package

- bump to patchlevel 6

* bump patchlevel

* use invoke-ai version of GFPGAN

- This version is very slightly modified to allow weights files
  to be pre-downloaded by the configure script.
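
The practical effect is that the restorer can be handed weights the configure script has already placed on disk, so nothing is fetched at run time. A hedged sketch using the public GFPGANer API; the weights path is illustrative.

```python
from gfpgan import GFPGANer

# Point GFPGAN at weights the configure script has already downloaded,
# instead of letting it fetch them at run time. Path is illustrative.
WEIGHTS = "/home/user/invokeai/models/gfpgan/GFPGANv1.4.pth"

restorer = GFPGANer(
    model_path=WEIGHTS,      # pre-downloaded checkpoint rather than a URL
    upscale=1,
    arch="clean",
    channel_multiplier=2,
    bg_upsampler=None,
)
```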

* fix formatting error during startup

* bump patch level

* workaround #2 for GFPGAN/facexlib weights downloading

* bump patch

* ready for merge and release

* remove extraneous comment

* set PYTORCH_ENABLE_MPS_FALLBACK directly in invoke.py
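
A minimal sketch of the idea, assuming the variable is exported before torch is imported so that operators not yet implemented on Apple's MPS backend fall back to the CPU instead of raising an error:

```python
import os

# Must be set before torch is imported for the MPS fallback to take effect.
os.environ.setdefault("PYTORCH_ENABLE_MPS_FALLBACK", "1")

import torch  # noqa: E402  (deliberately imported after setting the variable)

if torch.backends.mps.is_available():
    print("MPS available; unsupported ops will fall back to CPU")
```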

Co-authored-by: Eugene Brodsky <ebr@users.noreply.github.com>
2023-01-01 17:54:45 +00:00
Name              Last commit message                                                 Date
generator         correct a crash in img2img under particular circumstances (#2088)   2022-12-22 14:53:23 +00:00
restoration       Lstein release candidate 2.2.5 (#2137)                              2023-01-01 17:54:45 +00:00
__init__.py       add --version to invoke.py arguments (#2038)                        2022-12-20 15:14:28 +00:00
args.py           [docs] Provide an example of reading prompts from a script (#2087)  2022-12-23 14:06:59 +00:00
CLI.py            Lstein release candidate 2.2.5 (#2137)                              2023-01-01 17:54:45 +00:00
concepts_lib.py   Global replace [ \t]+$, add "GB" (#1751)                            2022-12-19 16:36:39 +00:00
conditioning.py   restrict to 75 tokens and correctly handle blends                   2022-12-14 16:54:27 -05:00
devices.py        rename all modules from ldm.dream to ldm.invoke                     2022-10-08 11:37:23 -04:00
globals.py        defer patchmatch loading (#2039)                                    2022-12-20 15:32:35 -08:00
image_util.py     Global replace [ \t]+$, add "GB" (#1751)                            2022-12-19 16:36:39 +00:00
log.py            rename all modules from ldm.dream to ldm.invoke                     2022-10-08 11:37:23 -04:00
model_cache.py    Lstein release candidate 2.2.5 (#2137)                              2023-01-01 17:54:45 +00:00
patchmatch.py     defer patchmatch loading (#2039)                                    2022-12-20 15:32:35 -08:00
pngwriter.py      correct bug when trying to enhance JPG images (#1928)               2022-12-11 13:48:47 -05:00
prompt_parser.py  Save and display per-token attention maps (#1866)                   2022-12-10 15:57:41 +01:00
readline.py       Save and display per-token attention maps (#1866)                   2022-12-10 15:57:41 +01:00
seamless.py       Global replace [ \t]+$, add "GB" (#1751)                            2022-12-19 16:36:39 +00:00
server_legacy.py  Global replace [ \t]+$, add "GB" (#1751)                            2022-12-19 16:36:39 +00:00
server.py         Global replace [ \t]+$, add "GB" (#1751)                            2022-12-19 16:36:39 +00:00
txt2mask.py       Global replace [ \t]+$, add "GB" (#1751)                            2022-12-19 16:36:39 +00:00