Most systems do best with an explicit value for the `MALLOC_MMAP_THRESHOLD_` env var, but some users may want to change or unset it.
Valid values are a positive integer or "unset" to unset the variable.
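Here's a minimal sketch of how a launcher could honor such a setting before starting the app process; the helper name and `value` source are hypothetical, not the project's actual code. Since glibc reads malloc tunables at process startup, the variable has to be in place before the server process is spawned:

```python
import os
import subprocess
import sys

def launch_with_malloc_threshold(value: str | None, argv: list[str]) -> int:
    """Spawn the app with MALLOC_MMAP_THRESHOLD_ configured per user setting.

    `value` is a positive integer (as a string), the literal "unset", or None
    to leave the inherited environment alone. (Illustrative helper only.)
    """
    env = os.environ.copy()
    if value == "unset":
        # Remove the variable so glibc falls back to its dynamic threshold.
        env.pop("MALLOC_MMAP_THRESHOLD_", None)
    elif value is not None:
        if not (value.isdigit() and int(value) > 0):
            raise ValueError(
                f"MALLOC_MMAP_THRESHOLD_ must be a positive integer or 'unset', got {value!r}"
            )
        env["MALLOC_MMAP_THRESHOLD_"] = value
    return subprocess.call([sys.executable, *argv], env=env)
```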
Closes #6007.
- Restructure & update code check workflows
- Add release workflow to handle checks/tests, build and publish to PyPI
- Add docs/RELEASE.md explaining the workflow & process
- `create_installer.sh`: Update to work with the release workflow
- `create_installer.sh` & `tag_release.sh`: Fix the ANSI escape codes for macOS
- `tag_release.sh`: Add check for python binary name
- `tag_release.sh`: Print `git remote -v` output
- `tag_release.sh`: Fix error when deleting nonexistent tags
Forcing a reinstall of dependencies has repeatedly proven useful in fixing
install issues, especially mismatched pytorch CPU/GPU versions, so there is
little downside to making it the default.
The performance impact should be negligible: packages are reinstalled from
the pip cache where possible and downloaded only when necessary. Some impact
may be felt on slower disks.
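For context, a hedged sketch of the kind of invocation involved: `--force-reinstall` is a standard pip flag, but the surrounding helper and the example pins are illustrative, not the installer's actual code.

```python
import subprocess
import sys

def install_requirements(requirements: list[str]) -> None:
    """Reinstall dependencies even if they are already present.

    --force-reinstall makes pip reinstall every listed package; wheels already
    in the local pip cache are reused, so only missing or updated
    distributions are downloaded. (Illustrative helper only.)
    """
    subprocess.check_call(
        [sys.executable, "-m", "pip", "install", "--force-reinstall", *requirements]
    )

# Hypothetical usage with example pins:
# install_requirements(["numpy==1.26.3", "fastapi==0.108.0"])
```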
Organise deps into 3 categories:
- Core generation dependencies, pinned for reproducible builds.
- Core application dependencies, pinned for reproducible builds.
- Auxiliary dependencies, pinned only if necessary.
I pinned / bumped these to latest:
- `controlnet_aux`
- `fastapi`
- `fastapi-events`
- `huggingface-hub`
- `numpy`
- `python-socketio`
- `torchmetrics`
- `transformers`
- `uvicorn`
I checked the release notes for these and didn't see any breaking changes that would affect us. `fastapi` has a breaking change in v108 related to background tasks, but it doesn't affect us.
I tested in a fresh venv. The app still works and I can generate on macOS.
Hopefully, enforcing explicitly pinned versions will reduce the issues where people end up with the CPU-only build of torch.
It also means we should periodically bump versions so we don't fall too far behind on our dependencies and face painful upgrades later.