Mirror of https://github.com/invoke-ai/InvokeAI (synced 2024-08-30 20:32:17 +00:00)
xformers==0.0.20 (#4881)
I'm not sure if this is the correct way of handling things, but changing this specifier to '==0.0.20' fixes the xformers install for me, and maybe it will for others too. Sorry if the PR itself isn't done correctly. Please see [this thread](https://github.com/facebookresearch/xformers/issues/740); it describes the issue I hit (trying to install InvokeAI the Automatic/Manual/StableMatrix way).

With `~=0.0.19` (which resolves to 0.0.22):

```
(InvokeAI) pip install torch torchvision xformers~=0.0.19
Collecting torch
  Obtaining dependency information for torch from edce54779f/torch-2.1.0-cp311-cp311-win_amd64.whl.metadata
  Using cached torch-2.1.0-cp311-cp311-win_amd64.whl.metadata (25 kB)
Collecting torchvision
  Obtaining dependency information for torchvision from ab6f42af83/torchvision-0.16.0-cp311-cp311-win_amd64.whl.metadata
  Using cached torchvision-0.16.0-cp311-cp311-win_amd64.whl.metadata (6.6 kB)
Collecting xformers
  Using cached xformers-0.0.22.post3.tar.gz (3.9 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "C:\Users\Drun\invokeai\.venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "C:\Users\Drun\invokeai\.venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\Drun\invokeai\.venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\Drun\AppData\Local\Temp\pip-build-env-rmhvraqj\overlay\Lib\site-packages\setuptools\build_meta.py", line 355, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\Drun\AppData\Local\Temp\pip-build-env-rmhvraqj\overlay\Lib\site-packages\setuptools\build_meta.py", line 325, in _get_build_requires
          self.run_setup()
        File "C:\Users\Drun\AppData\Local\Temp\pip-build-env-rmhvraqj\overlay\Lib\site-packages\setuptools\build_meta.py", line 507, in run_setup
          super(_BuildMetaLegacyBackend, self).run_setup(setup_script=setup_script)
        File "C:\Users\Drun\AppData\Local\Temp\pip-build-env-rmhvraqj\overlay\Lib\site-packages\setuptools\build_meta.py", line 341, in run_setup
          exec(code, locals())
        File "<string>", line 23, in <module>
      ModuleNotFoundError: No module named 'torch'
```

With `==0.0.20`:

```
(InvokeAI) pip install torch torchvision xformers==0.0.20
Collecting torch
  Obtaining dependency information for torch from edce54779f/torch-2.1.0-cp311-cp311-win_amd64.whl.metadata
  Using cached torch-2.1.0-cp311-cp311-win_amd64.whl.metadata (25 kB)
Collecting torchvision
  Obtaining dependency information for torchvision from ab6f42af83/torchvision-0.16.0-cp311-cp311-win_amd64.whl.metadata
  Using cached torchvision-0.16.0-cp311-cp311-win_amd64.whl.metadata (6.6 kB)
Collecting xformers==0.0.20
  Obtaining dependency information for xformers==0.0.20 from d4a42f582a/xformers-0.0.20-cp311-cp311-win_amd64.whl.metadata
  Using cached xformers-0.0.20-cp311-cp311-win_amd64.whl.metadata (1.1 kB)
Collecting numpy (from xformers==0.0.20)
  Obtaining dependency information for numpy from 3f826c6d15/numpy-1.26.0-cp311-cp311-win_amd64.whl.metadata
  Using cached numpy-1.26.0-cp311-cp311-win_amd64.whl.metadata (61 kB)
Collecting pyre-extensions==0.0.29 (from xformers==0.0.20)
  Using cached pyre_extensions-0.0.29-py3-none-any.whl (12 kB)
Collecting torch
  Using cached torch-2.0.1-cp311-cp311-win_amd64.whl (172.3 MB)
Collecting filelock (from torch)
  Obtaining dependency information for filelock from 97afbafd9d/filelock-3.12.4-py3-none-any.whl.metadata
  Using cached filelock-3.12.4-py3-none-any.whl.metadata (2.8 kB)
Requirement already satisfied: typing-extensions in c:\users\drun\invokeai\.venv\lib\site-packages (from torch) (4.8.0)
Requirement already satisfied: sympy in c:\users\drun\invokeai\.venv\lib\site-packages (from torch) (1.12)
Collecting networkx (from torch)
  Using cached networkx-3.1-py3-none-any.whl (2.1 MB)
Collecting jinja2 (from torch)
  Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting typing-inspect (from pyre-extensions==0.0.29->xformers==0.0.20)
  Obtaining dependency information for typing-inspect from 107a22063b/typing_inspect-0.9.0-py3-none-any.whl.metadata
  Using cached typing_inspect-0.9.0-py3-none-any.whl.metadata (1.5 kB)
Collecting requests (from torchvision)
  Obtaining dependency information for requests from 0e2d847013/requests-2.31.0-py3-none-any.whl.metadata
  Using cached requests-2.31.0-py3-none-any.whl.metadata (4.6 kB)
INFO: pip is looking at multiple versions of torchvision to determine which version is compatible with other requirements. This could take a while.
Collecting torchvision
  Using cached torchvision-0.15.2-cp311-cp311-win_amd64.whl (1.2 MB)
Collecting pillow!=8.3.*,>=5.3.0 (from torchvision)
  Obtaining dependency information for pillow!=8.3.*,>=5.3.0 from debe992677/Pillow-10.0.1-cp311-cp311-win_amd64.whl.metadata
  Using cached Pillow-10.0.1-cp311-cp311-win_amd64.whl.metadata (9.6 kB)
Collecting MarkupSafe>=2.0 (from jinja2->torch)
  Obtaining dependency information for MarkupSafe>=2.0 from 08b85bc194/MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl.metadata
  Using cached MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl.metadata (3.1 kB)
Collecting charset-normalizer<4,>=2 (from requests->torchvision)
  Obtaining dependency information for charset-normalizer<4,>=2 from 50028bbb26/charset_normalizer-3.3.0-cp311-cp311-win_amd64.whl.metadata
  Using cached charset_normalizer-3.3.0-cp311-cp311-win_amd64.whl.metadata (33 kB)
Collecting idna<4,>=2.5 (from requests->torchvision)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests->torchvision)
  Obtaining dependency information for urllib3<3,>=1.21.1 from 9957270221/urllib3-2.0.6-py3-none-any.whl.metadata
  Using cached urllib3-2.0.6-py3-none-any.whl.metadata (6.6 kB)
Collecting certifi>=2017.4.17 (from requests->torchvision)
  Obtaining dependency information for certifi>=2017.4.17 from 2234eab223/certifi-2023.7.22-py3-none-any.whl.metadata
  Using cached certifi-2023.7.22-py3-none-any.whl.metadata (2.2 kB)
Requirement already satisfied: mpmath>=0.19 in c:\users\drun\invokeai\.venv\lib\site-packages (from sympy->torch) (1.3.0)
Collecting mypy-extensions>=0.3.0 (from typing-inspect->pyre-extensions==0.0.29->xformers==0.0.20)
  Using cached mypy_extensions-1.0.0-py3-none-any.whl (4.7 kB)
Using cached xformers-0.0.20-cp311-cp311-win_amd64.whl (97.6 MB)
Using cached Pillow-10.0.1-cp311-cp311-win_amd64.whl (2.5 MB)
Using cached filelock-3.12.4-py3-none-any.whl (11 kB)
Using cached numpy-1.26.0-cp311-cp311-win_amd64.whl (15.8 MB)
Using cached requests-2.31.0-py3-none-any.whl (62 kB)
Using cached certifi-2023.7.22-py3-none-any.whl (158 kB)
Using cached charset_normalizer-3.3.0-cp311-cp311-win_amd64.whl (97 kB)
Using cached MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl (17 kB)
Using cached urllib3-2.0.6-py3-none-any.whl (123 kB)
Using cached typing_inspect-0.9.0-py3-none-any.whl (8.8 kB)
Installing collected packages: urllib3, pillow, numpy, networkx, mypy-extensions, MarkupSafe, idna, filelock, charset-normalizer, certifi, typing-inspect, requests, jinja2, torch, pyre-extensions, xformers, torchvision
Successfully installed MarkupSafe-2.1.3 certifi-2023.7.22 charset-normalizer-3.3.0 filelock-3.12.4 idna-3.4 jinja2-3.1.2 mypy-extensions-1.0.0 networkx-3.1 numpy-1.26.0 pillow-10.0.1 pyre-extensions-0.0.29 requests-2.31.0 torch-2.0.1 torchvision-0.15.2 typing-inspect-0.9.0 urllib3-2.0.6 xformers-0.0.20
```

## What type of PR is this? (check all applicable)

- [ ] Refactor
- [ ] Feature
- [x] Bug Fix
- [ ] Optimization
- [ ] Documentation Update
- [ ] Community Node Submission

## Have you discussed this change with the InvokeAI team?

- [ ] Yes
- [x] No, because: it's a no-brainer. It fixed the issue for me, so I opened this PR. Who knows?

## Technical details:

Windows 11, standalone, clean and freshly installed Python 3.11
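For what it's worth, the version-resolution behaviour behind the failure can be reproduced with a minimal sketch using the third-party `packaging` library (the same machinery pip vendors for specifier matching); the version numbers below are the ones that appear in the logs above:

```python
# Minimal sketch (assumes the third-party "packaging" library is installed):
# PEP 440 treats "~=0.0.19" as ">=0.0.19, ==0.0.*", so pip may pick newer
# 0.0.x releases such as 0.0.22.post3, which in the log above is downloaded
# as an sdist whose setup.py imports torch before torch is installed.
from packaging.specifiers import SpecifierSet

compatible = SpecifierSet("~=0.0.19")  # the old pin in pyproject.toml
exact = SpecifierSet("==0.0.20")       # the pin proposed here

for candidate in ("0.0.19", "0.0.20", "0.0.22.post3"):
    print(f"{candidate:>12}  ~=0.0.19: {compatible.contains(candidate)!s:5} "
          f" ==0.0.20: {exact.contains(candidate)}")
# Only the exact pin keeps pip on a version that, per the second log above,
# ships a prebuilt Windows wheel instead of requiring a source build.
```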
This commit is contained in: commit 840cbc1d39
@@ -109,8 +109,8 @@ dependencies = [
     "pytest-datadir",
 ]
 "xformers" = [
-  "xformers~=0.0.19; sys_platform!='darwin'",
-  "triton; sys_platform=='linux'",
+  "xformers==0.0.21; sys_platform!='darwin'",
+  "triton; sys_platform=='linux'",
 ]
 "onnx" = ["onnxruntime"]
 "onnx-cuda" = ["onnxruntime-gpu"]
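As a quick sanity check after installing with the `xformers` extra (e.g. `pip install "InvokeAI[xformers]"`, assuming the usual extras install path), a small sketch that prints the versions pip actually resolved for the packages touched by this pin:

```python
# Small sanity-check sketch: report the versions pip resolved for the
# packages named in the extra above. Uses only the standard library.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("xformers", "triton", "torch"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
# With the change above, xformers should come back as the pinned 0.0.21
# rather than a 0.0.22.x release that has to be built from source on Windows.
```

On Windows, `triton` is expected to show as not installed, since its marker restricts it to `sys_platform=='linux'`.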