Update docs/installation/030_INSTALL_CUDA_AND_ROCM.md
Co-authored-by: Eugene Brodsky <ebr@users.noreply.github.com>
parent c6a2ba12e2
commit 4d74af2363
@@ -49,10 +49,10 @@ running `nvidia-smi` from the command line.
 
 ### Linux Install with a Runtime Container
 
-On Linux systems, an alternative to installing the driver directly on
-your system is to run an NVIDIA software container that has the driver
-already in place. This is recommended if you are already familiar with
-containerization technologies such as Docker.
+On Linux systems, an alternative to installing CUDA Toolkit directly on
+your system is to run an NVIDIA software container that has the CUDA
+libraries already in place. This is recommended if you are already
+familiar with containerization technologies such as Docker.
 
 For downloads and instructions, visit the [NVIDIA CUDA Container
 Runtime Site](https://developer.nvidia.com/nvidia-container-runtime)
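As a quick sanity check for the container-based approach described in the changed paragraph, a minimal sketch might look like the following. It assumes Docker and the NVIDIA Container Toolkit are already installed; the CUDA image tag is illustrative and not taken from the commit.

```sh
# Run nvidia-smi inside an NVIDIA CUDA container to confirm that the runtime
# exposes the GPU and that the bundled CUDA libraries are usable.
# The image tag below is an example; pick a CUDA version that suits your setup.
docker run --rm --gpus all nvidia/cuda:11.7.1-runtime-ubuntu22.04 nvidia-smi
```

If this prints the same GPU table as running `nvidia-smi` on the host, the container runtime is wired up correctly and no host-side CUDA Toolkit install is needed.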