Forum Replies Created
aiexpert_adm
Keymaster
Hello Jacob! Based on your most recent message, I believe you were able to find the document, correct?
If you are still having difficulty finding any resource, please let us know.
aiexpert_adm
Keymaster
Hi!
A solution that helped other people who had this exact problem is to run this command: "pip install --force-reinstall --no-deps --pre xformers"
If you still see a message about xformers not being loaded, add "--xformers" to "COMMANDLINE_ARGS=".
To do this, use:
"set COMMANDLINE_ARGS= --xformers --opt-sdp-no-mem-attention --listen --enable-insecure-extension-access"
If that doesn’t work, don’t worry, try this exact method instead:
https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Xformers
(This video can also help: https://www.youtube.com/watch?v=ZVqalCax6MA, but prefer the instructions above and follow them exactly as shown.)
Please let us know if any problem still occurs.
Another thing you could do is skip the line that calls the xformers method (enable_xformers_memory_efficient_attention), because its use is not mandatory; it is still worth enabling when possible, though, since it optimizes memory usage.
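As a rough sketch, the "optional xformers" idea above can be wrapped in a small helper. The function name and the dummy usage are illustrative assumptions; enable_xformers_memory_efficient_attention is the actual diffusers pipeline method mentioned above:

```python
def maybe_enable_xformers(pipe):
    """Try to enable xformers memory-efficient attention on a diffusers
    pipeline, falling back gracefully if xformers is missing or broken.

    xformers is optional: the pipeline still runs without it, it just
    uses more GPU memory.
    """
    try:
        pipe.enable_xformers_memory_efficient_attention()
        return True
    except Exception:
        # xformers not installed / incompatible -- continue without it
        return False
```

You would call this right after building the pipeline; a False return simply means the run proceeds without the memory optimization.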
aiexpert_adm
Keymaster
Hi Bruno!
Yes, you can use the same code. What may be different are the library installation commands, which may vary depending on your operating system and environment.
Did you install pytorch using pip or conda?
It sounds like you installed pytorch without CUDA support.
You could install CUDA from NVIDIA’s website -> https://developer.nvidia.com/cuda-downloads
After the install ends, open a new terminal and check your CUDA version with this command: nvcc --version
Then, go to https://pytorch.org/get-started/locally/
and select your OS, your preferred package manager (pip or Anaconda), and the CUDA version you installed; then copy the generated install command and execute it in your terminal. Just remember to uninstall torch first.
Please let us know if the problem persists.
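To cut down on copy-paste errors, the "match the pip command to your nvcc output" step can be sketched as a small helper. The cuXYZ wheel tag and index URL pattern below are assumptions (PyTorch only publishes wheels for specific CUDA versions), so always confirm the final command on the pytorch.org selector:

```python
import re

def pip_command_for_cuda(nvcc_output):
    """Build a candidate pip install command for PyTorch from the
    release number printed by `nvcc --version`.

    The derived cuXYZ tag is a guess -- verify the result against
    https://pytorch.org/get-started/locally/ before running it.
    """
    m = re.search(r"release (\d+)\.(\d+)", nvcc_output)
    if m is None:
        raise ValueError("could not find a CUDA release in nvcc output")
    tag = f"cu{m.group(1)}{m.group(2)}"
    return (f"pip install torch torchvision "
            f"--index-url https://download.pytorch.org/whl/{tag}")
```

For example, feeding it the line "Cuda compilation tools, release 12.1, V12.1.105" yields a command targeting the cu121 wheel index.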
aiexpert_adm
Keymaster
Hi Lorant!
To solve this error and also use CUDA on Colab, insert the command below at the beginning of your Colab:
!pip install "jax[cuda12_local]==0.4.23" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
Then you can continue to run the rest of the code normally, in order. If you prefer, check out the Colab here, which is updated and therefore already contains this command at the beginning of the file:
https://colab.research.google.com/drive/1aoM-30yLodra-xjRtyPaOORvfbAnUxeJ
By the way, this message started to appear last week, due to an update to the library used by the Stable Diffusion implementation.
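After running the pinned install in the first cell, a quick check like the sketch below (the helper name is illustrative) confirms which backend JAX actually picked up; on a CUDA Colab runtime it should report "gpu":

```python
def jax_backend_summary():
    """Report the JAX version and default backend, or a note when
    jax is not installed at all in the current environment."""
    try:
        import jax
    except ImportError:
        return "jax not installed"
    return f"jax {jax.__version__} on backend {jax.default_backend()}"

print(jax_backend_summary())
```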