Bruno Martin Bonifaz Cervetto
Participant
Hi Keymaster 3000!
Thanks for the instructions! I was able to install CUDA successfully:
"nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2024 NVIDIA Corporation
Built on Thu_Mar_28_02:30:10_Pacific_Daylight_Time_2024
Cuda compilation tools, release 12.4, V12.4.131
Build cuda_12.4.r12.4/compiler.34097967_0"
I also ran the following install command successfully:
"pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121"
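In case it is useful, here is the quick sanity check I plan to run to confirm that the CUDA build of torch is actually active (I am assuming torch.cuda.is_available() is the right call for this):
"import torch
# version string should end in +cu121 for the CUDA 12.1 wheel
print(torch.__version__)
# True means this torch build can see the GPU and driver
print(torch.cuda.is_available())"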
However, when I run the following code in Python:
"import torch
from diffusers import StableDiffusionPipeline
pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16)
pipe = pipe.to('cuda')
pipe.enable_attention_slicing()
pipe.enable_xformers_memory_efficient_attention()
prompt = 'an apple'
img = pipe(prompt).images[0]
type(img)
img
directory = r"C:\Users\bruno\OneDrive\Escritorio\Waifu\Logopaster"
img.save('result.png')"
I receive the following error messages:
"Traceback (most recent call last):
  File "C:\Users\bruno\OneDrive\Escritorio\Waifu\AI EXPERT ACADEMY\Scripts\import stablediffusionpipeline.py", line 8, in <module>
    pipe.enable_xformers_memory_efficient_attention()
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 1569, in enable_xformers_memory_efficient_attention
    self.set_use_memory_efficient_attention_xformers(True, attention_op)
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 1595, in set_use_memory_efficient_attention_xformers
    fn_recursive_set_mem_eff(module)
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 1585, in fn_recursive_set_mem_eff
    module.set_use_memory_efficient_attention_xformers(valid, attention_op)
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\models\modeling_utils.py", line 259, in set_use_memory_efficient_attention_xformers
    fn_recursive_set_mem_eff(module)
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\models\modeling_utils.py", line 255, in fn_recursive_set_mem_eff
    fn_recursive_set_mem_eff(child)
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\models\modeling_utils.py", line 255, in fn_recursive_set_mem_eff
    fn_recursive_set_mem_eff(child)
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\models\modeling_utils.py", line 255, in fn_recursive_set_mem_eff
    fn_recursive_set_mem_eff(child)
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\models\modeling_utils.py", line 252, in fn_recursive_set_mem_eff
    module.set_use_memory_efficient_attention_xformers(valid, attention_op)
  File "C:\Users\bruno\AppData\Local\Programs\Python\Python312\Lib\site-packages\diffusers\models\attention_processor.py", line 253, in set_use_memory_efficient_attention_xformers
    raise ModuleNotFoundError(
ModuleNotFoundError: Refer to https://github.com/facebookresearch/xformers for more information on how to install xformers"
I don't know how to fix this. I have already tried reinstalling the module in CMD with:
"--xformers --reinstall-xformers"
I have also tried installing it again with:
"pip install -q accelerate transformers ftfy bitsandbytes gradio natsort safetensors xformers"
but when I check which version of xformers I have installed, running the following command in CMD:
"python -m xformers.info"
still gives me this error:
"C:\Users\bruno\AppData\Local\Programs\Python\Python312\python.exe: Error while finding module specification for 'xformers.info' (ModuleNotFoundError: No module named 'xformers')"
What should I do?
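In case it helps narrow things down, here is a small check I can run to see whether the interpreter I launch and the environment pip installs into actually line up (my assumption is that find_spec would return something other than None if the xformers install had really landed in this interpreter):
"import sys
import importlib.util
# the interpreter that is actually running this script
print(sys.executable)
# None here means this interpreter cannot see any installed xformers
print(importlib.util.find_spec("xformers"))"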
Thanks in advance for your help!