**Describe the bug**

I started testing a new 1.5 model. Here is the log dump (hoping it comes out readable):
```
06:30:38.436 INFO AnyIO worker thread Scanning all model folders for models...  model_manager.py:418
06:30:39.401 INFO AnyIO worker thread Scanned 268 models. Nothing infected  model_manager.py:429
06:31:02.735 INFO cuda Session 1766662237799 starting task 2147733215120 on NVIDIA GeForce RTX 4070  task_manager.py:283
06:31:02.892 INFO cuda loading stable-diffusion model from B:\AIEasyDiffusionBeta\models\stable-diffusion\NEW 1.5 bravoSpaceVoyage_v20.safetensors to device: cuda  __init__.py:52
06:31:40.256 WARNING cuda WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:  cpp_lib.py:144
    PyTorch 2.2.2+cu121 with CUDA 1201 (you have 2.6.0+cu124)
    Python  3.9.13 (you have 3.9.21)
  Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
  Memory-efficient attention, SwiGLU, sparse and more won't be available.
  Set XFORMERS_MORE_DETAILS=1 for more details
06:31:40.690 WARNING cuda A matching Triton is not available, some optimizations will not be enabled  __init__.py:59
Traceback (most recent call last):
  File "B:\AIEasyDiffusionBeta\installer_files\env\lib\site-packages\xformers\__init__.py", line 55, in _is_triton_available
    from xformers.triton.softmax import softmax as triton_softmax  # noqa
  File "B:\AIEasyDiffusionBeta\installer_files\env\lib\site-packages\xformers\triton\softmax.py", line 11, in <module>
    import triton
ModuleNotFoundError: No module named 'triton'
B:\AIEasyDiffusionBeta\installer_files\env\lib\site-packages\xformers\ops\swiglu_op.py:107: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
  def forward(cls, ctx, x, w1, b1, w2, b2, w3, b3):
B:\AIEasyDiffusionBeta\installer_files\env\lib\site-packages\xformers\ops\swiglu_op.py:128: FutureWarning: `torch.cuda.amp.custom_bwd(args...)` is deprecated. Please use `torch.amp.custom_bwd(args..., device_type='cuda')` instead.
  def backward(cls, ctx, dx5):
B:\AIEasyDiffusionBeta\installer_files\env\lib\site-packages\diffusers\models\transformers\transformer_2d.py:34: FutureWarning: `Transformer2DModelOutput` is deprecated and will be removed in version 1.0.0. Importing `Transformer2DModelOutput` from `diffusers.models.transformer_2d` is deprecated and this will be removed in a future version. Please use `from diffusers.models.modeling_outputs import Transformer2DModelOutput`, instead.
  deprecate("Transformer2DModelOutput", "1.0.0", deprecation_message)
06:31:49.503 INFO cuda loading on diffusers  __init__.py:168
06:31:49.504 INFO cuda using config: B:\AIEasyDiffusionBeta\installer_files\env\lib\site-packages\sdkit\models\models_db\configs\v1-inference.yaml  __init__.py:170
06:31:49.512 INFO cuda using attn_precision: fp16
```
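The warning above pins down the mismatch: the installed xFormers wheel was built against PyTorch 2.2.2+cu121 and Python 3.9.13, while the environment runs PyTorch 2.6.0+cu124 and Python 3.9.21. As a minimal triage sketch (assuming only the standard `torch` and `xformers` packages, run with the same interpreter in `installer_files\env`), you can print the versions the warning is comparing:

```python
# Version-triage sketch: print the interpreter, PyTorch, CUDA, and xFormers
# versions that the xFormers warning compares against its build metadata.
import sys

import torch

print("Python :", sys.version.split()[0])  # log shows 3.9.21; wheel was built for 3.9.13
print("PyTorch:", torch.__version__)       # log shows 2.6.0+cu124; wheel was built for 2.2.2+cu121
print("CUDA   :", torch.version.cuda)      # CUDA toolkit this PyTorch build targets

try:
    import xformers
    print("xFormers:", xformers.__version__)
except ImportError as exc:
    print("xFormers not importable:", exc)
```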
I haven't been able to get xformers to install at all; every attempt failed, so I assumed it wasn't active, and I never saw this error until just now. I would love to get it installed, though, since it used to work. When I tried a month or more ago, the AI had me going in loops trying to get it in; I finally gave up, and now the error has popped up again for some reason.
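To tell whether xFormers is actually active rather than silently disabled, one option is to call its public attention op directly. This is a hedged sketch (the tensor shape is illustrative, and it assumes a CUDA build of PyTorch plus the `xformers` package from the log): `memory_efficient_attention` is exactly the feature the warning says won't be available, so it fails loudly if the extensions didn't load.

```python
# Availability check: memory_efficient_attention needs xFormers' compiled
# C++/CUDA extensions, so it raises if they failed to load (as the warning
# reports) and returns an output tensor if xFormers is actually working.
import torch
import xformers.ops as xops

# Illustrative shape: (batch, sequence_len, num_heads, head_dim), fp16 on GPU.
q = torch.randn(1, 16, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 16, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 16, 8, 64, device="cuda", dtype=torch.float16)

try:
    out = xops.memory_efficient_attention(q, k, v)
    print("memory-efficient attention works, output shape:", tuple(out.shape))
except Exception as exc:
    print("memory-efficient attention unavailable:", exc)
```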
**To Reproduce**
Steps to reproduce the behavior:
- Go to '...'
- Click on '....'
- Scroll down to '....'
- See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Desktop (please complete the following information):**
- OS:
- Browser:
- Version:

**Smartphone (please complete the following information):**
- Device:
- OS:
- Browser:
- Version:

**Additional context**
Add any other context about the problem here.