FAILED: /tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/build/temp.linux-x86_64-cpython-310/xformers/csrc/attention/hip_fmha/instances/fmha_batched_backward_bf16_has_mask_has_bias_has_biasgrad_has_dropout_maxk_256.o
/opt/rocm-6.3.2/bin/hipcc -I/tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/xformers/csrc -I/tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/xformers/csrc/attention/hip_fmha -I/tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/xformers/csrc/attention/hip_decoder -I/tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/third_party/composable_kernel_tiled/include -I/home/rrunner/ai/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/include -I/home/rrunner/ai/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/include/torch/csrc/api/include -I/home/rrunner/ai/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/include/THH -I/opt/rocm-6.3.2/include -I/home/rrunner/ai/stable-diffusion-webui/venv/include -I/usr/include/python3.10 -c -c /tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/xformers/csrc/attention/hip_fmha/instances/fmha_batched_backward_bf16_has_mask_has_bias_has_biasgrad_has_dropout_maxk_256.hip -o /tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/build/temp.linux-x86_64-cpython-310/xformers/csrc/attention/hip_fmha/instances/fmha_batched_backward_bf16_has_mask_has_bias_has_biasgrad_has_dropout_maxk_256.o -fPIC -D__HIP_PLATFORM_AMD__=1 -DUSE_ROCM=1 -DHIPBLAS_V2 -DCUDA_HAS_FP16=1 -D__HIP_NO_HALF_OPERATORS__=1 -D__HIP_NO_HALF_CONVERSIONS__=1 -O3 -std=c++17 --offload-arch=native --offload-compress -U__CUDA_NO_HALF_OPERATORS__ -U__CUDA_NO_HALF_CONVERSIONS__ -DCK_TILE_FMHA_FWD_FAST_EXP2=1 -fgpu-flush-denormals-to-zero -Werror -Wc++11-narrowing -Woverloaded-virtual -mllvm -enable-post-misched=0 -mllvm -amdgpu-early-inline-all=true -mllvm -amdgpu-function-calls=false -mllvm -greedy-reverse-local-assignment=1 -DBUILD_PYTHON_PACKAGE -DTORCH_API_INCLUDE_EXTENSION_H '-DPYBIND11_COMPILER_TYPE="_gcc"' '-DPYBIND11_STDLIB="_libstdcpp"' '-DPYBIND11_BUILD_ABI="_cxxabi1016"' -DTORCH_EXTENSION_NAME=_C -D_GLIBCXX_USE_CXX11_ABI=1 -fno-gpu-rdc
In file included from /tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/xformers/csrc/attention/hip_fmha/instances/fmha_batched_backward_bf16_has_mask_has_bias_has_biasgrad_has_dropout_maxk_256.hip:15:
In file included from /tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/xformers/csrc/attention/hip_fmha/ck_tiled_fmha_batched_backward_hip.h:14:
In file included from /tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/third_party/composable_kernel_tiled/include/ck_tile/ops/fmha_hip.hpp:19:
In file included from /tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/third_party/composable_kernel_tiled/include/ck_tile/ops/fmha/pipeline/block_fmha_bwd_convert_dq_hip.hpp:8:
/tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/third_party/composable_kernel_tiled/include/ck_tile/ops/fmha/pipeline/block_fmha_bwd_pipeline_default_policy_hip.hpp:596:27: error: constexpr variable 'M0' must be initialized by a constant expression
596 | constexpr index_t M0 = kMPerBlock / (M1 * M2);
| ^ ~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-install-6m4zi23n/xformers_9b8fe45f2094469d8eec921f4d554a45/third_party/composable_kernel_tiled/include/ck_tile/ops/fmha/pipeline/block_fmha_bwd_convert_dq_hip.hpp:56:47: note: in instantiation of function template specialization 'ck_tile::BlockFmhaBwdPipelineDefaultPolicy::MakePostQGradDramTileDistribution<ck_tile::BlockFmhaBwdConvertQGradPipelineProblem<float, unsigned short, 256, 16, 64, 256, false, false, ck_tile::TileFmhaBwdConvertQGradTraits<true, true>>>' requested here
56 | Policy::template MakePostQGradDramTileDistribution<Problem>());
| ^
What I have tried:
- Install ROCm 6.2.4 -> same issue
- Install ROCm 6.3.0 -> same issue
- Compile the Composable Kernel from ROCm separately -> no error
I am not sure why the error occurs on standard Ubuntu but not under WSL2 on Windows. It does not make sense. Does anyone know what the issue could be here? It looks like a division by zero inside a constexpr expression.
Successfully preprocessed all matching files.
Total number of unsupported CUDA function calls: 0
Total number of replaced kernel launches: 9
running clean
'build/lib.linux-x86_64-cpython-310' does not exist -- can't clean it
'build/bdist.linux-x86_64' does not exist -- can't clean it
'build/scripts-3.10' does not exist -- can't clean it
Failed to build xformers
ERROR: Failed to build installable wheels for some pyproject.toml based projects (xformers)
❓ Questions and Help
I am running one PC with an AMD 7900 XTX, with dual boot.
Python version: 3.10.16
The approach is the same: