Install flash-attention failed #80
Comments
Which branch are you using? For Navi31, you might want to follow this thread, or use a Triton-based implementation.
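For reference, while the native extension refuses to build, PyTorch's built-in torch.nn.functional.scaled_dot_product_attention can stand in for many flash-attention use cases; it is not the Triton kernel mentioned above, just a build-free fallback. A minimal sketch, with arbitrary shapes:

import torch
import torch.nn.functional as F

# (batch, nheads, seqlen, headdim) layout, fp16 on the GPU
q = torch.randn(1, 8, 128, 64, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)
# dispatches to a fused attention kernel where one is available
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([1, 8, 128, 64])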
I have tried both navi_support and main.
I tried these steps in WSL, and it compiled successfully: git clone https://github.com/ROCm/flash-attention
cd flash-attention
git checkout howiejay/navi_support
python3 -m venv venv
source venv/bin/activate
pip3 install cmake ninja wheel packaging
# install a working torch; a custom build for WSL is used in my case
pip3 install torch --index-url https://download.pytorch.org/whl/rocm6.1
python3 setup.py bdist_wheel
# and you get a `flash_attn*.whl` in `dist/`
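After installing the wheel, a quick import test confirms the build works; a minimal sketch with arbitrary shapes (flash-attention requires fp16/bf16 tensors on the GPU):

pip3 install dist/flash_attn*.whl

Then, in Python:

import torch
from flash_attn import flash_attn_func

# (batch, seqlen, nheads, headdim), half precision on the GPU
q = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # expect torch.Size([1, 128, 8, 64])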
I tried that too, but the error is the same as the `python setup.py install` one I described at the top.
Could you provide a complete log?
Thank you.
Seems like a network problem?
Thanks a lot. I have already switched to the best proxy I have; if it is still a network problem, there is nothing more I can do. o(╥﹏╥)o
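If the failure is indeed network-related (e.g. fetching sources or submodules during the build), routing both git and pip through the proxy sometimes helps; a hedged sketch, where the proxy address is a placeholder:

export HTTPS_PROXY=http://127.0.0.1:7890   # placeholder; use your proxy's address
export HTTP_PROXY=$HTTPS_PROXY
git clone --recurse-submodules https://github.com/ROCm/flash-attention   # pull submodules up front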
You should fix this.
I still get the same error.
Run these steps from scratch (in a new shell and a new location):
It seems worth noting that the flash-attention version is 2.0.4.
Facing the same issue. I think the howiejay/navi_support branch is not compatible with ROCm 6.2 (or with the PyTorch version 2.5.0.dev20240908+rocm6.2, which is what I run now). Another thing I noticed: ROCm 6.2 needs a PyTorch nightly built for ROCm 6.2; older PyTorch versions cause issues. The PyTorch I'm using: 2.5.0.dev20240908+rocm6.2.
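For reference, a matching nightly can be installed from PyTorch's ROCm 6.2 nightly index (the index URL follows PyTorch's published pattern; adjust it to your ROCm version):

pip3 install --pre torch --index-url https://download.pytorch.org/whl/nightly/rocm6.2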
Yes, I first tried ROCm 6.2 and Ubuntu 24.04 and hit the same issue, so I switched to ROCm 6.1 and Ubuntu 22.04, but the problem persists. Until now it still has not been solved.
I installed flash-attention following this link: https://rocm.blogs.amd.com/artificial-intelligence/flash-attention/README.html
My GPU is gfx1100 (7900 XTX).
I installed it in Docker; the container setup follows this link: https://rocm.docs.amd.com/projects/radeon/en/latest/docs/install/wsl/install-pytorch.html
Here is the error info:
Failed to build flash_attn
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (flash_attn)
And if I use `python setup.py install`, here is the error info:
File "/download/flash-attention/setup.py", line 490, in
setup(
File "/usr/lib/python3/dist-packages/setuptools/init.py", line 153, in setup
return distutils.core.setup(**attrs)
File "/usr/lib/python3.10/distutils/core.py", line 148, in setup
dist.run_commands()
File "/usr/lib/python3.10/distutils/dist.py", line 966, in run_commands
self.run_command(cmd)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/setuptools/command/install.py", line 74, in run
self.do_egg_install()
File "/usr/lib/python3/dist-packages/setuptools/command/install.py", line 116, in do_egg_install
self.run_command('bdist_egg')
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/setuptools/command/bdist_egg.py", line 164, in run
cmd = self.call_command('install_lib', warn_dir=0)
File "/usr/lib/python3/dist-packages/setuptools/command/bdist_egg.py", line 150, in call_command
self.run_command(cmdname)
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/setuptools/command/install_lib.py", line 23, in run
self.build()
File "/usr/lib/python3.10/distutils/command/install_lib.py", line 109, in build
self.run_command('build_ext')
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/setuptools/command/build_ext.py", line 79, in run
_build_ext.run(self)
File "/usr/lib/python3.10/distutils/command/build_ext.py", line 340, in run
self.build_extensions()
File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 866, in build_extensions
build_ext.build_extensions(self)
File "/usr/lib/python3.10/distutils/command/build_ext.py", line 449, in build_extensions
self._build_extensions_serial()
File "/usr/lib/python3.10/distutils/command/build_ext.py", line 474, in _build_extensions_serial
self.build_extension(ext)
File "/usr/lib/python3/dist-packages/setuptools/command/build_ext.py", line 202, in build_extension
_build_ext.build_extension(self, ext)
File "/usr/lib/python3.10/distutils/command/build_ext.py", line 529, in build_extension
objects = self.compiler.compile(sources,
File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 679, in unix_wrap_ninja_compile
_write_ninja_file_and_compile_objects(
File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 1785, in _write_ninja_file_and_compile_objects
_run_ninja_build(
File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 2121, in _run_ninja_build
raise RuntimeError(message) from e
RuntimeError: Error compiling objects for extension
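For what it's worth, "RuntimeError: Error compiling objects for extension" is only the final wrapper; the actual compiler failure is printed earlier in the log, so scroll up to the first "error:" line. Two knobs that often help on a single consumer GPU; a hedged sketch (GPU_ARCHS is an assumption about this fork's setup.py, while MAX_JOBS and PYTORCH_ROCM_ARCH are read by PyTorch's extension builder):

export PYTORCH_ROCM_ARCH=gfx1100    # build only for the 7900 XTX's architecture
export GPU_ARCHS=gfx1100            # assumption: the ROCm fork's setup.py reads this
export MAX_JOBS=4                   # limit ninja parallelism to avoid memory exhaustion
python3 setup.py bdist_wheel 2>&1 | tee build.log   # keep the full log to find the first real error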