
Install flash-attention failed #80

Open
sdfasfsdfasfasafd opened this issue Sep 4, 2024 · 15 comments

@sdfasfsdfasfasafd

sdfasfsdfasfasafd commented Sep 4, 2024

I installed flash-attention following this link: https://rocm.blogs.amd.com/artificial-intelligence/flash-attention/README.html

My GPU is gfx1100 (7900 XTX).

I installed it in Docker, and the Docker installation follows this link: https://rocm.docs.amd.com/projects/radeon/en/latest/docs/install/wsl/install-pytorch.html
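
As a quick sanity check (standard rocminfo/torch calls, not something from the linked guides), the GPU and the ROCm PyTorch build inside the container can be verified with:

rocminfo | grep -m1 gfx
# should report gfx1100 for a 7900 XTX
python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"
# should print a +rocm torch version and True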

Here is the error info:

Failed to build flash_attn
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (flash_attn)

And if I use `python setup.py install`, here is the error info:
File "/download/flash-attention/setup.py", line 490, in
setup(
File "/usr/lib/python3/dist-packages/setuptools/init.py", line 153, in setup
return distutils.core.setup(**attrs)
File "/usr/lib/python3.10/distutils/core.py", line 148, in setup
dist.run_commands()
File "/usr/lib/python3.10/distutils/dist.py", line 966, in run_commands
self.run_command(cmd)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/setuptools/command/install.py", line 74, in run
self.do_egg_install()
File "/usr/lib/python3/dist-packages/setuptools/command/install.py", line 116, in do_egg_install
self.run_command('bdist_egg')
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/setuptools/command/bdist_egg.py", line 164, in run
cmd = self.call_command('install_lib', warn_dir=0)
File "/usr/lib/python3/dist-packages/setuptools/command/bdist_egg.py", line 150, in call_command
self.run_command(cmdname)
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/setuptools/command/install_lib.py", line 23, in run
self.build()
File "/usr/lib/python3.10/distutils/command/install_lib.py", line 109, in build
self.run_command('build_ext')
File "/usr/lib/python3.10/distutils/cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "/usr/lib/python3.10/distutils/dist.py", line 985, in run_command
cmd_obj.run()
File "/usr/lib/python3/dist-packages/setuptools/command/build_ext.py", line 79, in run
_build_ext.run(self)
File "/usr/lib/python3.10/distutils/command/build_ext.py", line 340, in run
self.build_extensions()
File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 866, in build_extensions
build_ext.build_extensions(self)
File "/usr/lib/python3.10/distutils/command/build_ext.py", line 449, in build_extensions
self._build_extensions_serial()
File "/usr/lib/python3.10/distutils/command/build_ext.py", line 474, in _build_extensions_serial
self.build_extension(ext)
File "/usr/lib/python3/dist-packages/setuptools/command/build_ext.py", line 202, in build_extension
_build_ext.build_extension(self, ext)
File "/usr/lib/python3.10/distutils/command/build_ext.py", line 529, in build_extension
objects = self.compiler.compile(sources,
File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 679, in unix_wrap_ninja_compile
_write_ninja_file_and_compile_objects(
File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 1785, in _write_ninja_file_and_compile_objects
_run_ninja_build(
File "/usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py", line 2121, in _run_ninja_build
raise RuntimeError(message) from e
RuntimeError: Error compiling objects for extension

@evshiron

evshiron commented Sep 4, 2024

Which branch are you using?

For Navi31, you might want to follow this thread:

Or use a Triton-based implementation.

@sdfasfsdfasfasafd
Author

> Which branch are you using?
>
> For Navi31, you might want to follow this thread:
>
> Or use a Triton-based implementation.

I have tried both navi_support and main.

@evshiron

evshiron commented Sep 4, 2024

I tried these steps in WSL, and it compiled successfully:

git clone https://github.com/ROCm/flash-attention
cd flash-attention
git checkout howiejay/navi_support

python3 -m venv venv
source venv/bin/activate

pip3 install cmake ninja wheel packaging
# install a working torch, a custom one for wsl is used in my case
pip3 install torch --index-url https://download.pytorch.org/whl/rocm6.1

python3 setup.py bdist_wheel
# and you get a `flash_attn*.whl` in `dist/`
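
If the build finishes, installing and smoke-testing the wheel could look roughly like this (assuming the package exposes `__version__`, which upstream flash-attention does):

pip3 install dist/flash_attn*.whl
python3 -c "import flash_attn; print(flash_attn.__version__)"
# prints the installed flash_attn version if the import succeeds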

@sdfasfsdfasfasafd
Author


I tried these steps too, but I get the same error as I described above for `python setup.py install`.

@evshiron

evshiron commented Sep 4, 2024

Could you provide a complete log?

@sdfasfsdfasfasafd
Author

> Could you provide a complete log?

install.log

Thank you.

@evshiron

evshiron commented Sep 4, 2024

Seems like a network problem?

@sdfasfsdfasfasafd
Author

> Seems like a network problem?

Thanks a lot. I have switched to my best proxy; if it is still a network problem, I am out of options. o(╥﹏╥)o

@evshiron

evshiron commented Sep 4, 2024

Submodule 'csrc/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'csrc/composable_kernel'
Cloning into '/download/flash-attention/csrc/composable_kernel'...
fatal: unable to access 'https://github.com/ROCm/composable_kernel.git/': Could not resolve host: github.com
fatal: clone of 'https://github.com/ROCm/composable_kernel.git' into submodule path '/download/flash-attention/csrc/composable_kernel' failed

You should fix this.
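
If it really is a DNS/proxy problem, fetching the submodule by hand before building usually shows the failure more directly; a rough sketch (the proxy address below is only a placeholder):

cd /download/flash-attention
git submodule sync
git submodule update --init --recursive
# if github.com is only reachable through a proxy, point git at it, e.g.:
# git config --global http.proxy http://127.0.0.1:7890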

@sdfasfsdfasfasafd
Author

> python3 setup.py bdist_wheel

Also the same error.
install1.log

@evshiron

evshiron commented Sep 4, 2024

Run these steps from scratch (new shell and new location):

* [Install flash-attention failed #80 (comment)](https://github.com/ROCm/flash-attention/issues/80#issuecomment-2328438457)

@nktice

nktice commented Sep 4, 2024

> Run these steps from scratch (new shell and new location):
>
> * [Install flash-attention failed #80 (comment)](https://github.com/ROCm/flash-attention/issues/80#issuecomment-2328438457)

It seems worth noting that this flash-attention version is 2.0.4, a version so old that it is basically useless for many programs, as it predates a lot of major changes that newer software needs.

@sancspro

sancspro commented Sep 11, 2024

Facing the same issue:
RuntimeError: Error compiling objects for extension

I think the howiejay/navi_support branch is not compatible with ROCm 6.2 (or with PyTorch 2.5.0.dev20240908+rocm6.2), which is what I am running now.

Another thing I noticed is that ROCm 6.2 needs a PyTorch nightly build (meant for ROCm 6.2); older PyTorch versions cause issues. The PyTorch I'm using:
pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm6.2/
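
A quick way to check that the installed torch actually targets ROCm 6.2 (torch.version.hip is present on ROCm builds and None on CUDA builds):

python3 -c "import torch; print(torch.__version__, torch.version.hip)"
# expect a +rocm6.2 version string and a matching HIP version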

@sdfasfsdfasfasafd
Author

sdfasfsdfasfasafd commented Sep 12, 2024

> Facing the same issue: RuntimeError: Error compiling objects for extension
>
> I think the howiejay/navi_support branch is not compatible with ROCm 6.2 (or with PyTorch 2.5.0.dev20240908+rocm6.2), which is what I am running now.
>
> Another thing I noticed is that ROCm 6.2 needs a PyTorch nightly build (meant for ROCm 6.2); older PyTorch versions cause issues. The PyTorch I'm using: pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm6.2/

Yes, I first tried ROCm 6.2 and Ubuntu 24.04 and faced the same issue. So I switched to ROCm 6.1 and Ubuntu 22.04, but I still have the problem. It has not been solved yet.

@githust66

> Which branch are you using?
>
> For Navi31, you might want to follow this thread:
>
> Or use a Triton-based implementation.

torch.__version__ = 2.5.1+rocm6.2

Following this reports the error:

Traceback (most recent call last):
File "/root/flash-attention/setup.py", line 331, in <module>
validate_and_update_archs(archs)
File "/root/flash-attention/setup.py", line 131, in validate_and_update_archs
assert all(
AssertionError: One of GPU archs of ['gfx1100'] is invalid or not supported by Flash-Attention
