I'm trying to install flash-attention following this: https://github.com/Rappsilber-Laboratory/AlphaLink2
I also tried to install flash-attention from this repo and always get the same error. Using pip I get the subprocess error mentioned in #279.
I made sure that the NVIDIA and PyTorch versions match. I tested on an HPC cluster, on a local computer, and in the cloud with a GPU, and I get the same error every time. I tried to export the path as suggested in the link mentioned above, but it does not work.
../condaenvs/alphalink/lib/python3.10/site-packages/torch/include/ATen/cuda/CUDAContext.h:10:10: fatal error: cusolverDn.h: No such file or directory
10 | #include <cusolverDn.h>
| ^~~~~~~~~~~~~~
compilation terminated.
error: command '/apps/opt/spack/linux-ubuntu20.04-x86_64/gcc-9.3.0/gcc-9.3.0-snvoz3owsjqmblhxyznvhfd75e4uvi6h/bin/gcc' failed with exit code 1
Yeah, I don't know what's wrong, I've never run into this error. We recommend the PyTorch container from NVIDIA, which has all the required tools to install FlashAttention.
Or, if you're on torch 2.0+, FlashAttention is available as part of torch.nn.functional.scaled_dot_product_attention.
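
For reference, a minimal sketch of using that API (assuming torch 2.0+ and a CUDA GPU; the shapes and dtypes are just illustrative):

import torch
import torch.nn.functional as F

# Illustrative tensors: (batch, heads, seq_len, head_dim) in fp16 on the GPU.
q = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)

# Restrict the SDPA dispatcher to the FlashAttention backend so it's clear that
# the flash kernel is actually being used (the call raises if it can't run here).
with torch.backends.cuda.sdp_kernel(enable_flash=True, enable_math=False, enable_mem_efficient=False):
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)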
I think this is related to https://discuss.pytorch.org/t/not-able-to-include-cusolverdn-h/169122 and microsoft/DeepSpeed#2684
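
In case it helps: the fix suggested in those threads amounts to making the CUDA toolkit headers (cusolverDn.h lives in the toolkit's include directory) visible to the compiler before building. A minimal sketch, assuming the toolkit is under /usr/local/cuda (that path and the pip invocation are assumptions; adjust to your setup):

import os
import subprocess

# Prepend the CUDA toolkit include dir to CPATH so gcc can find cusolverDn.h,
# then rebuild flash-attn. /usr/local/cuda is an assumption; point CUDA_HOME at
# your actual toolkit install.
cuda_home = os.environ.get("CUDA_HOME", "/usr/local/cuda")
os.environ["CPATH"] = f"{cuda_home}/include:" + os.environ.get("CPATH", "")
subprocess.run(["pip", "install", "flash-attn", "--no-build-isolation"], check=True)

The same effect can be had by exporting CPATH in the shell before running pip.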