
ImportError: libtorch_cuda_cpp.so: cannot open shared object file: No such file or directory #1

Open
jxxtin opened this issue Jun 18, 2024 · 3 comments



jxxtin commented Jun 18, 2024

Thank you for the great work. Could you please check the following error that occurred while executing app.py?

Traceback (most recent call last):
  File "/root/projects/MeshAnything/app.py", line 8, in <module>
    from main import get_args, load_model
  File "/root/projects/MeshAnything/main.py", line 6, in <module>
    from MeshAnything.models.meshanything import MeshAnything
  File "/root/projects/MeshAnything/MeshAnything/models/meshanything.py", line 5, in <module>
    from MeshAnything.models.shape_opt import ShapeOPTConfig
  File "/root/projects/MeshAnything/MeshAnything/models/shape_opt.py", line 2, in <module>
    from transformers.models.opt.modeling_opt import OPTForCausalLM, OPTModel, OPTDecoder, OPTLearnedPositionalEmbedding, OPTDecoderLayer
  File "/root/anaconda3/envs/meshany/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py", line 46, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/root/anaconda3/envs/meshany/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/root/anaconda3/envs/meshany/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: libtorch_cuda_cpp.so: cannot open shared object file: No such file or directory
buaacyw (Owner) commented Jun 18, 2024

Hi! Thanks for your interest.
It looks like you installed a mismatched torch or CUDA version.
Could you please show me your cuda version with nvcc -V?

jxxtin (Author) commented Jun 18, 2024

Thanks for the quick reply, @buaacyw.

My nvcc version is 11.7 and PyTorch is 2.0.1+cu117, as follows:

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Wed_Jun__8_16:49:14_PDT_2022
Cuda compilation tools, release 11.7, V11.7.99
Build cuda_11.7.r11.7/compiler.31442593_0

>>> import torch
>>> torch.__version__
'2.0.1+cu117'
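As a side note, the "+cu117" tag in the torch version string should correspond to the release reported by nvcc. A minimal stdlib sketch of that comparison (the function name is illustrative, and this is pure string handling, not an official torch API):

```python
# Sketch: compare the CUDA tag baked into a torch version string
# (e.g. "2.0.1+cu117") against the toolkit release reported by nvcc
# ("release 11.7"). No torch install required.
import re

def cuda_tag_matches(torch_version: str, nvcc_output: str) -> bool:
    """Return True if torch's +cuXYZ suffix matches nvcc's X.Y release."""
    m = re.search(r"\+cu(\d+)", torch_version)          # e.g. "117"
    n = re.search(r"release (\d+)\.(\d+)", nvcc_output)  # e.g. "11", "7"
    if not (m and n):
        return False
    return m.group(1) == n.group(1) + n.group(2)

print(cuda_tag_matches("2.0.1+cu117",
                       "Cuda compilation tools, release 11.7, V11.7.99"))
# -> True for the versions reported above
```

For the versions in this thread the tags agree, which is why the error points at something other than a plain torch/toolkit mismatch.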

buaacyw (Owner) commented Jun 18, 2024

Hi! I haven't tried flash-attn on CUDA 11.7, but I believe it should work.
I suggest you ask about this in the flash-attn repo.
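For anyone else hitting this error: in torch builds that split the CUDA libraries, libtorch_cuda_cpp.so ships inside torch's own lib/ directory, so one sanity check is whether that file actually exists in your install. A hedged stdlib sketch of that kind of lookup (the throwaway directory below stands in for your real torch install path; substitute your own):

```python
# Sketch: scan a directory tree for the shared object the linker
# reported missing. Demonstrated against a temporary directory so the
# snippet is self-contained; point `root` at torch's lib/ dir instead.
import tempfile
from pathlib import Path

def find_shared_object(root: str, name: str) -> list[str]:
    """Return paths under `root` whose filename starts with `name`."""
    return [str(p) for p in Path(root).rglob(f"{name}*.so*")]

# Self-contained demo: create a dummy library file, then find it.
tmp = tempfile.mkdtemp()
Path(tmp, "lib").mkdir()
Path(tmp, "lib", "libtorch_cuda_cpp.so").touch()
hits = find_shared_object(tmp, "libtorch_cuda_cpp")
print(len(hits))  # 1
```

If the file is missing from your torch install, the torch build itself is the likelier culprit than flash-attn; if it is present, the extension was probably compiled against a different torch build than the one installed.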
