
Running on Windows #3

Open
Akrelion45 opened this issue Jun 19, 2024 · 7 comments

@Akrelion45

Hey, I tried to run it on Windows, but flash-attn won't install there.

From what I found in various GitHub repos, flash-attn needs CUDA 12.0 or higher to run on Windows systems.

Maybe in the near future you can update your script to use a higher CUDA version, so we can test it on Windows too =)

@buaacyw
Owner

buaacyw commented Jun 19, 2024

I have tried it with CUDA 12.0 and higher.
Simply changing the torch installation command should work.
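
For anyone following along, a hedged sketch of what that change could look like (not necessarily the maintainer's exact command): swap the CUDA 11.8 wheel index in the install script for the stable CUDA 12.1 one, assuming PyTorch publishes matching wheels for the pinned versions:

```bash
# Hypothetical variant of the repo's install line: same pinned versions,
# but the stable cu121 wheel index instead of cu118.
pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/cu121
```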

@Tomobobo710

Tomobobo710 commented Jun 19, 2024

Hey, can you be more specific about what we would need to change? I'm not the brightest Windows user.

I managed to get an output by disabling flash attention in model_utils.py (setting use_flash_attention_2=False; see the sketch after this comment), but I hit a lot of errors. I'd like to try your suggestion of changing the torch installation script, but I don't know where to look :(.

Never mind, I got it. After installing the NVIDIA CUDA Toolkit 12.1, I changed the torch line in the initial install script.

Instead of

```bash
pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/cu118
```

I did

```bash
pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/nightly/cu121
```

Then I waited 45 minutes for it to build the wheel, and the GUI is working on Windows!
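
As an aside on the use_flash_attention_2=False workaround mentioned above: a minimal sketch of what that change typically looks like, assuming model_utils.py loads the model through Hugging Face transformers' from_pretrained (the actual file may structure this differently):

```python
# Hypothetical sketch of the model_utils.py change, assuming a standard
# Hugging Face transformers loading path; the real file may differ.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "path/to/checkpoint",          # placeholder checkpoint path
    use_flash_attention_2=False,   # was True; falls back to standard attention
)
```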

@Cubey42

Cubey42 commented Jun 21, 2024

> Never mind, I got it. After installing the NVIDIA CUDA Toolkit 12.1, I changed the torch line in the initial install script. [...]
>
> ```bash
> pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/nightly/cu121
> ```
>
> [...] Then I waited 45 minutes for it to build the wheel, and the GUI is working on Windows!

The command you linked gives:

```
ERROR: Could not find a version that satisfies the requirement torch==2.1.1 (from versions: 2.2.0.dev20231010+cu121, 2.4.0.dev20240422+cu121, [...], 2.5.0.dev20240618+cu121, 2.5.0.dev20240620+cu121)
ERROR: No matching distribution found for torch==2.1.1
```

Removing "nightly" from the index URL (i.e. using https://download.pytorch.org/whl/cu121) let torch install, but I was still unable to build flash-attn.

@adrsch

adrsch commented Jun 21, 2024

I've got it working on Windows. Instead of using the torch command the person above posted, I ran the command you get from here: https://pytorch.org/get-started/locally/

Specifically, the command is:

```bash
conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
```

flash-attn took hours and hours to install for whatever reason, but it's working now, and I just finished running my first mesh through it.
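
The hours-long install happens because pip compiles flash-attn's CUDA kernels from source when no matching wheel exists. One knob worth knowing about, from the flash-attention project's own build instructions (MAX_JOBS is their documented variable; treat the value as a starting point): it caps how many parallel ninja compile jobs run, since too many jobs can exhaust RAM and stall the build.

```bat
:: Windows cmd sketch: limit parallel compile jobs, then build from source.
:: --no-build-isolation is the install flag the flash-attention README uses.
set MAX_JOBS=4
pip install flash-attn --no-build-isolation
```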

@adrsch

adrsch commented Jun 21, 2024

Also, if you're seeing anything about 11.8, you should wipe and recreate the MeshAnything environment (the conda/virtual environment, whatever they call it in Python), since you want to use 12.1. And make sure your flash-attn is a recent version; I don't know what version the automatic install gives you, but if it's below 2.4 you should install a newer one. I'm on 2.5.9.post1. Earlier versions have issues on Windows; you can read about it on their GitHub page.
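
A quick way to check which flash-attn version you actually have, and to upgrade it if needed (a minimal sketch; pip show flash-attn gives the same information):

```bash
# Print the installed flash-attn version; if it is below 2.4, upgrade.
python -c "import flash_attn; print(flash_attn.__version__)"
pip install -U flash-attn --no-build-isolation
```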

I got it working perfectly, but I see now it's really not meant for anything except toy examples. It can't handle complex meshes like scans or anything you'd use IRL. Look at the cherry-picked examples in the pictures they've posted. Wish I had my evening back.

@Cubey42

Cubey42 commented Jun 21, 2024

> I've got it working on Windows. [...] flash-attn took hours and hours to install for whatever reason, but it's working now, and I just finished running my first mesh through it.
>
> The results are awful though, so something may be wrong.

This seems to be building for me, thanks.

@SoftologyPro

Prebuilt flash-attn wheels for Windows are here (they need CUDA 12):
https://github.com/bdashore3/flash-attention/releases
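
Installing from one of those wheels skips the source build entirely. A hedged example; the filename below is illustrative only, since the real asset name encodes your CUDA, torch, and Python versions and must be picked to match them:

```bash
# Hypothetical wheel filename; download the matching asset from the
# releases page first, then install the local file directly.
pip install flash_attn-2.5.9.post1+cu122torch2.2.2cxx11abiFALSE-cp310-cp310-win_amd64.whl
```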
