
Would the 5b_lyrics model run on a RTX 3090? #150

Open
Flesco opened this issue Sep 24, 2020 · 8 comments

Comments

@Flesco
Copy link

Flesco commented Sep 24, 2020

The RTX 3090 has 24 GB of memory. That would be enough to run the 5b model, correct?
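As a back-of-envelope check on the 24 GB question, the weights of a 5-billion-parameter model alone take roughly 10 GB in fp16. This sketch is illustrative only: real usage is higher (activations, sampling caches, CUDA context), and it assumes half-precision weights.

```python
# Rough VRAM estimate for holding the model weights alone.
# Hypothetical sketch: real usage is higher (activations, sampling
# caches, CUDA context), and the checkpoint may not be stored in fp16.
def weight_memory_gb(n_params, bytes_per_param=2):
    """Memory in GiB to hold n_params parameters (default 2 bytes = fp16)."""
    return n_params * bytes_per_param / 1024**3

gb = weight_memory_gb(5e9)  # the "5b" prior has ~5 billion parameters
print(f"5B weights in fp16: {gb:.1f} GB; fits in 24 GB: {gb < 24}")
```

So the weights themselves fit comfortably; whether the full sampling pipeline fits is what the replies below address.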

@johndpope
Copy link
Contributor

I don't think it will; not even two cards will help. It seems to be in the realm of the RTX 8000 exclusively. Happy to be corrected here. #142

@AeroScripts
Copy link

It runs (inference) on my 3090 without issue; I haven't tried training, though.

Keep in mind PyTorch hasn't been updated for the new CUDA yet, so you have to compile it yourself.

@Flesco
Copy link
Author

Flesco commented Oct 2, 2020

> It runs (inference) on my 3090 without issue; I haven't tried training, though.
>
> Keep in mind PyTorch hasn't been updated for the new CUDA yet, so you have to compile it yourself.

Good to know, thanks for sharing! And please let me know if training works, if you attempt it.

@AeroScripts
Copy link

AeroScripts commented Oct 3, 2020

Also worth mentioning: my system memory (32 GB) can just barely handle it. During init it fills up completely and has to swap about 2 GB, but once the model is loaded, usage drops to about 20 GB.

@Flesco
Copy link
Author

Flesco commented Oct 7, 2020

One more question @AeroScripts - how fast is it? For example, how long does it take to generate 20 seconds of music?

@Randy-H0
Copy link

Randy-H0 commented Nov 12, 2020

> One more question @AeroScripts - how fast is it? For example, how long does it take to generate 20 seconds of music?

I can say that on the Google Colab version it takes about a minute per second of audio, not upsampled. It takes 3-5 hours to upsample 60 seconds for one file.

The RTX 3090 is still really good, but it won't be as fast as the GPUs Google is using.
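Those rates can be turned into a quick estimate for the 20-second question above. This is a sketch using the approximate Colab figures just quoted; actual timings depend on the GPU and sampling settings.

```python
# Rough wall-clock estimates from the rates quoted above:
# ~1 minute per second of audio at the top level (not upsampled),
# and 3-5 hours per 60 seconds of full upsampling.
# Illustrative only; real timings vary with hardware and settings.
def top_level_minutes(audio_seconds, min_per_audio_sec=1.0):
    """Minutes to generate the top-level (not upsampled) sample."""
    return audio_seconds * min_per_audio_sec

def upsample_hours(audio_seconds, hours_per_60s=4.0):
    """Hours to fully upsample, using the midpoint of the 3-5 h range."""
    return audio_seconds / 60 * hours_per_60s

print(top_level_minutes(20))  # about 20 minutes for 20 s, not upsampled
print(upsample_hours(20))     # roughly 1.3 hours to fully upsample 20 s
```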

@moih
Copy link

moih commented Feb 15, 2021

@Randy1435 @AeroScripts Have you managed to train a Small Prior using the RTX 3090?

I have been trying, but I get a cuDNN error (not an OOM). I'd really appreciate it if you could share a fix. Thanks!

@lucaguarro
Copy link

Any update on this?
