Would the 5b_lyrics model run on a RTX 3090? #150
Comments
I don't think it will - not even two cards will help. It seems to be in the realm of the RTX 8000 exclusively. Happy to be corrected here. See #142.
It runs (inference) on my 3090 without issue; I haven't tried training though. Keep in mind PyTorch hasn't been updated for the new CUDA yet, so you have to compile it yourself.
Good to know, thanks for sharing! And please let me know if training works, if you attempt it.
Also worth mentioning: my system memory (32 GB) can just barely handle it. During init it fills up fully and has to swap about 2 GB, but once the model is loaded that drops to about 20 GB of usage.
One more question @AeroScripts: how fast is it? For example, how long does it take to generate 20 seconds of music?
I can say that on the Google Colab version it takes about a minute per second of non-upsampled audio. The RTX 3090 is still really good, but it won't be as fast as the GPUs Google is using.
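A quick back-of-the-envelope sketch of wall-clock time at the Colab rate quoted above (the ~1 minute per second figure is anecdotal, and the rate on a 3090 will differ):

```python
# Rough generation-time estimate, assuming the anecdotal Colab rate of
# ~1 minute of compute per second of non-upsampled audio.
def generation_minutes(audio_seconds, minutes_per_audio_second=1.0):
    """Estimated wall-clock minutes to generate `audio_seconds` of audio."""
    return audio_seconds * minutes_per_audio_second

# 20 seconds of audio at the quoted rate:
print(generation_minutes(20))  # -> 20.0 minutes
```

At that rate, the 20-second clip asked about above would take on the order of 20 minutes before any upsampling.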
@Randy1435 @AeroScripts Have you managed to train a Small Prior on the RTX 3090? I have been trying, but I get a cuDNN error (not an OOM). I'd really appreciate it if you could pass on a fix. Thanks!
Any update on this?
The RTX 3090 has 24 GB of memory. Would that be enough to run the 5b model?
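A rough way to reason about the question: estimate the weight footprint from the parameter count. The numbers below are assumptions for illustration (~5 billion parameters for the 5b model, a 1.5x overhead factor for activations and buffers), not measurements:

```python
# Back-of-the-envelope VRAM estimate for inference. Assumes the weight
# footprint dominates, with a multiplicative overhead for activations,
# buffers, etc. (the 1.5x factor is a guess, not a measured value).
def estimated_vram_gb(n_params, bytes_per_param=2, overhead_factor=1.5):
    """Rough GB of VRAM: weights * overhead, converted to GiB."""
    weight_bytes = n_params * bytes_per_param
    return weight_bytes * overhead_factor / 1024**3

N_PARAMS = 5_000_000_000  # assumed ~5B parameters for the 5b model

fp16_gb = estimated_vram_gb(N_PARAMS, bytes_per_param=2)  # half precision
fp32_gb = estimated_vram_gb(N_PARAMS, bytes_per_param=4)  # single precision

print(f"fp16: ~{fp16_gb:.1f} GB, fp32: ~{fp32_gb:.1f} GB (3090 has 24 GB)")
```

Under these assumptions, fp16 weights land around 14 GB and fit in 24 GB, while fp32 would exceed it, which is consistent with the report above that inference works on a single 3090.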