Is there any way to train with multiple GPUs? The current scripts seem to use only the first GPU, although TF 'sees' all of them.
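For reference, a quick way to check what TF actually sees (the standard `device_lib` call, nothing specific to these scripts):

```python
# List the devices TensorFlow can see (TF 1.x).
from tensorflow.python.client import device_lib

print([d.name for d in device_lib.list_local_devices()])
# e.g. ['/cpu:0', '/gpu:0', '/gpu:1'] -- every GPU shows up, but without an
# explicit tf.device() placement, GPU-capable ops default to /gpu:0.
```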
Unfortunately, I haven't yet played with the multi-tower training stuff, as I only have a humble little 970 at the moment. Pull requests are welcome!
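For anyone who wants to take a stab at it, something along the lines of the classic TF multi-tower pattern should work: build one copy of the graph per GPU, share the variables, and average the gradients on the host. A rough, untested sketch (TF 1.x style; the toy `tower_loss` here just stands in for the real model graph):

```python
import tensorflow as tf

NUM_GPUS = 2
BATCH_PER_GPU = 32

def average_gradients(tower_grads):
    """Average the (grad, var) pairs computed on each tower."""
    averaged = []
    for grad_and_vars in zip(*tower_grads):
        grads = tf.stack([g for g, _ in grad_and_vars], axis=0)
        averaged.append((tf.reduce_mean(grads, axis=0), grad_and_vars[0][1]))
    return averaged

def tower_loss(x, y):
    """Toy linear model standing in for the real model graph."""
    w = tf.get_variable('w', [10, 1], initializer=tf.zeros_initializer())
    b = tf.get_variable('b', [1], initializer=tf.zeros_initializer())
    return tf.reduce_mean(tf.square(tf.matmul(x, w) + b - y))

# One input shard per GPU; a real input pipeline would split the batch.
x = tf.placeholder(tf.float32, [NUM_GPUS, BATCH_PER_GPU, 10])
y = tf.placeholder(tf.float32, [NUM_GPUS, BATCH_PER_GPU, 1])

optimizer = tf.train.GradientDescentOptimizer(0.01)
tower_grads = []
with tf.variable_scope(tf.get_variable_scope()):
    for i in range(NUM_GPUS):
        with tf.device('/gpu:%d' % i), tf.name_scope('tower_%d' % i):
            loss = tower_loss(x[i], y[i])
            # Share weights across towers: the first tower creates them,
            # the rest reuse them.
            tf.get_variable_scope().reuse_variables()
            tower_grads.append(optimizer.compute_gradients(loss))

# Average gradients on the host and apply a single update.
train_op = optimizer.apply_gradients(average_gradients(tower_grads))

with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
    sess.run(tf.global_variables_initializer())
    # feed x/y shards and run train_op in the usual training loop
```

The variable sharing is what keeps the towers from each training their own copy of the weights; each tower just gets its own shard of the batch and contributes gradients to a single update.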
OK, thanks. I'm quite new to TF, though; I'll take a look.
I'll leave this issue open in case someone has implemented it in one of their forks and sees this.