Is there support for running the model on multiple GPUs? #55

Closed
dmakwana opened this issue Jul 25, 2024 · 6 comments

Comments

dmakwana commented Jul 25, 2024

As described here: https://huggingface.co/docs/diffusers/en/training/distributed_inference#pytorch-distributed

Has it been tested? I'm wondering what the best way to do this is. Any suggestions / pointers would be greatly appreciated.

cc: @wrose100

gabriel-piles (Member) commented Jul 25, 2024

Hello, thanks for your interest in the project.

This model runs on a single GPU with less than 8 GB of memory, making it suitable for most standard use cases. If you're looking to process a high volume of PDFs, we recommend exploring parallel processing with multiple Docker containers.
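For illustration only, a minimal sketch of that parallel setup, assuming the project's published Docker image and default service port; the image name, host ports, and GPU indices below are placeholders to adapt to your environment:

# Hypothetical example: two service containers running in parallel,
# each pinned to its own GPU and exposed on its own host port.
# Image name, ports, and GPU indices are assumptions, not project defaults.
docker run -d --gpus device=0 -p 5060:5060 huridocs/pdf-document-layout-analysis
docker run -d --gpus device=1 -p 5061:5060 huridocs/pdf-document-layout-analysis

A simple client-side round robin (or any load balancer) over the exposed ports can then spread the PDF workload across the containers.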

To provide the most effective solution, please share more details about your specific use case, including the expected PDF volume, processing requirements, and desired throughput.

We're happy to assist you in finding the optimal configuration for your needs.

rogoit commented Aug 6, 2024

Maybe this is related; on an Ubuntu Linux system I get:

Error response from daemon: could not select device driver "nvidia" with capabilities: [[gpu]]

gabriel-piles (Member) commented

@rogoit

Thank you for your question!

It's possible that the issue is related to the NVIDIA Container Toolkit. If you haven't already, you can install it by following the official NVIDIA Container Toolkit installation guide. Alternatively, if you already have the toolkit installed, reinstalling it might resolve any potential conflicts.
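For reference, on an Ubuntu host the apt-based installation from that guide boils down to roughly the following. This assumes the NVIDIA apt repository has already been added as described in the guide, and the CUDA image tag in the final check is just an example:

# Install the toolkit (assumes the NVIDIA apt repository is already configured)
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit

# Register the NVIDIA runtime with Docker and restart the daemon
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Quick check: list the GPUs visible from inside a container
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi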

Let me know if you try either of these suggestions and still encounter problems.

rogoit commented Sep 6, 2024

Hi @gabriel-piles,

Thanks for the response. The thing is that we will not have NVIDIA drivers or any kind of graphical interface in our IT infrastructure for this project, nor in our local environments. So why do you need this?

See you
Roland

gabriel-piles (Member) commented

Hi @rogoit,

In that case, start the docker container with:

make start_no_gpu
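Once the container is up, a quick smoke test could look like the following; the port and form field here are assumptions used as placeholders, so check the project README for the actual endpoint:

# Hypothetical check: send a PDF to the running service
curl -X POST -F 'file=@/path/to/document.pdf' localhost:5060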

Please let us know if you have any further questions.

rogoit commented Sep 6, 2024

OK, I will check on the weekend, I hope. Thanks for your passion.
