Reducing the Size of PyTorch-Based Executable #8552
Replies: 1 comment
Most of that size is binary extensions and the CUDA/cuDNN shared libs, which cannot be split, so why would you expect the app to be drastically smaller than that? Using a CUDA-enabled torch build and keeping the executable at 200-300 MB are mutually exclusive goals. Even a onefile build with pip-installed torch2.3+cu118 seems to come out at around 2.6 GB (and would be awful to unpack on each run, either way). So my guess would be either they are using CPU-only torch builds, or …
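A quick way to confirm which flavour of torch is actually being bundled is to check the installed package itself. A minimal sketch, assuming a standard pip/conda layout (`dir_size_gb` is just a local helper defined here, not a torch API):

```python
# Minimal sketch: report whether the installed torch wheel is a CUDA build
# and how much disk it occupies (this is where most of the bundled size comes from).
import os
import torch

def dir_size_gb(path):
    """Total size of a directory tree in GiB (local helper, not part of torch)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            if os.path.isfile(full):
                total += os.path.getsize(full)
    return total / 1024 ** 3

torch_dir = os.path.dirname(torch.__file__)
print("torch version:", torch.__version__)      # e.g. '2.3.0+cu118' vs '2.3.0+cpu'
print("CUDA in build:", torch.version.cuda)     # None for CPU-only wheels
print(f"on-disk size:  {dir_size_gb(torch_dir):.1f} GiB")
```

If a CPU-only wheel is enough for the deployed app, installing it from PyTorch's CPU wheel index (e.g. `pip install torch --index-url https://download.pytorch.org/whl/cpu`) brings the package down to a few hundred MB, which is the only realistic way to land near the 200-300 MB target.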
Hi,
I'm working on converting a Python program that uses PyTorch to an executable file. However, the folder for the app is too large. I've noticed that the original torch2.3+cu118 in conda is around 5 GB, and the torch library in the app is about 4 GB.
I'm wondering how to use torch while keeping the executable file small. Specifically, how do others create standalone binary executables that are only 200-300 MB in size?
Any advice or suggestions would be greatly appreciated. Thank you!