Hiya Xiaotongnii,
Yes, if you want to do multithread inference you can set it in the NeonBackendModelContext by changing the number of threads from the default 0 to whatever you want it to use. This will be passed down into the Neon backend and used wherever possible to multithread the ops.
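For reference, here is a rough sketch of how that thread count can be set from user code. This is a hedged example, not an official snippet: it assumes the common Arm NN pattern of passing a `"NumberOfThreads"` backend option for the `CpuAcc` (Neon) backend through `OptimizerOptions`, which is what populates `NeonBackendModelContext`. The value `4` is just an illustration; it requires the Arm NN headers and library to compile.

```cpp
#include <armnn/ArmNN.hpp>
#include <armnn/BackendOptions.hpp>

// Assumes `network` (armnn::INetworkPtr) and `runtime` (armnn::IRuntimePtr)
// already exist. Sketch only; API names may differ between Arm NN releases.
armnn::OptimizerOptions optimizerOptions;

// "NumberOfThreads" is forwarded to NeonBackendModelContext;
// 0 (the default) lets the backend decide.
armnn::BackendOptions cpuAccOptions("CpuAcc",
    {
        { "NumberOfThreads", static_cast<unsigned int>(4) }
    });
optimizerOptions.m_ModelOptions.push_back(cpuAccOptions);

// Optimize for the Neon backend with the model options attached.
armnn::IOptimizedNetworkPtr optNet = armnn::Optimize(
    *network,
    { armnn::Compute::CpuAcc },
    runtime->GetDeviceSpec(),
    optimizerOptions);
```

After loading the optimized network into the runtime, the Neon backend will use the configured thread count wherever the underlying operators support intra-op parallelism.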
How can NeonBackend support multi-threaded inference? Does it support multi-thread parallelism inside Neon operators, like onnxruntime's IntraOpNumThreads? Or is there another mechanism?