I'm trying to use the quantized models from the kokoro ONNX repo for inference in the SwiftUI example. My process is as follows:
1. I take the quantized model and add metadata using the `sherpa-onnx/scripts/kokoro/add-meta-data.py` script.
2. I place the modified model into the SwiftUI repo.
3. I attempt to run inference, which fails with the following error (the opset stamping it complains about can be inspected as shown below):

```
libc++abi: terminating due to uncaught exception of type Ort::Exception: Failed to load model with error: /Users/runner/work/onnxruntime-libs/onnxruntime-libs/onnxruntime/core/graph/model_load_utils.h:56 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::string, int> &, const logging::Logger &, bool, const std::string &, int) ONNX Runtime only *guarantees* support for models stamped with official released onnx opset versions. Opset 5 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx.ml is till opset 4.
```
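For context, the domain/version pair the loader rejects is declared in the model's `opset_import` list, which can be read with the `onnx` Python package without running inference. A minimal inspection sketch (the file name `kokoro-quant.onnx` is just a placeholder for the modified model):

```python
import onnx

# Load only the model proto; no runtime session is created.
model = onnx.load("kokoro-quant.onnx")  # placeholder path

# Each entry declares a domain and the opset version the model is stamped with.
# An empty domain string means the default "ai.onnx" domain.
for opset in model.opset_import:
    print(f"domain={opset.domain or 'ai.onnx'}, version={opset.version}")
```

If this prints `domain=ai.onnx.ml, version=5`, it matches the error above.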
I'm not too familiar with ONNX, so I'm unsure how to address this error. Is there a way to modify or convert the quantized model to make it compatible with ONNX Runtime? Or can the version of ONNX Runtime used by the SwiftUI example be changed?
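One workaround that sometimes applies: if the model does not actually use any operators from the `ai.onnx.ml` domain (quantization tooling occasionally stamps the domain without emitting ops from it), the declared version can be lowered in place. This is a sketch under that assumption, not a general conversion; `onnx.checker` should reject the result if the graph really depends on the newer opset:

```python
import onnx

model = onnx.load("kokoro-quant.onnx")  # placeholder path

# Lower the declared ai.onnx.ml opset to one the runtime supports (4, per the error).
for opset in model.opset_import:
    if opset.domain == "ai.onnx.ml":
        opset.version = 4

# Validate and save; this fails if the graph uses ops unavailable at the lower opset.
onnx.checker.check_model(model)
onnx.save(model, "kokoro-quant-patched.onnx")
```

Alternatively, a newer ONNX Runtime release that supports `ai.onnx.ml` opset 5 would avoid the error entirely, if the prebuilt libraries shipped with the SwiftUI example can be updated.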
Also, I am able to run inference using the quantized model in the original kokoro ONNX repo.
Any guidance on how to resolve this issue would be greatly appreciated!