When I load my model in Python, I don't get any error, and Xcode is able to detect the model metadata. However, when I load the model with Core ML, I get this error:
StatefulPartitionedCall_StatefulPartitionedCall_StatefulPartitionedCall_freq_analyser_StatefulPartitionedCall_multi_conv_1_conv2d_multiconv_same_1_Conv2D_1_cast_fp16_to_fp32 is neither a function input nor is produced by any OperationBuilder in this block..
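For context, the Python-side check that succeeds is roughly the following (a minimal sketch only; the package name and use of coremltools are assumptions, not taken from the report):

import coremltools as ct

# Placeholder file name for the converted package; loading it here raises no error.
mlmodel = ct.models.MLModel("FreqAnalyser.mlpackage")

# The model description (inputs, outputs, metadata) is readable,
# which matches what Xcode shows for the same package.
spec = mlmodel.get_spec()
print(spec.description)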
I tried converting the model with both TensorFlow v2.12 and v2.18, with the same result.
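For completeness, the conversion was presumably done with the coremltools unified converter along these lines (a sketch only; the SavedModel path and output name are placeholders, and the conversion options are assumptions):

import coremltools as ct

# Placeholder path to the TensorFlow SavedModel (tried with TF 2.12 and 2.18).
# The unified converter accepts the SavedModel directory directly.
mlmodel = ct.convert(
    "freq_analyser_savedmodel",
    source="tensorflow",
    convert_to="mlprogram",
)

# compute_precision defaults to FLOAT16 for ML Programs, which is presumably
# where the *_cast_fp16_to_fp32 op named in the error above comes from.
mlmodel.save("FreqAnalyser.mlpackage")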
plumenator added the question label on Feb 6, 2025