Environment
Problem description
When running inference with onnxruntime, the following error is raised:

return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running BatchNormalization node. Name:'BatchNormalization.2' Status Message: Invalid input scale: 0th dimension != 279
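For context on what this error means: the ONNX BatchNormalization operator always assumes an N×C×... layout, with scale/bias/mean/var of shape (C,) applied along axis 1. If the exporter does not insert transposes for an NLC-layout BatchNorm, the runtime sees the sequence length on axis 1 instead of the channel count, and the scale's 0th dimension no longer matches. A minimal NumPy sketch of the mismatch (the channel and length sizes below are made up for illustration, not taken from the model):

```python
import numpy as np

# ONNX BatchNormalization assumes layout N x C x ..., with scale/bias/mean/var
# of shape (C,), broadcast along axis 1.
def onnx_batchnorm(x, scale, bias, mean, var, eps=1e-5):
    shape = (1, -1) + (1,) * (x.ndim - 2)  # reshape (C,) -> (1, C, 1, ...)
    return (x - mean.reshape(shape)) / np.sqrt(var.reshape(shape) + eps) \
        * scale.reshape(shape) + bias.reshape(shape)

C, L = 64, 100                       # hypothetical channel / sequence sizes
x_ncl = np.random.randn(2, C, L)     # NCL: channels on axis 1 -- OK
x_nlc = np.random.randn(2, L, C)     # NLC: sequence length on axis 1
p = (np.ones(C), np.zeros(C), np.zeros(C), np.ones(C))

y = onnx_batchnorm(x_ncl, *p)        # works: scale size C matches axis 1
try:
    onnx_batchnorm(x_nlc, *p)        # fails: axis 1 is L, scale size is C
except ValueError as e:
    print("shape mismatch, analogous to the INVALID_ARGUMENT error:", e)
```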
More information:
Command used to export the model:
paddle2onnx --model_dir models/inference/ \
            --model_filename model.pdmodel \
            --params_filename model.pdiparams \
            --save_file models/inference/model.onnx \
            --opset_version 16
Inference with PaddleInference works correctly.
Three BatchNorm layers are used:

self.bn = nn.BatchNorm2D(num_channels_out, data_format='NCHW')
...
self.bn = nn.BatchNorm1D(rnn_layer_size * 2, data_format='NLC')
...
self.bn = nn.BatchNorm1D(hidden_size, data_format='NLC')
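As a possible workaround while the export issue is investigated (this is an assumption, not a confirmed fix): keep the BatchNorm1D layers in the default NCL layout and transpose around them, so the exported graph contains only plain N×C×L BatchNormalization nodes. The NumPy sketch below shows that this is numerically equivalent to normalizing the NLC tensor over its last (channel) axis:

```python
import numpy as np

def bn_ncl(x, scale, bias, mean, var, eps=1e-5):
    # Per-channel normalization along axis 1, the layout ONNX expects.
    s = (1, -1, 1)
    return (x - mean.reshape(s)) / np.sqrt(var.reshape(s) + eps) \
        * scale.reshape(s) + bias.reshape(s)

C, L = 8, 5  # hypothetical sizes for illustration
x_nlc = np.random.randn(2, L, C)
scale, bias = np.random.randn(C), np.random.randn(C)
mean, var = np.zeros(C), np.ones(C)

# NLC input: transpose to NCL, normalize, transpose back.
y = bn_ncl(x_nlc.transpose(0, 2, 1), scale, bias, mean, var).transpose(0, 2, 1)

# Equivalent to normalizing directly over the last (channel) axis:
ref = (x_nlc - mean) / np.sqrt(var + 1e-5) * scale + bias
assert np.allclose(y, ref)
```

In the Paddle model this would mean replacing `nn.BatchNorm1D(..., data_format='NLC')` with a default-layout `nn.BatchNorm1D` wrapped in `paddle.transpose` calls.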
Could you share the model?
@Zheng-Bicheng Sure, the netdisk archive contains both the PaddleInference model and the exported ONNX model.
Model link: Baidu Netdisk