Convert settings for preprocess and forward on ONNX #1829
Unanswered
nagaokatakuya asked this question in Q&A
I would like to know the conversion settings for completing inference with ONNX when converting classification models (e.g. ResNet-18) to ONNX format using MMDeploy.
My understanding of MMDeploy's design is that the SDK is responsible for preprocess and postprocess, while models converted to various formats such as ONNX and OpenVINO handle only the forward pass:
https://mmdeploy.readthedocs.io/en/latest/get_started.html#inference-sdk
On the other hand, what I want to achieve is to incorporate preprocess into the model as well.
That is, the specification of the inference script is as below.
To achieve this, I would like to handle steps 3 and 4 within the ONNX model.
Regarding postprocess, we plan to incorporate it into a web API that we will implement ourselves, so I would like to include preprocess and forward in the ONNX model.
Reading the mmengine preprocess specification, I think this is possible because it is written in torch. Thank you.
ref: https://mmclassification.readthedocs.io/en/dev-1.x/api/data_process.html#data-preprocessors
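One common way to bake the preprocess step into the exported graph, independent of MMDeploy's own converter settings, is to wrap the classifier in a small `nn.Module` that applies the normalization before calling the backbone, and export the wrapper with `torch.onnx.export`. The sketch below assumes ImageNet mean/std values and a float NCHW input in [0, 1]; `PreprocessWrapper` is a hypothetical name, not an MMDeploy API.

```python
import torch


class PreprocessWrapper(torch.nn.Module):
    """Wrap a model so mean/std normalization is part of the ONNX graph.

    Assumption: the SDK's preprocess step reduces to per-channel
    normalization (resize/crop would still happen outside the model).
    """

    def __init__(self, model):
        super().__init__()
        self.model = model
        # ImageNet normalization constants (assumed preprocess settings),
        # registered as buffers so they are exported with the graph.
        self.register_buffer(
            "mean", torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1))
        self.register_buffer(
            "std", torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1))

    def forward(self, x):
        # x: float tensor in [0, 1], shape (N, 3, H, W)
        x = (x - self.mean) / self.std
        return self.model(x)


# Export sketch (model and filename are placeholders):
# wrapped = PreprocessWrapper(resnet18_model).eval()
# dummy = torch.rand(1, 3, 224, 224)
# torch.onnx.export(wrapped, dummy, "resnet18_with_preprocess.onnx",
#                   input_names=["input"], output_names=["logits"])
```

The resulting ONNX model then accepts raw [0, 1] pixel tensors directly, so the web API only needs to handle decoding/resizing and the postprocess.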