I am trying to check whether my .onnx model is correct, and I need to run inference on it to verify the output.

I know we can validate an .mlmodel using coremltools in Python - basically load the model and an input, and get the prediction. I am trying to do the same thing for the .onnx model.
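For reference, the coremltools flow I mean looks roughly like this (the input name 'input' and the shape are just placeholders for whatever the .mlmodel actually declares):

import coremltools
import numpy as np

# Load the Core ML model and run a single prediction.
# 'input' is a placeholder - use the input name and shape the .mlmodel declares.
mlmodel = coremltools.models.MLModel('model.mlmodel')
prediction = mlmodel.predict({'input': np.zeros((1, 3, 224, 224), dtype=np.float32)})
print(prediction)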

I found the MXNet framework but I can't seem to understand how to import the model - I just have the .onnx file and MXNet requires some extra input besides the onnx model.

Is there any other simple way to do this in Python? I am guessing this is a common problem, but I can't seem to find any relevant libraries/frameworks that make this as easy as coremltools does for .mlmodel.

I do not wish to convert the .onnx model to another format (say, PyTorch), because I want to check the .onnx model as is, without worrying about whether the conversion was correct. I just need a way to load the model and an input, run inference, and print the output.

This is my first time encountering these formats, so any help or insight would be appreciated.

Thanks!

1 Answer

I figured out a way to do this using Caffe2 - just posting in case someone in the future tries to do the same thing.

The main code snippet is:

import onnx
import caffe2.python.onnx.backend

import numpy as np

# Make an input NumPy array with the dimensions and dtype the model expects.
# The shape below is only an example - replace it with your model's input shape.
inputArray = np.random.randn(1, 3, 224, 224)

# onnx.load returns the parsed ModelProto (the whole model, not a file handle)
modelFile = onnx.load('model.onnx')

# Run inference through the Caffe2 ONNX backend
output = caffe2.python.onnx.backend.run_model(modelFile, inputArray.astype(np.float32))

Also, it is important to note that the input to run_model can only be a NumPy array or a string. The output will be an object of the Backend.Outputs type; I was able to extract the output NumPy array from it.
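For example, the output array can be pulled out by position, since Backend.Outputs behaves like a namedtuple of the graph's outputs (accessing it by the model's output name should also work):

# Outputs is a namedtuple keyed by the graph's output names,
# so the result can be indexed positionally
outputArray = output[0]
print(outputArray.shape)
print(outputArray)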

I was able to run inference on the CPU, and hence did not need the Caffe2 build with GPU support (which requires CUDA and cuDNN).
