ONNX Runtime-based inference on i.MX 8M Plus (no execution provider other than cpu?)

cadietrich78
Contributor I

Dear NXP community,

 

I’m trying ONNX Runtime for NPU inference following the instructions in the “i.MX Machine Learning User's Guide” (Chapter 6), but it seems that the only execution provider available is “cpu”. The hardware is the i.MX 8M Plus EVK running Yocto lf-6.6.52-2.2.1, and the instructions are:

 

(dataset)

$ wget https://github.com/onnx/models/raw/refs/heads/main/validated/vision/classification/mobilenet/model/m...

$ tar -xzvf mobilenetv2-7.tar.gz

(call)

$ /usr/bin/onnxruntime-1.17.1/onnx_test_runner -j 1 -c 1 -r 1 -e cpu ./mobilenetv2-7/

 

I’ve tried acl, armnn, vsi_npu, nnapi, dnnl, rocm, migraphx, xnnpack, qnn, snpe, and coreml as execution providers (all the options listed by onnx_test_runner, just to be safe), but the only one that seems to work is “cpu”. Is that the case?
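
To double-check which providers are actually compiled into the BSP’s ONNX Runtime build, I also queried the C++ API. This is just a minimal sketch (the build line and include path are assumptions based on my setup):

// list_providers.cc — print the execution providers this build was compiled with
// build (assumed): $CXX list_providers.cc -lonnxruntime -o list_providers
#include <iostream>
#include <onnxruntime_cxx_api.h>

int main() {
    // Anything not in this list cannot be selected with -e at runtime.
    for (const std::string& provider : Ort::GetAvailableProviders()) {
        std::cout << provider << "\n";
    }
    return 0;
}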

 

Thanks in advance,

Carlos

Solution
Chavira
NXP TechSupport

Hi @cadietrich78!

Thank you for reaching out to NXP Support!

You're absolutely right: as mentioned in our i.MX Machine Learning User's Guide, ONNX models can currently be executed only on the CPU when using our BSP.

Best regards,
Chavira

cadietrich78
Contributor I
Thanks, Chavira, for the prompt answer. If I may ask: since ONNX Runtime cannot use the NPU, what would be the best option for a quick inference test using C++ and YOLO-based models?
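
For context, here is the kind of quick test I have in mind, sketched against TensorFlow Lite with the external VX delegate (the NPU path the same User's Guide documents for the 8M Plus). The model file and delegate path are assumptions on my side, and this skips YOLO pre/post-processing entirely:

// yolo_smoke_test.cc — minimal TFLite C++ smoke test (sketch only; no YOLO
// pre/post-processing). Assumes the BSP ships libvx_delegate.so for the NPU.
#include <cstdio>
#include <memory>

#include "tensorflow/lite/delegates/external/external_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
    // Placeholder model path — substitute a converted YOLO .tflite file.
    auto model = tflite::FlatBufferModel::BuildFromFile("yolo.tflite");
    if (!model) { std::fprintf(stderr, "failed to load model\n"); return 1; }

    tflite::ops::builtin::BuiltinOpResolver resolver;
    std::unique_ptr<tflite::Interpreter> interpreter;
    tflite::InterpreterBuilder(*model, resolver)(&interpreter);
    if (!interpreter) { std::fprintf(stderr, "failed to build interpreter\n"); return 1; }

    // Route supported ops to the NPU through the external VX delegate.
    auto opts = TfLiteExternalDelegateOptionsDefault("/usr/lib/libvx_delegate.so");
    TfLiteDelegate* delegate = TfLiteExternalDelegateCreate(&opts);
    if (interpreter->ModifyGraphWithDelegate(delegate) != kTfLiteOk) {
        std::fprintf(stderr, "delegate rejected, running on CPU\n");
    }

    if (interpreter->AllocateTensors() != kTfLiteOk) return 1;
    // A real test would fill the input tensor with a preprocessed image here.
    if (interpreter->Invoke() != kTfLiteOk) return 1;
    std::printf("inference OK, %zu output tensor(s)\n", interpreter->outputs().size());

    interpreter.reset();  // destroy the interpreter before deleting the delegate
    TfLiteExternalDelegateDelete(delegate);
    return 0;
}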