onnxruntime using NPU on i.MX 8M-Plus


Eric97
Contributor II

Hello,

I installed onnxruntime 1.10.0 on an i.MX 8M-Plus. I'd like to execute inference on the NPU from Python. The available providers are 'NnapiExecutionProvider', 'VsiNpuExecutionProvider', and 'CPUExecutionProvider', as shown here:

Python 3.9.4 (default, Apr  4 2021, 18:23:51) 
[GCC 10.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import onnxruntime as rt
>>> rt.get_available_providers()
['NnapiExecutionProvider', 'VsiNpuExecutionProvider', 'CPUExecutionProvider']

When I select 'VsiNpuExecutionProvider', I get an error: RuntimeError: Unknown Provider Type: VsiNpuExecutionProvider.

The example using "/usr/bin/onnxruntime-1.10.0/onnx_test_runner" with vsi_npu works well. How can I use VsiNpuExecutionProvider from Python correctly?
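A minimal sketch of what I am attempting (the model path is just a placeholder):

import onnxruntime as rt

model_path = "model.onnx"  # placeholder, any ONNX model

# Providers are listed in order of preference; the name is the one
# reported by rt.get_available_providers() above.
providers = ["VsiNpuExecutionProvider", "CPUExecutionProvider"]

try:
    sess = rt.InferenceSession(model_path, providers=providers)
except RuntimeError as e:
    # This is where "Unknown Provider Type: VsiNpuExecutionProvider" is raised.
    print("Session creation failed:", e)
    sess = rt.InferenceSession(model_path, providers=["CPUExecutionProvider"])

# get_providers() reports which providers were actually attached to the session.
print(sess.get_providers())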

3 Replies

ksj
Contributor II

Exact same problem here.

With the C++ API, the provider name 'vsi-npu' can be applied, but what about the Python API? What is the exact name defined for this provider there?
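A quick sanity check, in case it helps, is to compare the provider names the Python wheel was built to recognise with the ones it reports as available on the device:

import onnxruntime as rt

# All provider names this Python build knows about at all.
print(rt.get_all_providers())

# Provider names that are usable on this device right now.
print(rt.get_available_providers())

If 'VsiNpuExecutionProvider' appears in both lists but session creation still raises "Unknown Provider Type", one possible explanation is that the Python bindings of this particular build were not generated with the VSI NPU provider registered, even though the native library used by onnx_test_runner has it.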


Eric97
Contributor II

I know the VSI NPU execution provider has limitations, but in my case I can't select it at all. I would like to know how to find and run it from Python. Thank you.
