How to use the NPU to run ONNX models on the IMX8MPLUS-BB

dongyang12
Contributor I

Hi,

I am trying to run an ONNX model using the NPU on the IMX8MPLUS-BB. I downloaded the image "Real-time_Edge_v2.5_IMX8MP-LPDDR4-EVK.zip" from the official website, but I found that this image does not include onnxruntime or TensorFlow Lite in /usr/bin. Could you tell me how to build an image with machine learning support, and how to build a toolchain with the onnxruntime and TensorFlow Lite libraries?

I would also like to know how to run an ONNX model using the NPU on the IMX8MPLUS-BB.

Chavira
NXP TechSupport

Hi @dongyang12!

Thank you for contacting NXP Support!

 

You can refer to our i.MX Machine Learning User's Guide, but you should use the full image to get access to the machine learning packages.

 

https://www.nxp.com/docs/en/user-guide/IMX-MACHINE-LEARNING-UG.pdf

 

Best Regards.

Chavira

dongyang12
Contributor I

Hi Chavira,

Could you tell me how to get the full image, or how to build it myself?

Chavira
NXP TechSupport

Hi @dongyang12!

You can download the software for the i.MX8MP by following this link:

https://www.nxp.com/design/design-center/software/embedded-software/i-mx-software/embedded-linux-for...

Best Regards!

Chavira

dongyang12
Contributor I

@Chavira, thanks for the information.

After studying the file "IMX-MACHINE-LEARNING-UG.pdf", I have some questions.

1. From the picture below, it seems that ONNX Runtime is only supported on the CPU on NXP platforms. Does ONNX Runtime support running on the NPU? Which execution providers (EPs) are supported?

捕获1.PNG

2. To build the file C_Api_Sample.cpp I need a cross-compilation toolchain. How do I generate this toolchain?

捕获.PNG

Chavira
NXP TechSupport

Hi @dongyang12!

 

1 >> Yes, ONNX Runtime currently runs only on the CPU.

 

Chavira_0-1712586152294.png

 

 

2 >> You have to follow these steps to generate the toolchain:

 

Install the repo utility:

 

$: mkdir ~/bin

$: curl http://commondatastorage.googleapis.com/git-repo-downloads/repo > ~/bin/repo

$: chmod a+x ~/bin/repo

$: PATH=${PATH}:~/bin

 

Download the Yocto Project BSP

 

$: mkdir <release>

$: cd <release>

$: repo init -u https://github.com/nxp-imx/imx-manifest -b <branch name> [ -m <release manifest>]

$: repo sync

 

Setup the build folder for a BSP release:

 

$: [MACHINE=<machine>] [DISTRO=fsl-imx-<backend>] source ./imx-setup-release.sh -b bld-<backend>

 

<machine> defaults to `imx6qsabresd`

<backend> selects the graphics backend:

xwayland - Wayland with X11 support (default distro)

wayland  - Wayland

fb       - Framebuffer (not supported on i.MX 8)

 

Build SDK:

 

bitbake -c populate_sdk imx-image-full
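Once the bitbake step finishes, the SDK installer is placed under the build's deploy directory and can be used to cross-compile the sample. The installer filename and install prefix below are illustrative assumptions (they vary with the release, DISTRO, and MACHINE); adjust them to what bitbake actually produced.

```shell
# Install the generated SDK (filename varies with release/machine; check
# tmp/deploy/sdk/ for the actual installer name).
./tmp/deploy/sdk/fsl-imx-xwayland-glibc-x86_64-imx-image-full-armv8a-toolchain-*.sh

# Source the cross-compile environment set up by the installer (path is the
# installer's default prefix; adjust if you chose another location).
. /opt/fsl-imx-xwayland/*/environment-setup-armv8a-poky-linux

# Cross-compile the ONNX Runtime C API sample; $CXX is exported by the
# environment script and points at the aarch64 cross-compiler.
$CXX C_Api_Sample.cpp -o C_Api_Sample -lonnxruntime
```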

 

Best Regards!

Chavira

dongyang12
Contributor I

@Chavira Thanks, I was able to generate the toolchain.

And I have a question. According to the release notes in "IMX-MACHINE-LEARNING-UG.pdf", ONNX Runtime supported the VSI-NPU Execution Provider and the NNAPI Execution Provider in previous versions, but in the latest version VSI-NPU and NNAPI have been removed. Could you tell me why they were removed? Is it because of poor performance?

dongyang12
Contributor I

@Chavira  Could you answer this question?

