Can I run XNNPACK backend for TFLM on MIMXRT1060-EVKB?


nnxxpp
Contributor IV

I am using the tflm_cifar10 project (from the MCUXpresso SDK) as the base project for another model, and I deploy the machine learning model to the MIMXRT1060-EVKB board with MCUXpresso IDE. Can I run the XNNPACK backend for TFLM on the MIMXRT1060-EVKB to accelerate a sparse model? If so, could you please outline the steps to do this in MCUXpresso IDE? Also, where can I find the Makefile for this project? Thank you.

Gavin_Jia
NXP TechSupport

Hi @nnxxpp ,

Thanks for your interest in NXP MIMXRT series!

There is no official documentation for the RT series supporting this usage. However, as far as I know, some MPUs in the i.MX series do support the XNNPACK delegate; you can refer to the discussion for those parts:

https://community.nxp.com/t5/i-MX-Processors/tflite-XNNPACK-delegate-for-inference-on-quantized-netw...

Sorry for the inconvenience.

Best regards,
Gavin
