eIQ CMSIS-NN Porting Guide for MCUs

The eIQ CMSIS-NN software for i.MX RT devices found in the MCUXpresso SDK package can be ported to other microcontroller devices in the RT family, as well as to some LPC and Kinetis devices.

A very common question is which processors support inferencing of models. The answer is that inferencing simply means performing millions of multiply-accumulate (MAC) operations, the dominant computation when processing any neural network, and almost any MCU or MPU is capable of that. No special hardware or module is required to do inferencing, although high core clock speeds and fast memory can drastically reduce inference time. Determining whether a particular model can run on a specific device comes down to:

  • How long the inference takes to run. The same model takes much longer on less powerful devices, and the maximum acceptable inference time depends on your particular application and model.
  • Whether there is enough non-volatile memory to store the weights, the model itself, and the inference engine.
  • Whether there is enough RAM to hold the intermediate calculations and the output.

The attached guide walks through how to port the CMSIS-NN inference engine to the LPC55S69 family. Similar steps can be followed to port eIQ to other microcontroller devices. This guide is made available as a reference for users interested in exploring eIQ on other devices; however, only the RT1050 and RT1060 are officially supported for CMSIS-NN as part of eIQ at this time.

These other eIQ porting guides might also be of interest:
Glow Porting Guide for MCUs
TensorFlow Lite Porting Guide for RT685
