eIQ® comprises multiple pieces of hardware and software that enable users to run machine learning models on embedded devices. Some of the key pieces of eIQ enablement for NXP microcontrollers include:
- eIQ Time Series Studio - PC tool to create and deploy classical machine learning and neural network models for time series analysis
- eIQ Inference Engines - Included as part of MCUXpresso SDK or Yocto Linux, these are used to run inference on pre-trained models on embedded devices. Options for MCUs include TensorFlow Lite and, coming soon, ExecuTorch.
- eIQ Neutron NPU - Accelerator core architecture embedded inside specific NXP devices, like MCX N and i.MX RT700, to accelerate the inference of neural network models
- eIQ Toolkit - Contains the Neutron Converter tool for enabling models to be accelerated with eIQ Neutron NPUs
- eIQ Model Zoo - Collection of models tested on NXP silicon, available to browse
- eIQ Model Watermarking Extension - Enhances copyright protection for custom models
- eIQ Model Creator - Partnership with ModelCat for vision-based model development
There are two main paths for using AI/ML on NXP devices:
- Use eIQ Time Series Studio to generate and deploy a model for time series applications
- Use eIQ inference engines to deploy a pre-trained neural network model
The attached lab covers the second option by showing how to run the TensorFlow Lite for Microcontrollers (TFLM) inference engine examples found in the MCUXpresso SDK. TFLM support is available for the following NXP devices in MCUXpresso SDK:
- MCX N
- i.MX RT500
- i.MX RT600
- i.MX RT700
- i.MX RT1050
- i.MX RT1060
- i.MX RT1064
- i.MX RT1160
- i.MX RT1170
- i.MX RT1180
Full details on how to download the eIQ inference engine software libraries and run them with VS Code, MCUXpresso IDE, IAR, Keil MDK, or Arm GCC can be found in the attached Getting Started guide.
For more information about eIQ and some hands-on labs for the i.MX RT family, see the following links:
- eIQ Software for MCUs: MCU Hands-on Labs
- Resources: eIQ App Notes