This lab covers how to take an existing TensorFlow image classification model, MobileNet, and retrain it to categorize images of flowers. This is known as transfer learning. The retrained model is then converted into a TensorFlow Lite file. By using that file with the TensorFlow Lite inference engine that is part of NXP's eIQ package, the model can be run on an i.MX RT embedded device. A camera attached to the board can then be pointed at photos of flowers, and the model will determine what type of flower it is looking at. The same steps can also be used to classify other types of images.
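As a rough illustration of the retrain-and-convert flow described above, the sketch below reuses MobileNet's pretrained convolutional base, trains a new classifier head on a folder of flower photos, and converts the result to a .tflite file. This is only a minimal outline, not the lab's exact script: the "flower_photos" directory, the 5-class output, the epoch count, and the output file name are all assumptions.

```python
# Minimal transfer-learning sketch (hypothetical names and hyperparameters).
import tensorflow as tf

IMG_SIZE = (224, 224)  # MobileNet's expected input resolution

# Assumes the flower photos are sorted into one sub-folder per class.
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "flower_photos", image_size=IMG_SIZE, batch_size=32)

# Reuse MobileNet's pretrained feature extractor; train only a new classifier head.
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False,
    weights="imagenet", pooling="avg")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    base,
    tf.keras.layers.Dense(5, activation="softmax"),     # assumed 5 flower categories
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)

# Convert the retrained model to a TensorFlow Lite flatbuffer for the i.MX RT board.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("flower_model.tflite", "wb") as f:
    f.write(converter.convert())
```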
This lab can also be done without a camera and LCD, but in that scenario the flower images will need to be converted to C arrays and loaded at compile time.
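For the no-camera path, a small script like the following can generate such a C array from an image file. This is a hedged sketch rather than the lab's own tooling: the 128x128 size, the `flower_image` symbol, and the file names are placeholders, and the image dimensions should match whatever the example project expects.

```python
# Hypothetical helper: convert an image into a C header containing its raw RGB888 pixels.
from PIL import Image

def image_to_c_array(image_path, header_path, width=128, height=128):
    """Resize an image and write its pixels as a const C byte array."""
    pixels = Image.open(image_path).convert("RGB").resize((width, height)).tobytes()
    with open(header_path, "w") as f:
        f.write("#include <stdint.h>\n\n")
        f.write(f"const uint8_t flower_image[{len(pixels)}] = {{\n")
        for i in range(0, len(pixels), 12):
            row = ", ".join(f"0x{b:02x}" for b in pixels[i:i + 12])
            f.write(f"    {row},\n")
        f.write("};\n")

image_to_c_array("daisy.jpg", "flower_image.h")
```

The generated header can then be compiled into the project in place of live camera frames.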
Attached to this post you will find:
- If you have a camera and LCD, use: eIQ TensorFlow Lite Transfer Learning Lab - With Camera.pdf
- If you do not have a camera or LCD, use: eIQ TensorFlow Lite Transfer Learning Lab - Without Camera.pdf
A video going through this lab is available as well.
This lab supports the following boards:
- i.MXRT1050-EVKB
- i.MXRT1060-EVK
- i.MXRT1064-EVK
Updated July 2020 for MCUXpresso SDK 2.8.0.