I converted a tflite model to a tflite model for the NPU. With a normal tflite model, I can use the TFLite library from TensorFlow. How can I evaluate the NPU tflite model on my PC?
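For example, with a normal tflite model something like this works on my PC (just a rough sketch; the model path and the dummy input are placeholders):

```python
import numpy as np
import tensorflow as tf

# Load the plain (non-NPU) tflite model with the standard interpreter
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input just to show the call sequence; in real evaluation
# this would be an image from the test set
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction.shape)
```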
Thanks. In the link MCXN947: How to Train and Deploy Customer ML model to NPU - NXP Community, step 4, "Model Evaluation (VALIDATE)", allows validating the tflite model with an option to choose the data type.
As you can see in steps 5 and 6, we convert the .h5 model to tflite and then the tflite model to NPU tflite, but I do not see anything about validating the NPU tflite in eIQ. If I am wrong, please correct me.
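For reference, the .h5 to tflite step (step 5) is just the standard TensorFlow converter, roughly as sketched below (file names are placeholders); the tflite to NPU tflite step (step 6) is done with the Neutron converter in eIQ and is not shown here.

```python
import tensorflow as tf

# Step 5: standard Keras (.h5) -> tflite conversion.
# Step 6 (tflite -> NPU tflite) is done separately with the
# eIQ / Neutron converter tool and is not part of this script.
model = tf.keras.models.load_model("model.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```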
Thanks.
It is difficult to evaluate the NPU tflite model on the board. Because flash memory is very small and a dataset usually has thousands of images, we cannot push all the images to the board for evaluation. NXP supports the NPU tflite, which means it has an implementation of the NeutronGraph custom operator, but I do not see any guide for it. Accuracy is one of the key things we consider when deploying a tflite model to the board, besides speed.
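To illustrate the problem, this is roughly what happens when I try to load the NPU tflite on a PC with the standard interpreter (a sketch; the exact exception type and message depend on the TFLite version):

```python
import tensorflow as tf

# The Neutron-converted model contains the NeutronGraph custom
# operator, which the stock TFLite interpreter has no kernel for,
# so allocation/invocation is expected to fail on a PC.
interpreter = tf.lite.Interpreter(model_path="model_npu.tflite")
try:
    interpreter.allocate_tensors()
    interpreter.invoke()
except Exception as e:
    print("Cannot run the NPU tflite with the standard interpreter:", e)
```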
Hi @nnxxpp
If you want to test the accuracy of the model, you can use eIQ testing. You can refer to this:
MCXN947: How to Train and Deploy Customer ML model to NPU - NXP Community
BR
Harry
Thanks, but I was hoping you could help me further. I think you can do it.
Actually, I am not sure whether it is worth evaluating the NPU tflite on a PC, because the NPU is specific to the NXP board. I know that you customized TFLM to support the NPU tflite on the board. Could you give me a guide for running this custom TFLM on a PC? Thank you.
Hi @nnxxpp
Running a custom TensorFlow Lite Micro (TFLM) model designed for NPU acceleration on a PC can be challenging, as TFLM is primarily optimized for embedded systems and custom hardware.
BR
Harry