How can I evaluate an NPU tflite model on my PC?

nnxxpp
Contributor IV

I converted a tflite model into a tflite model for the NPU. With the normal tflite model, I can use the TFLite library from TensorFlow. How can I evaluate the NPU tflite model on my PC?
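For reference, this is roughly how I evaluate the normal tflite model on my PC with the TensorFlow Lite Python interpreter (the model path and the test data below are only placeholders):

# Minimal sketch of PC-side evaluation of a standard .tflite model.
# "model.tflite" and the random test data are placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Placeholder test set: replace with the real evaluation images/labels.
test_images = np.random.rand(10, *inp["shape"][1:]).astype(np.float32)
test_labels = np.zeros(10, dtype=np.int64)

correct = 0
for image, label in zip(test_images, test_labels):
    data = np.expand_dims(image, axis=0).astype(inp["dtype"])
    # For a fully int8-quantized model, apply the input scale/zero-point first.
    interpreter.set_tensor(inp["index"], data)
    interpreter.invoke()
    pred = int(np.argmax(interpreter.get_tensor(out["index"])))
    correct += int(pred == label)

print("accuracy:", correct / len(test_labels))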

nnxxpp
Contributor IV

@Harry_Zhang 

Thanks. In the link MCXN947: How to Train and Deploy Customer ML model to NPU - NXP Community, step 4 "Model Evaluation (VALIDATE)" allows validating a tflite model, with the option of choosing the data type.

You can see that in steps 5 and 6 we convert the .h5 model to tflite and then the tflite model to NPU tflite. I do not see anything about validating the NPU tflite model in eIQ. If I am wrong, please correct me.

Harry_Zhang
NXP Employee

Hi @nnxxpp 

Yes, you are right.

eIQ does not support validating an NPU tflite model.

BR

Harry

nnxxpp
Contributor IV

Hi @Harry_Zhang 

Thanks. Sure. That means that for the MCXN947, we need to deploy the NPU tflite model to the board.

Harry_Zhang
NXP Employee

Hi @nnxxpp 

If you want to evaluate the performance of the model, you need to deploy it to the board and run it.

BR

Harry

nnxxpp
Contributor IV

@Harry_Zhang 

Thanks.

It is difficult to evaluate an NPU tflite model on the board, because flash memory is very small and a dataset usually has thousands of images, so we cannot push all the images to the board for evaluation. Since NXP supports NPU tflite, it must have an implementation of the custom operator for NeutronGraph, but I don't see any guide for it. Accuracy evaluation is one of the key things we consider when deploying a tflite model to the board, besides speed.
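One workaround I can imagine (only a rough sketch and an assumption on my side, not an existing NXP example) is to keep the dataset on the PC and stream one image at a time to the board over a serial link, with matching firmware on the MCXN947 that runs Neutron inference on each received image and replies with the predicted class:

# Hypothetical host-side script; the serial port, image shape, protocol,
# and the board firmware are all assumptions, not an NXP-provided flow.
import numpy as np
import serial  # pyserial

PORT = "/dev/ttyACM0"      # placeholder: the board's VCOM port
IMG_SHAPE = (32, 32, 3)    # placeholder: the model's input shape

# Placeholder test set: replace with the real evaluation images/labels.
test_images = np.random.randint(0, 256, size=(10, *IMG_SHAPE), dtype=np.uint8)
test_labels = np.zeros(10, dtype=np.int64)

correct = 0
with serial.Serial(PORT, 115200, timeout=5) as link:
    for image, label in zip(test_images, test_labels):
        link.write(image.tobytes())          # send raw pixels to the board
        reply = link.read(1)                 # firmware replies with a 1-byte class id
        correct += int(len(reply) == 1 and reply[0] == label)

print("on-device accuracy:", correct / len(test_labels))

That still requires firmware work on the board side, which is why a documented way to validate the NPU tflite model would help.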

Harry_Zhang
NXP Employee

Hi @nnxxpp 

If you want to test the accuracy of the model, you can use eIQ testing. You can refer to this:

MCXN947: How to Train and Deploy Customer ML model to NPU - NXP Community

BR

Harry

nnxxpp
Contributor IV

@Harry_Zhang 

Thanks. I was hoping you could help me a bit more; I think you can.

Actually, I am not sure whether it is worth evaluating the NPU tflite model on a PC, because the NPU is specific to the NXP board. I know that you customized TFLM to support NPU tflite on the board. Could you give me a guide for running this custom TFLM on a PC? Thank you.

Harry_Zhang
NXP Employee

Hi @nnxxpp 

Running a custom TensorFlow Lite Micro (TFLM) model designed for NPU acceleration on a PC can be challenging, as TFLM is primarily optimized for embedded systems and custom hardware.

BR

Harry
