How can I evaluate NPU tflite model on my PC?

nnxxpp
Contributor IV

I converted a TFLite model into a TFLite model for the NPU. With a normal TFLite model, I can use the TFLite library from TensorFlow. How can I evaluate the NPU TFLite model on my PC?
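By a normal TFLite model I mean something like this with the standard TensorFlow Lite Python interpreter (a minimal sketch; the model path and the random input stand in for my real test data):

```python
# Minimal sketch: run one sample through a standard TFLite model on the PC.
# "model.tflite" and the random test input are placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Dummy input with the model's expected shape and dtype.
x = np.random.random_sample(input_details["shape"]).astype(input_details["dtype"])

interpreter.set_tensor(input_details["index"], x)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details["index"])
print(prediction)
```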

nnxxpp
Contributor IV

@Harry_Zhang 

Thanks. In the link MCXN947: How to Train and Deploy Customer ML model to NPU - NXP Community, step 4 "Model Evaluation: VALIDATE" lets you validate the TFLite model and choose the data type.

You can see that in steps 5 and 6 we convert the .h5 model to TFLite and then the TFLite model to an NPU TFLite model. I do not see anything about validating the NPU TFLite model in eIQ. If I am wrong, please correct me.

Harry_Zhang
NXP Employee

Hi @nnxxpp 

Yes, you are right.

eIQ does not support validating the NPU TFLite model.

BR

Harry

nnxxpp
Contributor IV

Hi @Harry_Zhang 

Thanks, understood. That means that for the MCXN947 we need to deploy the NPU TFLite model to the board.

Harry_Zhang
NXP Employee

Hi @nnxxpp 

If you want to evaluate the performance of the model, you need to deploy it to the board and run it.

BR

Harry

nnxxpp
Contributor IV

@Harry_Zhang 

Thanks.

It is difficult to evaluate the NPU TFLite model on the board: the flash memory is very small and a dataset usually contains thousands of images, so we cannot push all of them to the board for evaluation. Since NXP supports NPU TFLite models, there must be an implementation of the custom operator for NeutronGraph, but I do not see any guide for it. Besides speed, accuracy is one of the key things we consider when deploying a TFLite model to the board.
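What I would need is a way to keep the dataset on the PC and stream samples to the board one at a time, roughly like the sketch below. Everything in it is an assumption on my side: the serial port name, the protocol of sending raw quantized image bytes and reading back a predicted class index as a text line, and the firmware would have to implement the matching receive/infer/reply loop.

```python
# Hypothetical host-side harness: stream test images to the board one at a time
# over a serial link and tally the accuracy of the replies.
# The port name, baud rate, and line-based reply protocol are assumptions;
# the board firmware would need a matching receive -> run NPU inference -> print loop.
import numpy as np
import serial  # pyserial

PORT = "/dev/ttyACM0"   # placeholder
BAUD = 115200

def evaluate(images, labels):
    correct = 0
    with serial.Serial(PORT, BAUD, timeout=5) as ser:
        for img, label in zip(images, labels):
            ser.write(img.astype(np.int8).tobytes())  # send one quantized image
            reply = ser.readline().decode().strip()   # board replies with the predicted class
            if int(reply) == int(label):
                correct += 1
    return correct / len(labels)
```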

Harry_Zhang
NXP Employee

Hi @nnxxpp 

If you want to test the accuracy of the model, you can use the eIQ model validation. You can refer to this:

MCXN947: How to Train and Deploy Customer ML model to NPU - NXP Community

BR

Harry

nnxxpp
Contributor IV

@Harry_Zhang 

Thanks. I was hoping you could help me a bit more; I think you can.

Actually, I am not sure whether it is even worth evaluating the NPU TFLite model on a PC, because the NPU is specific to NXP boards. I know that you customized TFLM to support NPU TFLite models on the board. Could you give me a guide on running this custom TFLM on a PC? Thank you.
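As far as I can tell, the standard TensorFlow Lite interpreter on the PC rejects the converted model because of the unresolved custom Neutron op; a quick check like the sketch below (the model path is a placeholder) makes that visible:

```python
# Quick check: does the standard TFLite interpreter accept the NPU-converted model?
# On the PC it is expected to raise an error about an unresolved custom op.
import tensorflow as tf

try:
    interpreter = tf.lite.Interpreter(model_path="model_npu.tflite")  # placeholder path
    interpreter.allocate_tensors()
    print("Model loaded; no unresolved custom ops.")
except Exception as e:
    print("Standard interpreter cannot run this model:", e)
```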

Harry_Zhang
NXP Employee

Hi @nnxxpp 

Running a custom TensorFlow Lite Micro (TFLM) build that targets NPU acceleration on a PC can be challenging, as TFLM is primarily optimized for embedded systems and custom hardware.

BR

Harry
