Significant difference in AI inference results between CPU and NPU

JK_Cha
Contributor IV

Hi NXP AI Champs

We have some test results from a bladder-size calculation project.

The goal of the project is to determine the size of the bladder with AI.

They use ONNX and run AI inference on both the CPU and the NPU with the same trained, quantized ONNX model.

However, even with the same quantized ONNX model, the bladder size (image) inferred on the CPU is often about 10%–30% smaller than the size (image) inferred on the NPU, and sometimes the opposite happens.

Measured over 105 input samples, most differ by about 10%, and about 30 of them differ by more than 30%. The deviation does not follow any consistent ratio.

If the same quantized ONNX model is used, shouldn't the inference images from the CPU and NPU be almost identical? I would like to ask why this happens, and whether there is a way to resolve it.
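For reference, a minimal sketch of how the CPU and NPU outputs of the same quantized model can be compared numerically, assuming the eIQ build of ONNX Runtime with its Python API; the NPU execution provider name used here ("VsiNpuExecutionProvider"), the model path, and the float32 input are assumptions and depend on the actual BSP/eIQ release and model:

```python
# Sketch: compare CPU vs. NPU outputs of the same quantized ONNX model.
# Assumes the eIQ build of ONNX Runtime on i.MX 8M Plus; the provider name
# "VsiNpuExecutionProvider" is an assumption - check ort.get_available_providers().
import numpy as np
import onnxruntime as ort

MODEL = "bladder_quant.onnx"              # hypothetical model path
NPU_PROVIDER = "VsiNpuExecutionProvider"  # assumption; depends on eIQ release

cpu_sess = ort.InferenceSession(MODEL, providers=["CPUExecutionProvider"])
npu_sess = ort.InferenceSession(MODEL, providers=[NPU_PROVIDER, "CPUExecutionProvider"])

inp = cpu_sess.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
x = np.random.rand(*shape).astype(np.float32)  # replace with a real ultrasound frame;
                                               # assumes a float32 model input

cpu_out = cpu_sess.run(None, {inp.name: x})
npu_out = npu_sess.run(None, {inp.name: x})

# Report the worst-case absolute and relative deviation per output tensor.
for i, (c, n) in enumerate(zip(cpu_out, npu_out)):
    c = np.asarray(c, dtype=np.float32)
    n = np.asarray(n, dtype=np.float32)
    diff = np.abs(c - n)
    rel = diff.max() / (np.abs(c).max() + 1e-8)
    print(f"output {i}: max abs diff {diff.max():.6f}, relative {rel:.2%}")
```

Running this over the same 105 inputs on the target would show whether the divergence already appears in the raw network outputs or only after the size post-processing.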

Thanks.

Regards, 

3 Replies

joanxie
NXP TechSupport

What processor do you use?

JK_Cha
Contributor IV

Hi Joanxie,

The AP we are using is the i.MX 8M Plus.

Part#: MIMX8ML8DVNLZAB (i.MX 8M Plus)

Regards,

joanxie
NXP TechSupport

I have already emailed you; please check it.