Significant difference of AI inference result between CPU and NPU

3,782 Views
Jack-Cha
Contributor V

Hi NXP AI Champs

We have some test results from a bladder-size calculation project.

The goal of the project is to determine the size of the bladder using AI.

They use ONNX, and ran inference on both the CPU and the NPU with the same trained, quantized ONNX model.

However, even with the same quantized ONNX model, the bladder size (image) inferred by the CPU is often about 10%–30% smaller than the size (image) inferred by the NPU, and in some cases the opposite is true.

When measured over 105 input samples, most results differ by about 10%, and about 30 of them differ by more than 30%. The deviation does not even follow a consistent ratio.
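
For reference, the per-sample comparison could be scripted roughly like the minimal sketch below. It assumes ONNX Runtime for the CPU run, NPU output tensors dumped to .npy files on the target beforehand, and an illustrative 0.5 segmentation threshold for the mask; the file and directory names here are hypothetical:

```python
# Minimal sketch: compare bladder-area estimates from a CPU run (ONNX Runtime)
# against NPU outputs that were captured to .npy files on the board.
# Paths, the model file name, and the 0.5 mask threshold are illustrative assumptions.
import glob
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("bladder_quant.onnx",
                            providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name

for path in sorted(glob.glob("inputs/*.npy")):
    x = np.load(path)                                  # preprocessed input tensor
    cpu_out = sess.run(None, {input_name: x})[0]       # CPU inference result
    npu_out = np.load(path.replace("inputs/", "npu_outputs/"))  # saved NPU result

    # Derive the bladder area from the segmentation mask (threshold is an assumption).
    cpu_area = int(np.count_nonzero(cpu_out > 0.5))
    npu_area = int(np.count_nonzero(npu_out > 0.5))
    diff_pct = abs(cpu_area - npu_area) / max(cpu_area, 1) * 100.0

    print(f"{path}: CPU area={cpu_area}, NPU area={npu_area}, diff={diff_pct:.1f}%")
```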

If the same quantized ONNX model is used, shouldn't the inference images from the CPU and the NPU be almost identical? I would appreciate an answer to this question, and any suggestions on how to resolve the discrepancy.

Thanks.

Regards, 

3 Replies

3,766 Views
Jack-Cha
Contributor V

Hi Joanxie

The AP (application processor) we use is the i.MX 8M Plus.

Part# : MIMX8ML8DVNLZAB (i.MX8M Plus)

 

Regards, 


3,735 Views
joanxie
NXP TechSupport

I have already emailed you; please check it.


3,771 Views
joanxie
NXP TechSupport

What processor do you use?
