Significant difference of AI inference result between CPU and NPU

Jack-Cha
Contributor V

Hi NXP AI Champs

We have some test results from a bladder-size calculation project.

The goal of the project is to estimate the size of the bladder with AI.

They use ONNX, and ran inference on both the CPU and the NPU with the same trained, quantized ONNX model.

However, even with the same quantized ONNX model, the bladder size (image) inferred by the CPU is often about 10%–30% smaller than the size (image) inferred by the NPU; sometimes the opposite occurs.

When measured over 105 input samples, most of them differ by about 10%, and about 30 differ by more than 30%. The discrepancy does not follow any consistent ratio.

If they use the same quantized ONNX model, shouldn't the inference results of the CPU and the NPU be almost identical? I would like to ask why this happens, and whether there is any way to resolve it.
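For reference, this is how the per-sample discrepancy was tallied. A minimal pure-Python sketch (not NXP's tooling; the function names and sample values are illustrative assumptions) that counts how many paired CPU/NPU results exceed a given relative-difference threshold, matching the 10% and 30% buckets above:

```python
def relative_diff(cpu_val, npu_val):
    # Relative difference of the NPU result, using the CPU result as reference
    return abs(cpu_val - npu_val) / max(abs(cpu_val), 1e-9)

def count_mismatches(cpu_sizes, npu_sizes, threshold=0.10):
    # Count paired results that differ by more than `threshold` (e.g. 10%)
    return sum(
        1
        for c, n in zip(cpu_sizes, npu_sizes)
        if relative_diff(c, n) > threshold
    )

# Illustrative values only; per-sample differences are 5%, 40%, 40%
cpu = [100.0, 200.0, 50.0]
npu = [105.0, 280.0, 70.0]
print(count_mismatches(cpu, npu))  # 2 samples exceed the 10% threshold
```

Running the same tally at `threshold=0.30` separates out the samples that, as described above, differ by more than 30%.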

Thanks.

Regards, 

3 Replies

Jack-Cha
Contributor V

Hi Joanxie

The AP used is the i.MX 8M Plus.

Part# : MIMX8ML8DVNLZAB (i.MX8M Plus)

 

Regards, 


joanxie
NXP TechSupport

I have already emailed you; please check it.


joanxie
NXP TechSupport

What processor do you use?
