[i.MX 8X]Some questions about hardware acceleration of AI models


843 Views
ShenYi
Contributor II

Hello,

I want to deploy an ONNX model on i.MX 8X to perform face detection, but I found that i.MX 8X is not mentioned in the eIQ ML Software Development Environment.

[Attachment: EIQ.PNG]

So,

(1) Can the GPU of i.MX 8X accelerate the inference speed of AI models? Are there any reference documents?

(2) Can the GPU or the A35 of i.MX 8X support ONNX Runtime?

(3) Does i.MX 8X support performance optimization for OpenCV?

 

Looking forward to your reply.

 

Labels (1)
0 Kudos
Reply
3 Replies

771 Views
pengyong_zhang
NXP Employee

Hi @ShenYi 

Sorry, we have not tested it. However, OpenVINO™ states that it supports inference on CPU (x86, Arm), GPU (OpenCL capable, integrated and discrete), and AI accelerators (Intel NPU), so I think you could try it on your side. This is not an area we support.
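If you do try it, a minimal sketch along these lines could be a starting point. It is untested on i.MX 8X and assumes a recent OpenVINO build with the Arm CPU plugin installed on the board; the model file name and input shape are placeholders for your own face detector.

```python
# Untested sketch: OpenVINO inference on the A35 cores via the CPU plugin.
# "face_detector.onnx", the input shape, and the preprocessing are assumptions.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("face_detector.onnx")             # placeholder model path
compiled = core.compile_model(model, device_name="CPU")   # Arm CPU plugin

output_layer = compiled.output(0)

# Dummy NCHW float32 input; replace with a preprocessed camera frame.
dummy = np.random.rand(1, 3, 320, 240).astype(np.float32)
result = compiled([dummy])[output_layer]
print(result.shape)
```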

B.R

0 Kudos
Reply

796 Views
pengyong_zhang
NXP Employee

Hi @ShenYi 

(1) Can the GPU of i.MX 8X accelerate the inference speed of AI models? Are there any reference documents?

>>> No, the i.MX 8X GPU does not support this.

(2) Can the GPU or the A35 of i.MX 8X support ONNX Runtime?

>>> The GPU does not support it, but the CPU (A35) can run it.
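For reference, a minimal sketch of running an ONNX model on the A35 cores with ONNX Runtime's default CPU execution provider; "face_detector.onnx" and the 320x240 input shape below are placeholders for your own model.

```python
# Sketch: CPU-only ONNX Runtime inference on the Cortex-A35 cores.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "face_detector.onnx",
    providers=["CPUExecutionProvider"],   # no GPU/NPU acceleration on i.MX 8X
)

input_name = session.get_inputs()[0].name
# Dummy NCHW float32 input; replace with a preprocessed camera frame.
dummy = np.random.rand(1, 3, 320, 240).astype(np.float32)

outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```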

(3) Does i.MX 8X support performance optimization for OpenCV?

>>> Yes, i.MX 8X supports OpenCV.
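You can check which CPU optimizations (for example NEON) your OpenCV build enables directly on the board; a small sketch (the image size and iteration count are arbitrary examples):

```python
# Sketch: inspect OpenCV's build-time CPU optimizations and time a resize
# on the A35 cores.
import time
import cv2
import numpy as np

print(cv2.getBuildInformation())                 # look for NEON under "CPU/HW features"
print("Runtime optimizations enabled:", cv2.useOptimized())

img = np.random.randint(0, 255, (1080, 1920, 3), dtype=np.uint8)
t0 = time.time()
for _ in range(100):
    cv2.resize(img, (640, 480))
print("100 resizes: %.3f s" % (time.time() - t0))
```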

B.R

0 Kudos
Reply

780 Views
ShenYi
Contributor II
Hello,
Thanks for your reply.
Another question: does the A35 support OpenVINO?
0 Kudos
Reply