[i.MX 8X]Some questions about hardware acceleration of AI models

838 Views
ShenYi
Contributor II

Hello,

I want to deploy an ONNX model on the i.MX 8X to perform face detection, but I found that the i.MX 8X is not mentioned in the eIQ ML Software Development Environment.

[Attached screenshot: EIQ.PNG]

So,

(1) Can the GPU of the i.MX 8X accelerate the inference speed of AI models? Are there any reference documents?

(2) Can the GPU or the A35 cores of the i.MX 8X support ONNX Runtime?

(3) Does the i.MX 8X support performance optimization for OpenCV?

 

Looking forward to your reply.

 

3 Replies

766 Views
pengyong_zhang
NXP Employee

Hi @ShenYi 

Sorry, we have not tested it. However, according to the OpenVINO™ documentation, OpenVINO™ supports inference on CPU (x86, Arm), GPU (OpenCL-capable, integrated and discrete) and AI accelerators (Intel NPU), so I think you can try it on your side. This is not an area we support.
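For reference only, here is a minimal sketch of what CPU inference with the OpenVINO Python API could look like, assuming OpenVINO with Arm CPU support can be installed on the board (the model file name and input shape are hypothetical placeholders):

import numpy as np
import openvino as ov

core = ov.Core()
# Read an ONNX model directly; "face_detection.onnx" is a placeholder name
model = core.read_model("face_detection.onnx")
compiled = core.compile_model(model, "CPU")   # "CPU" here would be the Arm cores

# Dummy NCHW input; replace with a real preprocessed frame
dummy = np.random.rand(1, 3, 240, 320).astype(np.float32)
results = compiled([dummy])
print(results[compiled.output(0)].shape)

Again, this is untested on the i.MX 8X and only shows the general API usage.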

B.R


791 Views
pengyong_zhang
NXP Employee

Hi @ShenYi 

(1) Can the GPU of the i.MX 8X accelerate the inference speed of AI models? Are there any reference documents?

>>> No, the i.MX 8X GPU does not support this.

(2) Can the GPU or the A35 cores of the i.MX 8X support ONNX Runtime?

>>> The GPU does not support it, but the A35 CPU cores can run it.
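For illustration, a minimal sketch of CPU-only inference with the ONNX Runtime Python API on the A35 cores (the model file name and input shape are hypothetical placeholders):

import numpy as np
import onnxruntime as ort

# Force the CPU execution provider; there is no GPU/NPU provider for the i.MX 8X
session = ort.InferenceSession("face_detection.onnx",
                               providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

# Dummy NCHW input; replace with a real preprocessed camera frame
dummy = np.random.rand(1, 3, 240, 320).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])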

(3) Does the i.MX 8X support performance optimization for OpenCV?

>>> Yes, the i.MX 8X supports OpenCV.
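As a quick check, assuming an OpenCV build with Python bindings on the board, you can verify whether the optimized code paths (e.g. NEON on Arm) are enabled:

import cv2

# True if OpenCV's optimized routines are active
print(cv2.useOptimized())

# The build information lists which CPU optimizations were compiled in
print(cv2.getBuildInformation())

# Re-enable optimized routines if they were turned off
cv2.setUseOptimized(True)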

B.R


775 Views
ShenYi
Contributor II
Hello,
Thanks for your reply.
Another question: does the A35 support OpenVINO?