
Using GPU on i.MX8QMEK for DNN inference

Question asked by Ullas Bharadwaj on Jun 14, 2020
Latest reply on Jun 24, 2020 by Ullas Bharadwaj

I am currently trying to evaluate different inference engines with TensorFlow and TensorFlow Lite models on the i.MX8 QMEK. I am following the eIQ guide from NXP and using the L4.14 release.

 

I tried the OpenCV DNN module, the TFLite interpreter, and Arm NN, but I was not able to use the GPU with any of them. I know OpenCV does not run on the GPU due to an OpenCL compatibility issue on the i.MX8, but shouldn't I at least be able to use the GPU with TFLite and Arm NN?
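For reference, this is roughly the kind of delegate-based setup I was expecting to work with the TFLite interpreter. The delegate library name and model path here are my assumptions (I understand later BSPs ship a VeriSilicon delegate, while on L4.14 the documented path was NNAPI), so please treat this as a sketch of the intent rather than something from the eIQ docs:

```python
# Hedged sketch: attempt to build a TFLite interpreter with a hardware
# (GPU/NPU) delegate. /usr/lib/libvx_delegate.so and model.tflite are
# placeholder assumptions, not paths confirmed by the eIQ guide.

def try_gpu_delegate(model_path="model.tflite",
                     delegate_path="/usr/lib/libvx_delegate.so"):
    """Return a status string instead of raising, so the sketch degrades
    gracefully on a host without the BSP libraries installed."""
    try:
        import tflite_runtime.interpreter as tflite
        # load_delegate() opens the shared library implementing the
        # TfLiteDelegate interface; the interpreter then offloads
        # supported ops to it.
        delegate = tflite.load_delegate(delegate_path)
        interpreter = tflite.Interpreter(
            model_path=model_path,
            experimental_delegates=[delegate],
        )
        interpreter.allocate_tensors()
        return "delegate loaded"
    except Exception as exc:  # ImportError/OSError on non-target hosts
        return f"delegate unavailable: {exc}"

print(try_gpu_delegate())
```

On the board I never get past this point with the GPU, which is what prompted this question.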

 

On the other hand, the Arm NN examples in eIQ do not provide an option to use the GPU at all.

 

In the thread "Arm NN support for the i.MX 8 GPUs", Vanessa Maegima suggested that only the TFLite engine supports the GPU at the moment.

 

So it is unclear to me whether the GPU is of any use on the i.MX8QMEK for running DNN inference at all.

 

If there is a way to use the GPU, kindly let me know. This has been bogging me down for quite a while now.

 

Best Regards 

Ullas Bharadwaj
