I am currently trying to evaluate different inference engines with TensorFlow and TensorFlow Lite models on the i.MX8QM MEK. I am following the eIQ guide from NXP and using the L4.14 release.
I tried the OpenCV DNN module, the TFLite interpreter, and Arm NN, but I was not able to use the GPU with any of them. I know OpenCV does not run on the GPU because of an OpenCL compatibility issue on the i.MX8, but shouldn't I still be able to use the GPU with TFLite and Arm NN?
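To make my TFLite attempt concrete, this is a sketch of the fallback logic I am using before creating the interpreter: look for a GPU delegate shared library and drop back to CPU when none is found. The helper function and the candidate library names (`libvx_delegate.so`, `libGLES_mali.so`) are my own guesses, not anything from the eIQ guide; on my L4.14 image I never find a delegate, so everything runs on CPU.

```python
import os

# Hypothetical helper: return the path of the first GPU delegate library
# found on the target, or None to signal plain CPU execution.
# The library names below are assumptions on my part.
def pick_delegate(candidates=("libvx_delegate.so", "libGLES_mali.so"),
                  libdir="/usr/lib"):
    for name in candidates:
        path = os.path.join(libdir, name)
        if os.path.exists(path):
            return path
    return None  # no GPU delegate found -> CPU fallback

delegate = pick_delegate()
if delegate is None:
    print("No GPU delegate library found, running TFLite on CPU")
```

If a delegate library did exist, I would pass it to the interpreter via `tf.lite.experimental.load_delegate`, but so far this branch is never taken on my board.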
On the other hand, the Arm NN examples shipped with eIQ do not provide an option to use the GPU at all.
So I am confused as to whether the GPU is of any use at all on the i.MX8QM MEK for running DNN inference.
If there is a way to use the GPU, kindly let me know. This has been bugging me for quite a while now.