How can we upgrade the default "libneuralnetworks.so" to match our BSP version?

wuhuangcangg
Contributor I


BSP version: 5.10.72-2.2.0, imx8plus

Our project uses TensorFlow Lite + NNAPI to run models on the NPU.
We want to use the matmul op in our project. However, when we test with the matmul op, the device shows this error:
WARNING: Operator FULLY_CONNECTED (v5) refused by NNAPI delegate: Android sdk version less than 30

Does this mean we need to upgrade our "libneuralnetworks.so"? If so, where can we get a library that matches our BSP version 5.10.72-2.2.0?

Thank you !

brian14
NXP TechSupport

Hi @wuhuangcangg

I'm not sure upgrading "libneuralnetworks.so" is the right approach. The problem is related to the operator and the Android SDK version: for this BSP version, the NNAPI implementation is based on an Android SDK version that doesn't support this operator.

I can suggest using the VX Delegate instead of NNAPI, which could offer better performance and broader operator support. The VX Delegate uses the OpenVX driver via TIM-VX, resulting in better functionality. You can enable the VX Delegate by passing the following option on the command line: "--external_delegate_path=/usr/lib/libvx_delegate.so".

Here is an example of running a model in NPU using the VX Delegate:

$ USE_GPU_INFERENCE=0 ./label_image -m mobilenet_v1_1.0_224_quant.tflite -i grace_hopper.bmp -l labels.txt --external_delegate_path=/usr/lib/libvx_delegate.so
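If you script this from Python on the board, a minimal sketch of the same invocation is shown below. The helper name `build_label_image_cmd` is hypothetical (not part of any NXP tool); it only assembles the command from the example above, and it adds the delegate flag only when the delegate library actually exists on the target, so the same script degrades gracefully on a rootfs without libvx_delegate.so:

```python
import os
import subprocess

def build_label_image_cmd(model, image, labels,
                          delegate_path="/usr/lib/libvx_delegate.so"):
    """Assemble the label_image command line from the example above.

    The VX Delegate flag is appended only if the delegate shared
    library is present, so the command still runs (on CPU) otherwise.
    """
    cmd = ["./label_image", "-m", model, "-i", image, "-l", labels]
    if os.path.exists(delegate_path):
        cmd.append("--external_delegate_path=" + delegate_path)
    return cmd

def run_on_npu(model, image, labels):
    """Run label_image with USE_GPU_INFERENCE=0 so the VX Delegate
    targets the NPU rather than the GPU."""
    env = dict(os.environ, USE_GPU_INFERENCE="0")
    return subprocess.run(build_label_image_cmd(model, image, labels),
                          env=env, check=True)
```

For example, `run_on_npu("mobilenet_v1_1.0_224_quant.tflite", "grace_hopper.bmp", "labels.txt")` reproduces the shell command above when run from the directory containing `label_image`.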

Please try this option and let me know if it works.

Have a great day!
