Issue with TensorFlow Lite and NPU on iMX93 Module Board

esmamirhan
Contributor I
Hello,

I am using an iMX93 with the Linux_6.1.55_2.2.0 SDK.

I am encountering an issue while running a TensorFlow Lite object detection model on a custom board with the iMX93 module. When I attempt to start inference with this model, the iMX93 gets stuck.

Additionally, I have observed that the device consistently gets stuck whenever I access the NPU, even when using the inference_runner and interpreter_runner utilities provided by NXP.
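
As a basic sanity check I also look at whether the NPU device node is present before loading the delegate. This is only a minimal sketch: it assumes the Ethos-U NPU is exposed as /dev/ethosu0 by the ethosu kernel driver, which may differ on a custom board.

import os

# Assumption: the Ethos-U NPU appears as /dev/ethosu0 (ethosu kernel driver).
# If this node is missing, the delegate has nothing to talk to and may block
# or fail when it tries to open the device.
npu_node = "/dev/ethosu0"
if os.path.exists(npu_node):
    print(npu_node, "is present")
else:
    print(npu_node, "not found - check that the ethosu driver and NPU firmware are loaded")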

Do you have any ideas or suggestions on what could be causing this problem?

Example usage:

import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(
    model_path="model.tflite",
    experimental_delegates=[tflite.load_delegate("/usr/lib/libethosu_delegate.so")])
interpreter.allocate_tensors()

Creating the tflite.Interpreter works fine, but the call to allocate_tensors() makes the CPU get stuck.
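
For completeness, the full flow I am trying to run looks like the sketch below (dummy input only; input/output handling uses the standard tflite_runtime API, and the model and delegate paths are the same as above). The hang always occurs at allocate_tensors(), so the set_tensor()/invoke() part is never reached.

import numpy as np
import tflite_runtime.interpreter as tflite

delegate = tflite.load_delegate("/usr/lib/libethosu_delegate.so")
interpreter = tflite.Interpreter(model_path="model.tflite",
                                 experimental_delegates=[delegate])

# Execution never gets past this call
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy tensor with the model's expected shape and dtype
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)

interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]))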

Thank you in advance!
1 Reply

Bio_TICFSL
NXP TechSupport

Hello,

Please send your test code so we can check it.

Regards
