How to make a converted .tflite model work? (the guide on the NXP site may be outdated)

korabelnikov
Contributor III

I followed the article https://community.nxp.com/t5/Software-Community-Articles/eIQ-Sample-Apps-TFLite-Quantization/ba-p/11...

I've installed exactly the same TensorFlow version as in the article and downloaded the same model.

I got a converted .tflite model, and it looks fine in the Netron visualizer.
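
For reference, the conversion step followed the article and looked roughly like this. It is only a sketch: the file name and the input/output tensor names are placeholders, not necessarily what the article's model actually uses.

import tensorflow as tf  # pinned to the same 1.x version the article specifies

# Minimal sketch of post-training quantization of a frozen graph, following
# the article's general approach. "frozen_graph.pb" and the tensor names
# below are placeholders.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="frozen_graph.pb",
    input_arrays=["input"],
    output_arrays=["output"],
)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # post-training quantization
# (on older 1.x releases this is spelled: converter.post_training_quantize = True)
tflite_model = converter.convert()

with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)
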
Then (unlike the article) I tried to run it with PyeIQ. PyeIQ itself is known to work: I've run other .tflite models with it.

But I get this error:

self.interpreter = aNNInterpreter(model_file)
  File "/usr/lib/python3.7/site-packages/eiq/engines/armnn/inference.py", line 19, in __init__
    network = parser.CreateNetworkFromBinaryFile(model)
  File "/usr/lib/python3.7/site-packages/pyarmnn/_generated/pyarmnn_tfliteparser.py", line 711, in CreateNetworkFromBinaryFile
    return _pyarmnn_tfliteparser.ITfLiteParser_CreateNetworkFromBinaryFile(self, graphFile)
RuntimeError: Buffer #88 has 0 bytes. For tensor: [1,14,14,512] expecting: 401408 bytes and 100352 elements. at function CreateConstTensor [/usr/src/debug/armnn/19.08-r1/git/src/armnnTfLiteParser/TfLiteParser.cpp:2612]
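
A sanity check I still plan to run (sketch below): load the same file with the stock TFLite interpreter on the host. If it runs there while the ArmNN parser rejects it, the empty buffer points at a converter/parser version mismatch rather than a corrupt file.

import numpy as np
import tensorflow as tf

# Run the converted model with the stock TFLite interpreter.
interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed a zero tensor with the declared input shape and dtype.
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
print(out["name"], interpreter.get_tensor(out["index"]).shape)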

I think the issue is the TensorFlow version used to convert the model to .tflite, but the article gives no further info about which TF version to use.

Additionally, I've tried more recent versions of TensorFlow (2.3.0 and 2.3.1), but got a segfault (exit code 139).
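
For completeness, with TF 2.3.x the TF 1.x frozen-graph converter is only reachable through the compat layer, so that attempt looked roughly like the following sketch (placeholder names again).

import tensorflow as tf  # 2.3.0 / 2.3.1

# TF 2.x removed tf.lite.TFLiteConverter.from_frozen_graph; the 1.x-style
# converter is still available via tf.compat.v1. Names are placeholders.
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="frozen_graph.pb",
    input_arrays=["input"],
    output_arrays=["output"],
)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("converted_model_tf2.tflite", "wb") as f:
    f.write(tflite_model)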
