How to make a converted TFLite model work? (guide on NXP site may be outdated)


1,053 Views
korabelnikov
Contributor III

I followed the article https://community.nxp.com/t5/Software-Community-Articles/eIQ-Sample-Apps-TFLite-Quantization/ba-p/11...

 

I've installed exactly the TensorFlow version specified in the article, and downloaded the same model.

The conversion produced a .tflite model that looks fine in the Netron visualizer.
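For reference, the conversion step has roughly this shape (a sketch only; the exact call in the article may differ, and the graph path and input/output node names below are placeholders):

# Sketch of the post-training quantization flow (TensorFlow 1.x API).
# 'frozen_graph.pb', 'input' and 'output' are placeholder names.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    'frozen_graph.pb',
    input_arrays=['input'],
    output_arrays=['output'])
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize weights

tflite_model = converter.convert()
with open('model_quant.tflite', 'wb') as f:
    f.write(tflite_model)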

 

Then (unlike in the article) I tried to run it with pyEIQ. pyEIQ itself is verified: it has worked with other .tflite models.

But I get this error:

self.interpreter = aNNInterpreter(model_file)
File "/usr/lib/python3.7/site-packages/eiq/engines/armnn/inference.py", line 19, in __init__
network = parser.CreateNetworkFromBinaryFile(model)
File "/usr/lib/python3.7/site-packages/pyarmnn/_generated/pyarmnn_tfliteparser.py", line 711, in CreateNetworkFromBinaryFile
return _pyarmnn_tfliteparser.ITfLiteParser_CreateNetworkFromBinaryFile(self, graphFile)
RuntimeError: Buffer #88 has 0 bytes. For tensor: [1,14,14,512] expecting: 401408 bytes and 100352 elements. at function CreateConstTensor [/usr/src/debug/armnn/19.08-r1/git/src/armnnTfLiteParser/TfLiteParser.cpp:2612]
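Since the traceback ends inside pyarmnn, the failure should be reproducible with pyarmnn alone, bypassing pyEIQ (a minimal sketch; the model path is a placeholder):

# Minimal repro using pyarmnn directly, without pyEIQ.
import pyarmnn as ann

parser = ann.ITfLiteParser()
# This is the call that raises the "Buffer #88 has 0 bytes" RuntimeError:
network = parser.CreateNetworkFromBinaryFile('model_quant.tflite')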

I think the issue is with the TensorFlow version used to convert the model to TFLite, but there is no further information about which TF version to use.
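One check that would tell a broken file apart from an ArmNN parser limitation is whether the stock TFLite runtime can load the same model (a sketch; the model path is a placeholder):

# Sanity check: load the same file with the stock TFLite interpreter.
# If this works while the ArmNN parser fails, the file is readable and
# the problem is on the ArmNN TfLiteParser side.
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='model_quant.tflite')
interpreter.allocate_tensors()
print(interpreter.get_input_details())
print(interpreter.get_output_details())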

 

Additionally, I've tried modern versions of TensorFlow, i.e. 2.3.0 and 2.3.1, but got a segfault (exit code 139).
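Since the parser complains about a zero-byte constant buffer, it may also help to inspect the flatbuffer directly and see whether the converter wrote any data for buffer #88 (a sketch assuming the 'tflite' flatbuffer bindings package, installed with pip install tflite):

# List empty buffers in the .tflite flatbuffer.
# Assumes the 'tflite' bindings package: pip install tflite
import tflite

with open('model_quant.tflite', 'rb') as f:
    buf = f.read()
model = tflite.Model.GetRootAsModel(buf, 0)

# Note: buffer #0 is empty by convention in the TFLite schema.
for i in range(model.BuffersLength()):
    if model.Buffers(i).DataLength() == 0:
        print('buffer #%d is empty' % i)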

0 Kudos
0 Replies