ArmNN TensorFlow Lite vs TensorFlow Lite on i.MX 8M Plus

Solved

5,315 Views
dnozik
Contributor II

Hi,

1. I am working with an i.MX 8M Plus board (zeus-5.4.70-2.3.0).

2. I am testing a TensorFlow Lite model with both TensorFlow Lite and ArmNN TensorFlow Lite.

3. These inference engines use TensorFlow Lite from different versions, which causes a flatbuffers incompatibility and a compilation error.

Is there a way to fix this? It affects the whole model training and execution pipeline.

Thanks

回复
6 Replies
5,309 Views
Bio_TICFSL
NXP TechSupport

Hello dnozik,

 

TensorFlow Lite on the i.MX 8 runs on the NPU, i.e. in hardware, and it is compatible with TensorFlow Lite version 2.1; if you use another version, it may be incompatible.
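For example, a minimal sketch (assuming tensorflow/lite/version.h from the BSP's TFLite headers is on your include path) that prints the TFLite version your application is actually built against:

#include <cstdio>

#include "tensorflow/lite/version.h"

int main() {
  // TFLITE_VERSION_STRING comes from tensorflow/lite/version.h and reports
  // the TensorFlow Lite version the headers (and matching library) belong to.
  std::printf("TensorFlow Lite version: %s\n", TFLITE_VERSION_STRING);
  return 0;
}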

Regards

 

5,291 Views
dnozik
Contributor II

Hi,

Is there any update on this? It is a critical issue for us.

Regards,

dima

5,306 Views
dnozik
Contributor II

Hi,

I am compiling my code against your ArmNN and TensorFlow Lite builds, according to https://source.codeaurora.org/external/imx/meta-imx/tree/meta-ml/recipes-libraries?h=zeus-5.4.70-2.3...

ArmNN uses flatbuffers version 1.11.

TensorFlow uses flatbuffers version 1.12.

Both recipes copy their flatbuffers folder into /usr/include at the end of the build, so whichever one builds last determines the flatbuffers version there.

So I get a compilation error, and I currently have to disable one of the inference engines in order to build.
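For what it is worth, a quick way to see which flatbuffers copy the compiler actually resolves is a small probe like the one below (a sketch only; FLATBUFFERS_VERSION_MAJOR/MINOR/REVISION are defined in flatbuffers/base.h in recent releases, so the #ifdef guards the case where the installed copy predates them):

#include <cstdio>

#include "flatbuffers/base.h"

int main() {
  // Reports which flatbuffers headers were picked up from the include path.
#if defined(FLATBUFFERS_VERSION_MAJOR)
  std::printf("flatbuffers %d.%d.%d\n", FLATBUFFERS_VERSION_MAJOR,
              FLATBUFFERS_VERSION_MINOR, FLATBUFFERS_VERSION_REVISION);
#else
  std::printf("flatbuffers headers found, but no version macros defined\n");
#endif
  return 0;
}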

 

I need the option to use both of them.

Dima

5,273 Views
Bio_TICFSL
NXP TechSupport

Hello,

The flatbuffers for TFLite is located in /usr/include/tensorflow/lite/tools/make/downloads/flatbuffers/. You need to add this path to your include paths when you build the TFLite app.

The flatbuffers for ArmNN is located in /usr/include.
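For illustration, a sketch of how this can be wired up; the compiler invocation, the header layout, and the LoadModel helper below are assumptions for the example, not taken from the BSP recipes:

// tflite_part.cc -- illustrative only.
//
// Directories passed with -I are searched before /usr/include, so putting the
// TFLite-bundled flatbuffers path first makes this translation unit resolve
// that copy instead of the one ArmNN installed in /usr/include. A hypothetical
// compile line (adjust the path if the recipe adds an extra include/ level):
//
//   $CXX -I/usr/include/tensorflow/lite/tools/make/downloads/flatbuffers \
//        -c tflite_part.cc
//
// Translation units that use the ArmNN API are built without this flag and
// keep resolving the flatbuffers headers from /usr/include.
#include <memory>

#include "tensorflow/lite/model.h"

// The TFLite header included above transitively pulls in flatbuffers headers,
// which is where the include order matters.
std::unique_ptr<tflite::FlatBufferModel> LoadModel(const char* path) {
  return tflite::FlatBufferModel::BuildFromFile(path);
}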

Regards

 

5,256 Views
dnozik
Contributor II

In my app I only have these includes for TensorFlow Lite:

#include "tensorflow/lite/delegates/xnnpack/xnnpack_delegate.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/string_util.h"
#include "tensorflow/lite/tools/evaluation/utils.h"

There is no direct flatbuffers include; the TensorFlow Lite headers pull in flatbuffers from the default location, /usr/include.

 

 

5,267 Views
dnozik
Contributor II

Yes, thanks.

But why does the SDK ship TensorFlow Lite from TensorFlow 2.3.1 and ArmNN built against TensorFlow 1.15? A model cannot be executed on both inference engines.

Regards.
