ArmNN TensorFlow Lite vs TensorFlow Lite on i.MX 8M Plus

dnozik
Contributor II

Hi ,

1. I am working with an i.MX 8M Plus board (zeus-5.4.70-2.3.0).

2. I am testing a TensorFlow Lite model with both stock TensorFlow Lite and the ArmNN TensorFlow Lite parser.

3. These inference engines use TensorFlow Lite from different versions, which causes a flatbuffers incompatibility and a compilation error.

Is there a way to fix this? It affects the whole model training and execution pipeline.

Thanks

6 Replies
Bio_TICFSL
NXP TechSupport

Hello dnozik,

 

The TensorFlow Lite in the i.MX 8M Plus runs on the NPU, so it executes in hardware and is compatible with TensorFlow Lite version 2.1; if you are using another version, it may be incompatible.

Regards

 

dnozik
Contributor II

Hi, 

Any update on this? It is a critical issue for us.

Regards,

Dima

dnozik
Contributor II

Hi , 

I am compiling my code against your ArmNN and TensorFlow Lite builds, following https://source.codeaurora.org/external/imx/meta-imx/tree/meta-ml/recipes-libraries?h=zeus-5.4.70-2.3...

 

ArmNN uses flatbuffers version 1.11.

TensorFlow Lite uses flatbuffers version 1.12.

At the end of its build, each recipe copies its flatbuffers folder into /usr/include, so whichever one builds last determines the flatbuffers version installed there.

Because of this I get a compilation error, and for now I have to disable one of the inference engines just to build.

I need the option to use them both.

Dima
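The collision Dima describes can at least be diagnosed on the target. As a hedged sketch (the macro names come from upstream flatbuffers, which records its version in FLATBUFFERS_VERSION_* macros in flatbuffers/base.h; this is an assumption about the BSP's copy, not something stated in this thread), a quick grep shows which flatbuffers copy currently occupies an include directory:

```shell
# Hedged sketch: report which flatbuffers version sits under a given
# include root by reading the FLATBUFFERS_VERSION_* macros that upstream
# flatbuffers defines in flatbuffers/base.h.
flatbuffers_version() {
    hdr="$1/flatbuffers/base.h"
    if [ -f "$hdr" ]; then
        # Print the major/minor/revision defines found in the header.
        grep -E 'define FLATBUFFERS_VERSION_(MAJOR|MINOR|REVISION)' "$hdr"
    else
        echo "no flatbuffers headers under $1"
    fi
}

# On the board, check what the last-built recipe left in /usr/include.
flatbuffers_version /usr/include
```

Running this before and after rebuilding each recipe would confirm whether ArmNN's 1.11 or TensorFlow Lite's 1.12 headers won the install race.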

Bio_TICFSL
NXP TechSupport

Hello,

The flatbuffers for TFLite are located in /usr/include/tensorflow/lite/tools/make/downloads/flatbuffers/; you need to add this path to your include paths when you build the TFLite app.

The flatbuffers for ArmNN are located in /usr/include.
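Following those paths, a minimal sketch of the two compile lines might look like the following; the library names (-ltensorflow-lite, -larmnn, -larmnnTfLiteParser) and source file names are assumptions to illustrate the include ordering, so check them against your image:

```shell
# Hedged sketch: give each app its own flatbuffers headers.
# Paths follow the locations quoted above; library names are assumptions.
TFLITE_FB="/usr/include/tensorflow/lite/tools/make/downloads/flatbuffers"

# TFLite app: list the bundled flatbuffers (1.12) first so it shadows /usr/include.
tflite_cmd="g++ -I$TFLITE_FB -I/usr/include tflite_app.cc -ltensorflow-lite -o tflite_app"

# ArmNN app: /usr/include already holds the flatbuffers 1.11 headers ArmNN expects.
armnn_cmd="g++ -I/usr/include armnn_app.cc -larmnn -larmnnTfLiteParser -o armnn_app"

echo "$tflite_cmd"
echo "$armnn_cmd"
```

Printing the commands keeps the sketch safe to try off-target; on the board you would run the two g++ lines directly. The key point is the -I ordering: the compiler searches include directories left to right, so whichever flatbuffers path comes first wins for that app.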

Regards

 

dnozik
Contributor II

In my app I only have these includes for TensorFlow Lite:

#include "tensorflow/lite/delegates/xnnpack/xnnpack_delegate.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/string_util.h"
#include "tensorflow/lite/tools/evaluation/utils.h"

There is no direct flatbuffers include; the TensorFlow Lite headers pull flatbuffers in from the default location, /usr/include.

 

 

dnozik
Contributor II

Yes, thanks.

But why does the SDK ship TensorFlow Lite from TensorFlow 2.3.1 while ArmNN is built against TensorFlow 1.15? A model cannot be executed on both inference engines.

 

Regards.
