Hi,
1. I am working with an i.MX 8M Plus board (zeus-5.4.70-2.3.0).
2. I am testing a TensorFlow Lite model with both TensorFlow Lite and ArmNN's TensorFlow Lite support.
3. These inference engines use tensorflow-lite from different versions, which causes a flatbuffers incompatibility and a compilation error.
Is there a way to fix this? It affects our entire model training and execution pipeline.
Thanks
Hello dnozik,
TensorFlow Lite on the i.MX 8 runs on the NPU, i.e. in hardware, and is compatible with TensorFlow Lite version 2.1; if you use another version, it may be incompatible.
Regards
Hi,
Any update on this? It is quite a critical issue for us.
Regards,
Dima
Hi,
I am compiling my code against your ArmNN and TensorFlow Lite builds, according to https://source.codeaurora.org/external/imx/meta-imx/tree/meta-ml/recipes-libraries?h=zeus-5.4.70-2.3...
ArmNN uses flatbuffers version 1.11.
TensorFlow Lite uses flatbuffers version 1.12.
Both recipes copy their flatbuffers headers into /usr/include at the end of the build, so whichever recipe builds last determines which flatbuffers version ends up there.
As a result I get a compilation error, and I currently have to disable one of the inference engines in order to build.
I need to be able to use both of them.
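For reference, here is a quick check of which flatbuffers headers currently sit in /usr/include. This is only a minimal sketch; it assumes the FLATBUFFERS_VERSION_* macros from flatbuffers/base.h, which are present in recent 1.x releases:

// check_flatbuffers.cpp - reports the version of the flatbuffers headers
// found first on the include path (e.g. the copy installed in /usr/include).
#include <iostream>
#include "flatbuffers/base.h"

int main() {
#if defined(FLATBUFFERS_VERSION_MAJOR)
  std::cout << "flatbuffers headers: " << FLATBUFFERS_VERSION_MAJOR << "."
            << FLATBUFFERS_VERSION_MINOR << "." << FLATBUFFERS_VERSION_REVISION
            << std::endl;
#else
  std::cout << "flatbuffers headers found, but no version macros defined" << std::endl;
#endif
  return 0;
}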
Dima
Hello,
Flatbuffers for TFLite is located in /usr/include/tensorflow/lite/tools/make/downloads/flatbuffers/. You need to add this path to your include paths when you build your TFLite app.
Flatbuffers for ArmNN is located in /usr/include.
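As a rough sketch of that idea (the exact compiler flags, library names, and header layout depend on your toolchain and image, so treat everything besides the include paths quoted above as assumptions rather than the official build recipe), the TFLite-bundled flatbuffers path goes before /usr/include when compiling the TFLite part of the application:

// tflite_app.cpp - minimal TFLite model-load sketch built against the bundled flatbuffers.
// Assumed compile command (hypothetical; depending on the header layout the first -I
// may need an extra /include segment):
//   $CXX tflite_app.cpp -o tflite_app \
//       -I/usr/include/tensorflow/lite/tools/make/downloads/flatbuffers/ \
//       -I/usr/include \
//       -ltensorflow-lite -lpthread
#include <iostream>
#include <memory>
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main(int argc, char** argv) {
  if (argc < 2) {
    std::cerr << "usage: tflite_app <model.tflite>" << std::endl;
    return 1;
  }
  // FlatBufferModel uses the flatbuffers headers picked up via the -I order above.
  auto model = tflite::FlatBufferModel::BuildFromFile(argv[1]);
  if (!model) {
    std::cerr << "failed to load model" << std::endl;
    return 1;
  }
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  std::cout << (interpreter ? "interpreter created" : "interpreter creation failed")
            << std::endl;
  return 0;
}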
Regards
In my app I only have these includes for TensorFlow Lite:
#include "tensorflow/lite/delegates/xnnpack/xnnpack_delegate.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/string_util.h"
#include "tensorflow/lite/tools/evaluation/utils.h"
There is no direct flatbuffers include; the TensorFlow Lite headers pull in flatbuffers themselves, and by default that resolves to the copy in /usr/include.
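A quick way to verify which flatbuffers copy those headers actually resolve to with a given include order (again just a sketch, assuming the flatbuffers release in use defines the FLATBUFFERS_VERSION_* macros):

// which_flatbuffers.cpp - includes only a TFLite header and reports which
// flatbuffers version it transitively pulled in, if any.
#include <iostream>
#include "tensorflow/lite/model.h"  // expected to include flatbuffers internally

int main() {
#if defined(FLATBUFFERS_VERSION_MAJOR)
  std::cout << "TFLite headers resolved flatbuffers " << FLATBUFFERS_VERSION_MAJOR
            << "." << FLATBUFFERS_VERSION_MINOR << "." << FLATBUFFERS_VERSION_REVISION
            << std::endl;
#else
  std::cout << "flatbuffers version macros are not visible via the TFLite headers"
            << std::endl;
#endif
  return 0;
}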
Yes, thanks.
But why does the SDK ship TensorFlow Lite from TensorFlow 2.3.1 while ArmNN is built against TensorFlow 1.15? A model cannot be executed on both inference engines.
Regards.