I'm reopening an issue similar to https://community.nxp.com/t5/eIQ-Machine-Learning-Software/Compiling-large-models-fails/m-p/1134552#... , since no solution was provided there.
I'm facing a similar problem when compiling a large model (4,251,592 bytes).
Error:
cc1plus.exe: out of memory allocating 67112959 bytes
make: *** [source/model/subdir.mk:27: source/model/model.o] Error 1
make: *** Waiting for unfinished jobs....
How I encountered this error:
I created a TFLite model (model.tflite) with the TOCO converter, then converted the flatbuffer file to a C header using xxd with the following command:
xxd -i model.tflite > model_data.h
I included the header file in my code and compiled it.
@david_piskula & @lars1 , please let me know if you have a solution for this issue.
Thanks & Regards,
Ramson Jehu K