
Attached is the generated .dot file for the beginning of our model.
You can see that the input placeholder accepts floats and immediately quantizes them with parameters [S:0.065829024 O:2] before feeding them into the rest of the model. Do I need to quantize the inputs myself before writing them into the placeholder, or will that Quantize node take care of it for me?
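
For reference, my understanding of what those parameters imply is the usual scale/offset mapping q = round(x / S) + O into int8; a minimal sketch of that assumption (the helper name and the int8 target are mine, not from the generated code):

    #include <math.h>
    #include <stdint.h>

    /* Sketch of the scale/offset quantization I assume the Quantize node
     * performs; helper name and int8 range are assumptions. */
    static int8_t quantize_f32(float x, float scale, int32_t offset) {
      int32_t q = (int32_t)roundf(x / scale) + offset;
      if (q < -128) q = -128;   /* clip to the int8 range */
      if (q > 127)  q = 127;
      return (int8_t)q;
    }

    /* With S = 0.065829024 and O = 2:
     * quantize_f32(1.0f, 0.065829024f, 2) -> round(15.19) + 2 = 17 */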
If that node handles it, how do I feed floating-point numbers to the inference function? Do I just memcpy them in and let the bundle take care of the conversion?
When I do it this way,

memcpy(bundleInpAddr, test_input, sizeof(test_input));

where test_input is a small array of sample data, the inference function's output does not seem to change (it always returns the same probabilities) regardless of the input data.
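
For context, the call pattern I'm using looks roughly like the sketch below; the symbol names (constantWeight, mutableWeight, activations, the model() entry point, the input offset, the input shape) are placeholders standing in for whatever the generated bundle header defines in my build:

    #include <stdint.h>
    #include <string.h>

    /* Placeholder symbols; the real names and offsets come from the
     * generated bundle header and weights files in my setup. */
    extern uint8_t constantWeight[];
    extern uint8_t mutableWeight[];   /* inputs and outputs live here */
    extern uint8_t activations[];
    extern int model(uint8_t *constantWeight, uint8_t *mutableWeight,
                     uint8_t *activations);

    enum { MODEL_input_offset = 0 };       /* placeholder offset value */

    static float test_input[1 * 28 * 28];  /* sample input, shape assumed */

    void run_once(void) {
      uint8_t *bundleInpAddr = mutableWeight + MODEL_input_offset;

      /* Copy raw floats into the input placeholder's slot and run. */
      memcpy(bundleInpAddr, test_input, sizeof(test_input));
      model(constantWeight, mutableWeight, activations);
    }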
Please advise.