MODEL_RunInference in evkmimxrt1060_tensorflow_lite_kws example

nahan_trogn
Contributor III

Hi,

I am running the evkmimxrt1060_tensorflow_lite_kws example.

After the audio preprocessing, I see two functions being called to run inference and handle the output: MODEL_RunInference(); and MODEL_ProcessOutput(outputData, &outputDims, outputType, endTime - startTime);
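For context, this is roughly how that sequence fits together. The following is only a standalone mock-up I wrote to show the call order: the names MODEL_RunInference and MODEL_ProcessOutput and its argument list come from the example, while every type, buffer, and timer below is a stand-in of mine, not the SDK's real code.

    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    /* Standalone mock-up of the call sequence described above. Only the names
       MODEL_RunInference, MODEL_ProcessOutput and the argument list come from
       the example; everything defined here is a stand-in so the sketch compiles
       on its own. */

    typedef struct { int32_t data[4]; int32_t size; } tensor_dims_t;            /* assumed layout */
    typedef enum { kTensorType_UINT8, kTensorType_FLOAT32 } tensor_type_t;      /* assumed */

    static uint8_t s_outputData[12];   /* stand-in for the model's output tensor */

    static void MODEL_RunInference(void)
    {
        /* In the real example this invokes the TensorFlow Lite interpreter. */
    }

    static void MODEL_ProcessOutput(const uint8_t *data, const tensor_dims_t *dims,
                                    tensor_type_t type, int inferenceTime)
    {
        /* In the real example this scores the output tensor and prints the result. */
        (void)data; (void)dims; (void)type;
        printf("inference time: %d\n", inferenceTime);
    }

    int main(void)
    {
        tensor_dims_t outputDims = { { 1, 12, 0, 0 }, 2 };
        tensor_type_t outputType = kTensorType_UINT8;

        /* ... audio preprocessing would fill the model's input tensor here ... */

        clock_t startTime = clock();   /* the SDK example uses its own timer instead */
        MODEL_RunInference();
        clock_t endTime = clock();

        MODEL_ProcessOutput(s_outputData, &outputDims, outputType,
                            (int)(endTime - startTime));
        return 0;
    }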

But the example model uses softmax regression:

[image: nahan_trogn_0-1618472173929.png]

And my model uses logistic regression:

 

[image: nahan_trogn_1-1618472364781.png]

The question is: can I re-use the two functions above for my model? If not, could you suggest something I can use to run inference with my model?

3 Replies

nahan_trogn
Contributor III

I can use those two functions to predict the output. Thanks for your reply.

kerryzhou
NXP TechSupport

Hi nahan_trogn,

Glad to hear you can reuse them on your side.

If you still have questions about this case, just let me know.

For any new issues, you are welcome to create a new case.

Best Regards,

Kerry

kerryzhou
NXP TechSupport

Hi nahan_trogn,

Did you use the DeepView tool for the pictures you attached?

Regarding your logistic regression model: I think you can try to call those two functions after your model and test it on your side; the sketch below shows the usual difference in output post-processing.
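Whether they work as-is mainly depends on how the output tensor is interpreted. A softmax head gives one score per class and the post-processing normally picks the largest one, while a single logistic (sigmoid) output is usually compared against a threshold instead. Here is a minimal, self-contained illustration of that difference; these helpers are mine, not part of the eIQ example:

    #include <stdio.h>
    #include <stddef.h>

    /* Illustration only: typical post-processing for the two output types.
       These functions are not part of the eIQ example. */

    /* Softmax head: one score per class, pick the index of the largest score. */
    static size_t argmax(const float *scores, size_t count)
    {
        size_t best = 0;
        for (size_t i = 1; i < count; i++)
        {
            if (scores[i] > scores[best])
            {
                best = i;
            }
        }
        return best;
    }

    /* Logistic (sigmoid) head: a single probability, compare it to a threshold. */
    static int sigmoid_decision(float probability, float threshold)
    {
        return probability >= threshold;
    }

    int main(void)
    {
        const float softmaxScores[4] = { 0.05f, 0.80f, 0.10f, 0.05f };
        printf("softmax: predicted class %u\n", (unsigned)argmax(softmaxScores, 4));

        const float sigmoidScore = 0.91f;
        printf("sigmoid: positive? %s\n",
               sigmoid_decision(sigmoidScore, 0.5f) ? "yes" : "no");
        return 0;
    }

If MODEL_ProcessOutput in the example simply scans the output tensor for the highest-scoring element and maps it to a label, it should also run on a model with a single sigmoid output, though for a two-class decision you may prefer an explicit threshold check like the one above.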

If you meet any issues, just let me know.

Best Regards,

Kerry
