ML-based system State Monitor


khinhtethtetaung25
Contributor I

Hi all! I am using the Application Software Pack - ML-based System State Monitor on the LPCXpresso55S69 to classify different states of a fan (ON, OFF, CLOG, FRICTION). The prediction accuracy on the confusion matrix is over 95% for all states on the validation dataset. However, when I test the model on real-time data, the accuracy for the FAN-ON state is 0.00%. May I know the reason behind it?

Thank you so much!

#LPCXpresso55s69 #ml_state_monitor_cm33 https://community.nxp.com/t5/eIQ-Machine-Learning-Software/Application-Software-Pack-ML-based-System-State-Monitor/ta-p/1413290


anthony_huereca
NXP Employee

Hi,

  I want to make sure I understand correctly: you've collected data on your own fan, and during training you get good results, but when you deploy the custom model to your LPC55S69 you get incorrect results where it can't recognize the ON state? Does it always report back the same state, or does it vary between the other options? Which inference engine did you select for inferencing with your custom model?

  Make sure you've modified the #define SENSOR_FEED_VALIDATION_DATA in \source\sensor\sensor_collect.h so that the model is being fed the correct accelerometer data. Also make sure the fan is in the same position/mode as it was when you collected the training data for your custom model. Let me know if that helps get it working for you.
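The macro name comes from Anthony's reply; the value and the meaning sketched below are my assumptions about the pack's convention, so check sensor_collect.h in your copy of the software pack:

```c
/* sensor_collect.h (fragment, hypothetical values):
   0 - feed live accelerometer samples to the model (normal deployment)
   1 - replay the recorded validation dataset instead (sanity check) */
#define SENSOR_FEED_VALIDATION_DATA 0
```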

 

-Anthony 


khinhtethtetaung25
Contributor I

Hi @anthony_huereca

Thank you so much for your reply! 

Yes, you are right. I get good results on both the training dataset and the validation dataset during training in a Jupyter Notebook. However, when the model is deployed on the LPC55S69, the ON state is recognized as either the OFF state or the FRICTION state (mostly the OFF state). I selected the TensorFlow inference engine.

Looking forward to your reply. 

Regards, 

Khin
