The serial protocol I'm implementing has both a packet timeout and an inter-character timeout. Is there a way to implement these timeouts, like the ones Windows and Linux support natively, other than polling the driver for available characters?
Currently we block until the first character is received, then switch to non-blocking mode and poll for available characters. There has to be a better way!
I have thought about using two timer events to unblock the task that calls getc(). The first timer's callback would unblock the calling task when the packet timeout expires; the second would unblock the task at the end of each inter-character timeout. The inter-character timer would be cancelled and restarted on the reception of each character.
All that seems very cumbersome.
Does anyone have a better idea?