Port Data Input Register Delay - GPIO

ednasalazar
Contributor II

Hello everybody,

I am currently working with GPIO interrupts on a K53.

I measured a delay of 2-3 µs from when the external interrupt occurs until the interrupt status flag is set.

Can I find this timing information in the MK53 data sheet?


egoodii
Senior Contributor III

You didn't mention whether you have port filtering enabled on that I/O, or HOW you came to this particular 'delay' conclusion. But there is certainly an 'appreciable' interrupt overhead in hardware, and this assumes that your handler is directly addressed as the vector for that port. And if you are measuring this 'timing' via an output GPIO, be aware also of write-buffer delays in the implementation of THAT operation.


mjbcswitzerland
Specialist V

Hi

Since the inputs are detected by sampling at the bus rate (assuming no additional filtering), it is necessary to know what bus speed is being used. If the processor is running slowly, the port input detection delay will also be longer than when it is running quickly.

It takes maybe 50..100 instruction cycles to take the interrupt, detect its cause and signal on an output for measurement purposes, so expect anywhere from several hundred ns to several us, again depending on the system clock.

Generally, without specifying the system and bus clocks used for the measurement (and the expected interrupt and execution time involved), the results can be interpreted as either "gosh, that is slow" or "wow, that is fast".

When measuring and reporting results one needs to be as specific as possible.

Although possibly a typo, beware that there is no such thing as a uS delay, since uS (microsiemens) is a unit of electric conductance; use the correct unit (us) to avoid the question looking sloppy in various respects.

Essentially I agree that there is no specific definition in the manuals, but I expect one could show that the actual detection of an (unfiltered) edge will take 2 bus clock periods: the first to detect the state before the edge and the second to detect the state after it, with some set-up/hold-time uncertainty that may add a jitter of one additional bus clock period. On top of that come the following exception and software delays. Level detection will presumably behave the same, except when it is enabled with the level already valid (possibly down to 0 bus clocks).

Regards

Mark