Hi,
I am trying to measure the time interval between a data request and a data confirmation on a single transmitting device, but I am not 100% sure my timer code is correct, or even whether this interval can simply be measured by taking timestamps in the mcps-data.request and mcps-data.confirm handlers. Perhaps there are delays I am not aware of? I am using JN-AN-1174 as a base.
My timer code looks like this:
PRIVATE volatile uint32 tickCount;          /* incremented once per 60 ms timer period */

PRIVATE void vTimer0Callback(uint32 u32Device, uint32 u32ItemBitmap)
{
    /* Period interrupt: another 60000-tick (60 ms) interval has elapsed */
    tickCount++;
}

PUBLIC void vInitTimer(void)
{
    tickCount = 0;

    vAHI_TimerEnable(E_AHI_TIMER_0,
                     4,          /* prescale: 16 MHz / 2^4 = 1 MHz, so 1 us per tick */
                     FALSE,      /* no interrupt on rise */
                     TRUE,       /* interrupt at end of period */
                     FALSE);     /* no output on the timer pin */

    vAHI_TimerClockSelect(E_AHI_TIMER_0,
                          FALSE, /* internal peripheral clock, not an external source */
                          TRUE);

    vAHI_Timer0RegisterCallback(vTimer0Callback);

    vAHI_TimerStartRepeat(E_AHI_TIMER_0,
                          0,
                          60000); /* period = 60000 ticks = 60 ms */
}

PUBLIC uint32 vGetMicroSeconds(void)
{
    /* microseconds = count within the current period + 60 ms per completed period */
    return (u16AHI_TimerReadCount(E_AHI_TIMER_0) + (60000 * tickCount));
}
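One detail I am not sure about is a read right at the period boundary: if the period interrupt fires between reading the counter and reading tickCount, the two values are inconsistent and the result can jump by up to 60 ms. A minimal sketch of a guard against that (u32GetMicroSecondsSafe is just an illustrative name, not from the application note):

/* Sketch only: re-read tickCount until it is stable around the counter read,
 * so a period rollover between the two reads cannot skew the result. */
PUBLIC uint32 u32GetMicroSecondsSafe(void)
{
    uint32 u32Ticks;
    uint16 u16Count;

    do
    {
        u32Ticks = tickCount;
        u16Count = u16AHI_TimerReadCount(E_AHI_TIMER_0);
    } while (u32Ticks != tickCount);   /* period interrupt fired mid-read: try again */

    return (60000 * u32Ticks) + u16Count;
}

That said, a rollover glitch would show up as a jump of roughly 60 ms, not a few milliseconds.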
Inside PRIVATE void vTransmitDataPacket(uint8 *pu8Data, uint8 u8Len, uint16 u16DestAdr)
I get the current time:
DBG_vPrintf(TRUE,"%d MicroSec | Transmit Request with size %d \n",vGetMicroSeconds(),sMcpsReqRsp.uParam.sReqData.sFrame.u8SduLength);
vAppApiMcpsRequest(&sMcpsReqRsp, &sMcpsSyncCfm);
and inside PRIVATE void vHandleMcpsDataDcfm(MAC_McpsDcfmInd_s *psMcpsInd)
I get the current time again:
if (psMcpsInd->uParam.sDcfmData.u8Status == MAC_ENUM_SUCCESS)
{
    DBG_vPrintf(TRUE,"%d MicroSec | Data transmit Confirm \n",vGetMicroSeconds());
}
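I could also keep the request timestamp in a variable and compute the difference directly in the confirm handler, so the interval does not include the time DBG_vPrintf needs to push the request line out over the UART (if DBG_vPrintf blocks, that can be milliseconds depending on the baud rate). A rough sketch of what I mean (u32TxRequestTime and u32Delta are just illustrative names, not from the application note):

PRIVATE volatile uint32 u32TxRequestTime;   /* timestamp taken just before the request */

/* In vTransmitDataPacket(), immediately before the request: */
u32TxRequestTime = vGetMicroSeconds();
vAppApiMcpsRequest(&sMcpsReqRsp, &sMcpsSyncCfm);

/* In vHandleMcpsDataDcfm(), on success: */
if (psMcpsInd->uParam.sDcfmData.u8Status == MAC_ENUM_SUCCESS)
{
    uint32 u32Delta = vGetMicroSeconds() - u32TxRequestTime;
    DBG_vPrintf(TRUE, "Request -> Confirm took %d MicroSec\n", u32Delta);
}

With three requests queued back to back, a single variable only tracks the last outstanding request, so I would test this with one packet at a time first.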
Finally, I transmit 3 packets of 4 bytes:
vTransmitDataPacket(data,4,0x0000);
vTransmitDataPacket(data,4,0x0000);
vTransmitDataPacket(data,4,0x0000);
The result:
20726445 MicroSec | Transmit Request with size 4
20731540 MicroSec | Transmit Request with size 4
20736663 MicroSec | Transmit Request with size 4
20741786 MicroSec | Data transmit Confirm
20746197 MicroSec | Data transmit Confirm
20750579 MicroSec | Data transmit Confirm
Random backoff and retries are set to 0:
MAC_vPibSetMaxCsmaBackoffs(s_pvMac,0);
MAC_vPibSetMinBe(s_pvMac,0);
The time between the first request and the first confirm should be no more than 4256 microseconds (data-frame transmission time) + 128 microseconds (CCA) + 0 microseconds random backoff, with no ACK requested, i.e. about 4384 microseconds even for the maximum payload (114 bytes). Instead, my test with only 4 bytes of payload gives 15,341 microseconds (20741786 - 20726445).
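For reference, this is the arithmetic behind the 4256 figure, assuming the 250 kbit/s O-QPSK PHY (32 microseconds per byte) and 6 bytes of PHY synchronisation/header overhead; the helper name is only for illustration:

#define US_PER_BYTE        32   /* 250 kbit/s O-QPSK: 16 us per symbol, 2 symbols per byte */
#define PHY_OVERHEAD_LEN    6   /* 4-byte preamble + 1-byte SFD + 1-byte length field */

/* Rough on-air time in microseconds for a MAC frame (MPDU) of u8MpduLen bytes */
PUBLIC uint32 u32FrameAirTimeUs(uint8 u8MpduLen)
{
    return (uint32)(PHY_OVERHEAD_LEN + u8MpduLen) * US_PER_BYTE;
}

/* Max 127-byte MPDU (114-byte payload + MAC overhead): (6 + 127) * 32 = 4256 us */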
Could anybody point me in the right direction? Is it the timer code or something else?