I am using the DSPI drivers supplied with S32DS. The SPI is in Master mode, interfacing to an SPI-to-UART device at a bit rate of 10 Mbps. When I read a buffer of 38 bytes from the SPI-to-UART device using the interrupt-driven DSPI_MasterTransfer() function, it takes 392 us between calling the function and receiving the transfer-complete interrupt. At this bit rate, the time should be much less: clocking 38 bytes (304 bits) at 10 Mbps takes only 30.4 us. I don't understand where the rest of the time goes; my guess is that the whole transfer should take no more than 40 us. The MPC5744P is running at 200 MHz.
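To illustrate, here is a minimal sketch of what I'm measuring. timer_now_us() is just a placeholder for reading any free-running timer, xferDone is assumed to be set from the transfer-complete callback configured in S32DS, and I'm assuming the usual SDK prototype DSPI_MasterTransfer(instance, txBuffer, rxBuffer, frames):

#include <stdint.h>
#include <stdbool.h>
/* SDK headers generated by S32DS (Cpu.h etc.) omitted here */

static uint8_t txBuf[38];               /* dummy bytes clocked out              */
static uint8_t rxBuf[38];               /* data read from the SPI-to-UART       */
static volatile bool xferDone = false;  /* set by the transfer-complete callback */

void measure_transfer(void)
{
    uint32_t t0 = timer_now_us();       /* hypothetical us-timestamp helper     */
    DSPI_MasterTransfer(INST_DSPI1, txBuf, rxBuf, 38u); /* 38 frames of 8 bits  */
    while (!xferDone) { }               /* wait for the completion interrupt    */
    uint32_t elapsed = timer_now_us() - t0; /* ~392 us observed, ~40 us expected */
    (void)elapsed;
}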
Hi Pierre,
I did a quick test and I'm able to achieve a time < 100 us.
How did you configure DSPI_MasterSetDelay()? This can have a significant impact on the total time.
At this bit rate I had to set some delay between transfers, like this, otherwise I got a transfer-fail error:
DSPI_MasterSetDelay(INST_DSPI1, 1, 0, 0);
Using this configuration, I got about 94 us.
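That roughly adds up, if I assume 8-bit frames with CS toggling between them and the delay arguments in microseconds:

38 frames x 8 bits / 10 Mbps          = 30.4 us  (pure clocking)
38 frames x 1 us delay after transfer = 38.0 us
CS setup/hold delays + SW overhead    ~ 25   us
total                                 ~ 94   us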
If I use a slower bit rate, this configuration is not necessary and can be left in its default state. In that case, the time from calling the function to the callback equals the time needed to clock the data out back-to-back at that bit rate.
Regards,
Lukas
Hi Lukas
Yes, I did use the DSPI_MasterSetDelay() function. My implementation was DSPI_MasterSetDelay(INST_DSPI1, 1, 1, 8); when I used DSPI_MasterSetDelay(INST_DSPI1, 1, 0, 0); I could not read the device. According to the comments in the driver software, the first parameter is the delay between transfers, the second the delay between SCK and CS, and the third the delay between CS and SCK. But when I changed the last parameter from 8 to 3, the time dropped drastically from 392 us to 197 us. If I make that parameter 2, I cannot read the device. To me it looks like the last parameter is the delay between transfers.
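If I assume the delay arguments are in microseconds and are applied once per 8-bit frame rather than once per buffer, my numbers line up:

(8 us - 3 us) x 38 frames = 190 us
392 us - 190 us           = 202 us  (close to the 197 us I measured)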
Regards,
Pierre
Hi Pierre,
I'm sure the datasheet of your slave device provides an SPI timing specification. There are parameters like CS setup time, CS disable time and many others. You should make sure all of these specifications are met for reliable communication, so some fine-tuning of the mentioned delay parameters is needed. At this bit rate, the default configuration will not work, as you found. Any other configuration will then add some delay to the transfers, so you won't see only the pure time equal to the number of bits * 1/10M. Yes, there is some SW overhead caused by the drivers, but the needed delays are more significant than that overhead.
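Roughly, assuming CS toggles between frames, the total time is:

total ≈ frames x (bits_per_frame / bit_rate + tCSC + tASC + tDT) + SW overhead

where tCSC is the CS-to-SCK setup delay, tASC the SCK-to-CS hold delay and tDT the delay after transfer; I believe these correspond to the DSPI CTAR delay fields (CSSCK, ASC, DT) that the driver programs through DSPI_MasterSetDelay().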
The best way is to use an oscilloscope to compare the timing against the spec and to do the fine-tuning.
Regards,
Lukas