Dear Pablo Avalos,
Thank you for your response. I am tweaking the project `mimxrt685audevk_i2s_dma_ping_pong_buffer` to achieve the lowest possible digital loopback latency. To that end, I managed to get the CODEC running in quad speed mode at 192 kHz, and made the following changes to the source file i2s_dma_ping_pong_buffer.c:
```
/* Clock divider reduced for 192 kHz (quad speed) operation */
#define DEMO_I2S_CLOCK_DIVIDER 4
/* Ping-pong buffer size in bytes; 48 is the smallest value that works for me */
#define DEMO_BUFFER_SIZE 48
/* I2S TX/RX now run as slaves, clocked by the CODEC */
//#define DEMO_I2S_TX_MODE kI2S_MasterSlaveNormalMaster
//#define DEMO_I2S_RX_MODE kI2S_MasterSlaveNormalMaster
#define DEMO_I2S_TX_MODE kI2S_MasterSlaveNormalSlave
#define DEMO_I2S_RX_MODE kI2S_MasterSlaveNormalSlave
/* CS42448 configured as clock master in Quad Speed Mode (QSM) */
cs42448_config_t cs42448Config = {
    // .DACMode = kCS42448_ModeSlave,
    // .ADCMode = kCS42448_ModeSlave,
    .DACMode = kCS42448_ModeMasterQSM,
    .ADCMode = kCS42448_ModeMasterQSM,
    // .master = false,
    .master = true,
```
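In other words, the CS42448 now generates the bit and frame clocks in Quad Speed Mode, and the RT685 I2S blocks follow them as slaves.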
I found, however, that the buffer size does not work out as expected: decreasing it below 48 bytes gives bad output. My current latency measure is to send in white noise and find the frequency f_360 at which the phase has rotated a full 360 degrees; the delay in samples is then 192 kHz / f_360. At minimum I expect the CS42448 input group delay plus output group delay, which in quad speed mode are 5 + 2.5 = 7.5 samples, plus a minimal digital buffer of 2 samples, for a total of about 9.5 samples of latency, i.e. roughly 49 microseconds at 192 kHz.
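For reference, this is the arithmetic I use (a minimal self-contained sketch; the f_360 value below is a placeholder, not a real measurement):
```
#include <stdio.h>

int main(void)
{
    const double fs    = 192e3; /* sample rate [Hz] */
    const double f_360 = 20e3;  /* placeholder: frequency where the phase wraps 360 deg */

    /* Loopback delay derived from the phase measurement */
    double delay_samples = fs / f_360;

    /* Expected minimum: CS42448 QSM group delays (ADC 5 + DAC 2.5 samples)
       plus a minimal 2-sample digital buffer */
    double expected_min = 5.0 + 2.5 + 2.0;

    printf("measured delay: %.1f samples (%.1f us)\n",
           delay_samples, 1e6 * delay_samples / fs);
    printf("expected minimum: %.1f samples (%.1f us)\n",
           expected_min, 1e6 * expected_min / fs);
    return 0;
}
```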
With DEMO_BUFFER_SIZE at my current minimum of 48 bytes, which at 16 bit stereo corresponds to 12 samples per channel, I would expect a latency of at least 31 samples. Strangely, I measure only 19 samples. I do not understand how the buffer size influences the callback rate.
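To make my expectation explicit (a sketch; the ping-pong factor of two is my assumption about how the example buffers the data):
```
#include <stdio.h>

int main(void)
{
    const unsigned buffer_bytes     = 48; /* DEMO_BUFFER_SIZE */
    const unsigned channels         = 2;  /* stereo */
    const unsigned bytes_per_sample = 2;  /* 16 bit */

    unsigned samples_per_channel = buffer_bytes / (channels * bytes_per_sample); /* = 12 */

    /* Assumption: ping-pong buffering adds about two buffers of delay
       on top of the 7.5-sample codec group delay */
    double expected = 7.5 + 2.0 * samples_per_channel; /* = 31.5 samples */

    printf("%u samples/channel, expected latency >= %.1f samples\n",
           samples_per_channel, expected);
    return 0;
}
```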
It would be really helpful if you could send me an example where the callback runs every 1/192e3 seconds and, for example, scales the input by 0.5 and writes it back to the output. That would give us a head start with the DSP.
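To be concrete about what I mean, something of the following shape (a rough sketch only; the function name and buffer handling are placeholders, not the actual fsl_i2s_dma API):
```
#include <stdint.h>
#include <stddef.h>

/* Hypothetical per-buffer processing hook: called once per received
   buffer, ideally one stereo frame (2 samples) at 192 kHz. */
static void process_buffer(const int16_t *in, int16_t *out, size_t num_samples)
{
    for (size_t i = 0; i < num_samples; i++)
    {
        out[i] = (int16_t)(in[i] / 2); /* scale input by 0.5 */
    }
}
```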
With kind regards,
Anne de Jong