I have hacked the fsl_sai.c driver to support 8-slot TDM on SAI1 as bit clock and frame sync master, and successfully drove an AKM AK4458 eval board from the i.MX8MQ EVK. My next step is to configure the i.MX8MQ EVK as frame sync and bit clock slave, but after making the appropriate changes I get nothing. Writes to the PCM device in ALSA eventually error out with a nondescript "I/O error", and it appears that the SAI1 driver IRQ is never entered. I can see fsl_sai_trigger being called, and the BCD bit in TCR2 and the FSD bit in TCR4 appear to be cleared. I've tried both TX async / RX sync (RX synced to the TX clock) and TX async / RX async (TX should still use the TX clock).
Does the pinctrl configuration need to be explicitly changed, or is the input/output direction automatically changed by changing the TCR registers? If there's something about the pinctrl configuration that has to change, I either didn't find it or didn't understand it enough to change it properly.
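For reference, my iomuxc group is unchanged from the master-mode setup and looks like the usual EVK one; my assumption was that the mux function stays the same either way and the SAI block handles pad direction internally. The pad config values below are illustrative, not my exact ones:

```dts
pinctrl_sai1: sai1grp {
	fsl,pins = <
		MX8MQ_IOMUXC_SAI1_TXC_SAI1_TX_BCLK	0xd6
		MX8MQ_IOMUXC_SAI1_TXFS_SAI1_TX_SYNC	0xd6
		MX8MQ_IOMUXC_SAI1_TXD0_SAI1_TX_DATA0	0xd6
	>;
};
```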
BCLK and FS signals look fine on the EVK as supplied by the external hardware.
Is there something else that must be done in software or in the kernel configuration to change a working SAI1 clock master into a clock slave?
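For what it's worth, the dts-side change I'd expect for slave mode (a sketch assuming a simple-audio-card machine driver, which may not match my exact tree; the `ak4458` phandle is hypothetical) is to point the master properties at the codec DAI:

```dts
sound {
	compatible = "simple-audio-card";
	simple-audio-card,format = "dsp_a"; /* illustrative for TDM */
	/* codec supplies BCLK and FS; the SoC SAI is slave */
	simple-audio-card,bitclock-master = <&codec_dai>;
	simple-audio-card,frame-master = <&codec_dai>;

	simple-audio-card,cpu {
		sound-dai = <&sai1>;
	};

	codec_dai: simple-audio-card,codec {
		sound-dai = <&ak4458>;
	};
};
```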
There's something a bit odd about that trigger function, also. The original code:
* sets FIFO request DMA enable
* sets Transmitter enable (FSL_SAI_CSR_TERE)
* sets FSL_SAI_CSR_SE, which maps to bit 30. The documentation I have (Rev.0, 1/2018) shows this as "reserved"!
* sets TERE for the opposite side (TX/RX) if in sync mode
* sets some additional interrupt enable bits.
The ordering seems odd to me. Shouldn't everything be set up, including the interrupt enables, BEFORE enabling the transmitter? And what's up with the reserved bit? At first I thought it might be a typo for _SR, but if that were the case there'd have to be a clear to go along with the set, and I'd also think it would have to come first.
Any hints would be appreciated,