I'm trying to read bytes on UART1 sent from my PC to the FRDM-K66F. I can see the bytes transferring properly from the PC to the RX input pin on the K66F, but the bytes as read by the software - using the interrupt method instead of polling - are all garbage. The kUART_NoiseErrorFlag is set nearly all the time, too.
I notice that the RX pin is creating a waveform that is probably contributing to the problem. In image 1, you can see the periodic dips in the yellow wave. Those are being generated by the RX pin itself, because if I disconnect the RX pin from the 'scope, I get a clean signal line. See image 2.
I've tried filtering the line with various sized capacitors, but the smaller ones don't do anything, and the larger ones just filter out all mark/space changes.
Why is the RX pin generating this waveform, and how do I read the input bytes on the µcontroller without the noise?
I think I figured it out: it seems like a clock issue. I added a line of code to repeatedly send the character 'a' to the PC from the K66, and the PC couldn't properly read it, and the oscilloscope image looked a little weird, too: stretched out. Looking at the PC (yellow) wave and the K66 (purple) wave shows that their timing is not identical, though it's supposed to be an 'a' in both cases. I adjusted the clock settings in the Clock perspective and ran my tests again: I'm getting the characters I expect. So, thanks to everyone for the assistance.
Additional info: I added UART2 for testing, thinking maybe there was something physically wrong with UART1, and it behaves the same way, with the bit patterns as mentioned, above, on the UART2 pins. (Only RX is used, at the moment.) So it doesn't seem to be a physical problem with the FRDM-K66, but a configuration thing.
I figured out the oscillating waveform, but I'm still getting garbage input from UART_ReadByte or UART_ReadBlocking. Here's the pattern I've observed; it's consistent across device resets:
sent char (octal - binary) : value read (decimal - binary)
'a' 141 - 0110 0001 : 216 - 1101 1000
'b' 142 - 0110 0010 : 249 - 1111 1001
'c' 143 - 0110 0011 : 249 - 1111 1001
'd' 144 - 0110 0100 : 218 - 1101 1010
'A' 101 - 0100 0001 : 208 - 1101 0000
For example, when I press 'a' on the keyboard, I see an 'a' on the 'scope screen, but reading the byte in the code returns decimal 216 instead of octal 141 (decimal 97, ASCII 'a'). And so on with the other values. 'b' and 'c' are odd in that they both come back as 249, rather than as distinct values.
The UART and serial emulator on the PC are both configured for 9600/n/8/1. Again, the values I listed are consistent across restarts, but do show up correctly on the oscilloscope.
Just my 1 cent: capacitors are not the solution, imho. I think you need to think about pull resistors instead. This all depends on what your PC side is doing. I had similar issues in the past; see https://mcuoneclipse.com/2014/03/30/getting-bluetooth-working-with-jy-mcu-bt_board-v1-06/
In the above case, the easy solution was to enable the MCU internal pull-ups on the RX side of the MCU.
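For reference, enabling the internal pull-up with the Kinetis SDK port driver looks roughly like this. This is only a sketch: it assumes UART1_RX is muxed onto PTC3 via ALT3, so check the pinout and pin name for your actual wiring before copying it.

```c
#include "fsl_port.h"   /* Kinetis SDK PORT driver */

/* Sketch: mux the UART1 RX pin (assumed here to be PTC3, ALT3) and enable
 * the internal pull-up so the line idles high instead of floating. */
void enable_uart1_rx_pullup(void)
{
    const port_pin_config_t rxPin = {
        .pullSelect          = kPORT_PullUp,            /* the key setting */
        .slewRate            = kPORT_FastSlewRate,
        .passiveFilterEnable = kPORT_PassiveFilterDisable,
        .openDrainEnable     = kPORT_OpenDrainDisable,
        .driveStrength       = kPORT_LowDriveStrength,
        .mux                 = kPORT_MuxAlt3,           /* UART1_RX on PTC3 */
        .lockRegister        = kPORT_UnlockRegister,
    };
    CLOCK_EnableClock(kCLOCK_PortC);    /* gate the PORTC clock first */
    PORT_SetPinConfig(PORTC, 3u, &rxPin);
}
```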
I hope this helps,
Erich
Thanks, Erich! Actually, it turned out to be even more mundane, though I did configure the pin for internal pull-up based on your suggestion. After posting last night, I reset the test device, and the RX line was smooth. It must've been some stray oscillation noise I'd picked up during previous testing. But the problem is still there; see my other comment in this thread.