ISP auto baud rate detection fails with 115200 baud rate

Discussion created by lpcware Employee on Jun 15, 2016
Latest reply on Jun 18, 2016 by lpcware
Content originally posted in LPCWare by olltsu on Tue Mar 08 08:56:48 MST 2016
When I powered on a brand-new LPC1785 with no application flashed and started sending a '?' character from minicom at 115200 baud, the controller recognized it and sent "Synchronized" back.
The problem is that the controller's baud rate is not exactly 115200, so the data gets corrupted in transit. Looking at the signal with an oscilloscope, the individual bits take ~9.2 us, whereas at 115200 baud they should take ~8.7 us (1 / 115200).
The problem appears only at 115200 baud; all lower baud rates work fine. However, I would like to use as fast a baud rate as possible to speed up flashing.
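For reference, the measured bit time can be converted back into an effective baud rate to see how far off the clock is. This is just a back-of-the-envelope sketch; the function names are mine, not from any NXP tool:

```python
# Convert measured UART bit times into baud rates and compute the
# frequency error relative to the nominal 115200 baud.

NOMINAL_BAUD = 115200

def baud_from_bit_time(bit_time_us):
    """Baud rate implied by a measured bit duration in microseconds."""
    return 1e6 / bit_time_us

def baud_error_percent(measured_baud, nominal_baud=NOMINAL_BAUD):
    """Signed frequency error of the measured rate vs. the nominal rate."""
    return (measured_baud - nominal_baud) / nominal_baud * 100.0

measured = baud_from_bit_time(9.2)  # the ~9.2 us bits seen on the scope
print(f"measured: {measured:.0f} baud, "
      f"error: {baud_error_percent(measured):+.1f}%")
# ~108,700 baud, roughly -5.6% off nominal
```

A mismatch of roughly 5.6% is well outside what asynchronous UARTs typically tolerate (a commonly cited rule of thumb is that the combined transmit/receive error should stay under about 2-3%), which would explain the corrupted data at 115200 while slower rates, where the same absolute timing error is a smaller fraction of the bit period, still work.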

I tested this manually with minicom, but lpc21isp cannot synchronize with the controller either, most probably because of the same problem.

What could be the problem? I'd like to stress that this happened with a brand-new LPC1785 without any of my own software flashed into it. I come from LPC11xx controllers and have never had such problems with them.

Could this problem be unique to this single controller and caused by bad IRC calibration, or am I doing something wrong?