No, as I initially explained: if I configure MSTTIME to 0x33, I should get 5 clock ticks low and 5 clock ticks high according to the manual, but instead I get 6 clock ticks low and 5 clock ticks high. That's exactly what my scope shot shows: the period is too long by one 4 MHz clock tick, and the duty cycle is not symmetrical.
Now if I configure MSTSCLLOW to 2 instead of 3 and leave MSTSCLHIGH at 3, I should get 4 clock ticks low and 5 clock ticks high, but I get 5 clock ticks low and 5 clock ticks high, resulting in exactly 400 kHz.
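For reference, here's a minimal sketch of what I'm doing. It assumes the register layout from the LPC15xx user manual (MSTSCLLOW in bits 2:0, MSTSCLHIGH in bits 6:4, each field encoding "value + 2" clock ticks), a 4 MHz I2C function clock, and the LPC_I2C0 instance from the vendor header; i2c_set_scl and i2c_timing_test are just hypothetical helpers for illustration:

```c
#include <stdint.h>
#include "LPC15xx.h"   /* vendor header; name may differ in your setup */

/* Program the SCL low/high fields of MSTTIME.
 * Per the manual: SCL low  = (MSTSCLLOW  + 2) function clock ticks,
 *                 SCL high = (MSTSCLHIGH + 2) function clock ticks. */
static void i2c_set_scl(uint32_t low_field, uint32_t high_field)
{
    LPC_I2C0->MSTTIME = ((low_field  & 0x7) << 0)   /* MSTSCLLOW,  bits 2:0 */
                      | ((high_field & 0x7) << 4);  /* MSTSCLHIGH, bits 6:4 */
}

void i2c_timing_test(void)
{
    /* Manual says: 5 ticks low + 5 ticks high = 400 kHz at 4 MHz. */
    i2c_set_scl(3, 3);   /* MSTTIME = 0x33; measured: 6 low + 5 high */

    /* Workaround: shorten the low field by one tick. */
    i2c_set_scl(2, 3);   /* manual: 4 low + 5 high; measured: 5 + 5 = 400 kHz */
}
```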
So that guaranteed 1.3 µs minimum has no influence whatsoever on my observation. And again: I wasn't even aiming for a 50% DC. My initial duty cycle at 400 kHz was (and still is) 36%, although the limited resolution of the low/high time doesn't really allow that on the 15xx (more like 40%). As explained before, I only used the 50% setting for troubleshooting and to document the different behavior of t_low and t_high with the same configuration.
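(To spell out the resolution issue: at a 4 MHz function clock one tick is 250 ns, and a 400 kHz period is 10 ticks. A 36% phase would need 3.6 ticks, which isn't representable; the nearest achievable value is 4 ticks, i.e. 40%.)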
Anyway, even if this guaranteed minimum low time actually extended the low period (which, as far as I recall, wasn't the case in my measurements), it would only change the DC, not the frequency. Well, unless you suggest that some physical slew-rate behavior of the pad would somehow influence the internal timing of the I2C peripheral. So this is where the whole data sheet idea crumbles to dust. IMHO the only sensible explanation for a wrong frequency (and let's note again that it's off by exactly one I2C peripheral clock tick) is that the manual doesn't match the I2C implementation on the 15xx.
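If anyone wants to sanity-check the numbers, here's the comparison in code form. This is only a sketch of my argument: f_manual is the formula as the manual describes it, while the "+1" in f_measured is my empirical observation on the 15xx, not anything documented:

```c
#include <stdio.h>
#include <stdint.h>

/* SCL frequency as the manual describes it: period = low + high ticks. */
static double f_manual(uint32_t clk_hz, uint32_t low, uint32_t high)
{
    return (double)clk_hz / (double)(low + high);
}

/* SCL frequency as I actually measure it on the 15xx: one extra
 * peripheral clock tick on the low phase (empirical observation). */
static double f_measured(uint32_t clk_hz, uint32_t low, uint32_t high)
{
    return (double)clk_hz / (double)(low + 1 + high);
}

int main(void)
{
    /* MSTTIME = 0x33 -> 5 ticks low, 5 ticks high per the manual */
    printf("manual:   %.0f Hz\n", f_manual(4000000, 5, 5));    /* 400000 */
    printf("measured: %.0f Hz\n", f_measured(4000000, 5, 5));  /* 363636 */
    return 0;
}
```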