Hello BP,
I think there are a few points that need to be considered.
You are using a crystal type originally intended for wrist watches, which, when worn, sit in a fairly constant-temperature environment. The frequency drift with temperature change can be significantly higher for this type of low-frequency crystal than for a higher-frequency one. I presume your product may be subject to considerably wider temperature variation, in which case you could be talking multiples of 10 ppm.
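For a sense of scale: 32.768 kHz tuning-fork crystals typically follow a parabolic temperature curve, with a coefficient in the region of -0.034 ppm/degC squared about a turnover point near 25 degC. Those are typical catalogue values, not figures from your part, so treat this as a sketch of the arithmetic only:

```c
#include <stdio.h>

/* Typical tuning-fork crystal temperature characteristic (assumed
 * typical values, not from any particular data sheet):
 *   drift_ppm = k * (T - T0)^2,  k ~ -0.034 ppm/degC^2,  T0 ~ 25 degC
 */
static double drift_ppm(double temp_c)
{
    const double k  = -0.034;  /* ppm per degC squared, typical */
    const double t0 = 25.0;    /* turnover temperature, typical */
    double d = temp_c - t0;
    return k * d * d;
}

int main(void)
{
    printf("At  0 degC: %.1f ppm\n", drift_ppm(0.0));   /* about -21 ppm */
    printf("At 50 degC: %.1f ppm\n", drift_ppm(50.0));  /* about -21 ppm */
    return 0;
}
```

So a swing from room temperature to either end of a modest industrial range easily produces the "multiples of 10 ppm" mentioned above.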
The crystal frequency will also drift with time, due to aging. Check the data sheet for the crystal you are using, but I suspect the figure is unlikely to be less than 1 ppm for the first 12 months.
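To put these ppm figures into clock terms, the conversion is simple arithmetic (nothing here is specific to your part):

```c
#include <stdio.h>

/* Convert a frequency error in ppm to clock error in seconds. */
static double error_seconds(double ppm, double interval_s)
{
    return ppm * 1e-6 * interval_s;
}

int main(void)
{
    /* 1 ppm is roughly 2.6 s/month, or about 32 s/year. */
    printf("1 ppm per day:   %.3f s\n", error_seconds(1.0, 86400.0));
    printf("1 ppm per month: %.2f s\n", error_seconds(1.0, 86400.0 * 30));
    printf("1 ppm per year:  %.1f s\n", error_seconds(1.0, 86400.0 * 365));
    return 0;
}
```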
Jitter of the oscillator output will have little effect on a "normal" RTC application, where the resolution is one second; it is the average frequency that needs to be measured. When you measure the frequency, be aware of the averaging period of the measurement setup: choose a period in the region of one second for any fine adjustment; for coarse adjustment the measurement period can be shorter.
For a factory adjustment procedure, there is probably little point in attempting to adjust finer than about 0.5-1.0 ppm. With your suggested figure of 2 ppm, and a measured jitter of approximately 0.2 ppm, you should easily be able to adjust to this accuracy. Just ignore any variation in the least significant digit of the counter reading.
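The adjustment arithmetic is just the ratio of measured to nominal frequency. A sketch, taking the counter's averaged frequency display as the input (the example reading is made up):

```c
#include <math.h>
#include <stdio.h>

#define F_NOMINAL_HZ 32768.0

/* Frequency offset in ppm from the counter's averaged frequency reading. */
static double offset_ppm(double f_meas_hz)
{
    return (f_meas_hz - F_NOMINAL_HZ) / F_NOMINAL_HZ * 1e6;
}

int main(void)
{
    /* Example reading: 32768.100 Hz -> +3.05 ppm, outside the 2 ppm
     * window (2 ppm is only +/-0.066 Hz at 32.768 kHz). */
    double ppm = offset_ppm(32768.100);
    printf("offset = %+.2f ppm -> %s\n", ppm,
           fabs(ppm) <= 2.0 ? "PASS" : "ADJUST");
    return 0;
}
```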
The change of frequency (13 ppm) is the sort of figure to be expected from the capacitive loading of the oscilloscope probe. If you want to measure the crystal frequency directly, you would need the hardware buffer approach, as previously described.
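For what it's worth, your 13 ppm is consistent with the standard crystal-pulling formula, using typical tuning-fork values (motional capacitance ~3 fF, shunt ~1.5 pF, two 25 pF load caps, a ~10 pF x10 probe on one leg; all assumed, not taken from your parts):

```c
#include <stdio.h>

int main(void)
{
    /* Typical tuning-fork crystal values (assumed, check your data sheet) */
    const double Cm = 3e-15;    /* motional capacitance, ~3 fF */
    const double C0 = 1.5e-12;  /* shunt capacitance, ~1.5 pF  */
    const double C1 = 25e-12;   /* Pierce load capacitors      */
    const double C2 = 25e-12;
    const double Cp = 10e-12;   /* x10 scope probe, ~10 pF     */

    /* Effective load capacitance, without and with the probe on one leg */
    double cl1 = C1 * C2 / (C1 + C2);
    double cl2 = C1 * (C2 + Cp) / (C1 + C2 + Cp);

    /* Standard pulling formula: df/f = (Cm/2)*(1/(C0+CLnew) - 1/(C0+CLold)) */
    double ppm = (Cm / 2.0)
               * (1.0 / (C0 + cl2) - 1.0 / (C0 + cl1)) * 1e6;

    /* Prints about -14: the probe pulls the frequency down by roughly
     * 14 ppm, in line with the 13 ppm observed. */
    printf("probe shifts the frequency by about %.0f ppm\n", ppm);
    return 0;
}
```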
If you don't want to add the hardware, you will need special firmware specifically for the measurement. One possibility is to commence output of the calibration signal (the code you have posted would be suitable) whenever a special frequency-calibrate mode is initiated; this would then continue until the next reset, when normal operation would resume.
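A minimal sketch of the shape that mode could take. I haven't seen your hardware, so the helper functions below are hypothetical stand-ins: the mode request could be a test jumper read at power-up, and the output routine would be your posted code.

```c
/* Hypothetical hardware hooks; replace with your real pin access. */
static int  calibrate_pin_asserted(void) { return 0; /* stub */ }
static void route_osc_to_output(void)    { /* your posted output code */ }

/* Call early in main(): if the test input is asserted at power-up,
 * output the calibration signal and spin until the next reset. */
void check_calibrate_mode(void)
{
    if (!calibrate_pin_asserted())
        return;                 /* normal start-up continues */

    route_osc_to_output();      /* calibration signal on a test pin */

    for (;;)
        ;                       /* hold here; a reset (or power cycle)
                                   returns to normal operation */
}
```

Because the mode only exits via reset, there is no risk of the calibration output interfering with normal timekeeping in the field.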
Regards,
Mac