The problem isn't the PIT timer: it delivers an interrupt or DMA request with very consistent timing, and it is far more accurate than your requirements.
Set up the PIT to trigger an ADC conversion at a fixed interval and let it free-run. Have the PIT interrupt routine count the ADC conversions it has triggered; then you just need to keep the count and the ADC reading together. If you need a 1 ms sample time, trigger the ADC every millisecond, and every count of an ADC trigger is then 1 ms. A 32-bit counter of ADC conversions at 1 ms resolution rolls over about every 49.7 days (2^32 ms).
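A minimal sketch of that bookkeeping (untested; the MK64F12.h header, the PIT0_IRQHandler vector name, and adc_read_result() are assumptions for a K64-class part, not from your project):

```c
#include <stdint.h>
#include "MK64F12.h"    /* assumption: K64 device header; adjust for your part */

typedef struct {
    uint32_t tick;      /* PIT trigger count; 1 count == 1 ms here */
    uint16_t value;     /* ADC result that belongs to that tick */
} sample_t;

#define LOG_LEN 1024u
static volatile sample_t samples[LOG_LEN];
static volatile uint32_t tick;          /* wraps after 2^32 ms, ~49.7 days */

extern uint16_t adc_read_result(void);  /* placeholder: your result fetch */

void PIT0_IRQHandler(void)
{
    PIT->CHANNEL[0].TFLG = PIT_TFLG_TIF_MASK;  /* clear the PIT flag */
    uint32_t t = tick++;
    /* The conversion just triggered isn't done yet, so this reads the
     * most recently completed result -- a fixed one-sample offset. */
    samples[t % LOG_LEN].tick  = t;
    samples[t % LOG_LEN].value = adc_read_result();
}
```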
Yes, there will be some jitter between the time the ADC is triggered and when the result is available, but this should be well below your timing resolution.
This works best if you use the PIT to drive the PDB, which then triggers the ADC, and have the DMA move the ADC results into an array. Using the DMA reduces the load on the CPU. Take a look at app note AN4590 for how to use the DMA with ADC conversions.
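Roughly, the chain looks like this (a sketch only, with K64 register names; PDB_TRGSEL_PIT0 and DMAMUX_SRC_ADC0 are placeholders you must look up in your part's reference manual, and clock setup and pin muxing are omitted):

```c
#include "MK64F12.h"

#define PDB_TRGSEL_PIT0  4u   /* placeholder: see the PDB input-trigger table */
#define DMAMUX_SRC_ADC0  40u  /* placeholder: see the DMAMUX request sources */

#define SAMPLES 1024u
static volatile uint16_t adc_buf[SAMPLES];

void sampling_init(uint32_t bus_clk_hz, uint8_t adc_channel)
{
    SIM->SCGC6 |= SIM_SCGC6_PIT_MASK | SIM_SCGC6_PDB_MASK |
                  SIM_SCGC6_ADC0_MASK | SIM_SCGC6_DMAMUX_MASK;
    SIM->SCGC7 |= SIM_SCGC7_DMA_MASK;

    /* PIT channel 0: free-running 1 ms period, no interrupt needed. */
    PIT->MCR = 0;
    PIT->CHANNEL[0].LDVAL = (bus_clk_hz / 1000u) - 1u;
    PIT->CHANNEL[0].TCTRL = PIT_TCTRL_TEN_MASK;

    /* ADC0: hardware-triggered, raise a DMA request on conversion complete. */
    ADC0->SC1[0] = ADC_SC1_ADCH(adc_channel);
    ADC0->SC2  |= ADC_SC2_ADTRG_MASK | ADC_SC2_DMAEN_MASK;

    /* PDB0: triggered by the PIT, pre-trigger 0 fires ADC0 SC1[0]. */
    PDB0->SC = PDB_SC_PDBEN_MASK | PDB_SC_TRGSEL(PDB_TRGSEL_PIT0);
    PDB0->CH[0].C1 = PDB_C1_EN(1u) | PDB_C1_TOS(1u);
    PDB0->CH[0].DLY[0] = 0;
    PDB0->MOD = 1;
    PDB0->SC |= PDB_SC_LDOK_MASK;            /* latch MOD/DLY */

    /* DMA channel 0: one 16-bit result from ADC0->R[0] per request,
     * destination wraps after SAMPLES transfers. */
    DMAMUX->CHCFG[0] = 0;
    DMA0->TCD[0].SADDR         = (uint32_t)&ADC0->R[0];
    DMA0->TCD[0].SOFF          = 0;
    DMA0->TCD[0].ATTR          = DMA_ATTR_SSIZE(1) | DMA_ATTR_DSIZE(1);
    DMA0->TCD[0].NBYTES_MLNO   = 2;
    DMA0->TCD[0].SLAST         = 0;
    DMA0->TCD[0].DADDR         = (uint32_t)&adc_buf[0];
    DMA0->TCD[0].DOFF          = 2;
    DMA0->TCD[0].CITER_ELINKNO = SAMPLES;
    DMA0->TCD[0].BITER_ELINKNO = SAMPLES;
    DMA0->TCD[0].DLAST_SGA     = (uint32_t)-(int32_t)sizeof(adc_buf);
    DMA0->TCD[0].CSR           = 0;
    DMAMUX->CHCFG[0] = DMAMUX_CHCFG_ENBL_MASK |
                       DMAMUX_CHCFG_SOURCE(DMAMUX_SRC_ADC0);
    DMA0->SERQ = DMA_SERQ_SERQ(0);           /* enable channel 0 requests */
}
```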
We use the PIT to trigger the PDB, which triggers the first ADC reading in a list of ADC conversions, and we use two DMA channels to cycle through the rest of the set. So one PIT trigger produces several ADC channel conversions at the triggered time.
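The linking part of that, very roughly (again Kinetis register names, building on the sketch above; how the command list wraps and how the first conversion of each scan gets started are the fiddly parts, so follow AN4590 closely there):

```c
/* Channel 1 ("command" channel): feeds the ADC the next channel number
 * of the scan, which starts the next conversion. It needs no DMAMUX
 * routing or ERQ -- the link from channel 0 starts it. */
#define NUM_CH 4u
static uint32_t adc_cmds[NUM_CH] = {   /* SC1A values for the scan list */
    ADC_SC1_ADCH(8), ADC_SC1_ADCH(9), ADC_SC1_ADCH(12), ADC_SC1_ADCH(13),
};
static volatile uint16_t results[NUM_CH];

DMA0->TCD[1].SADDR         = (uint32_t)&adc_cmds[0];
DMA0->TCD[1].SOFF          = 4;
DMA0->TCD[1].ATTR          = DMA_ATTR_SSIZE(2) | DMA_ATTR_DSIZE(2); /* 32-bit */
DMA0->TCD[1].NBYTES_MLNO   = 4;
DMA0->TCD[1].SLAST         = (uint32_t)-(int32_t)sizeof(adc_cmds);  /* rewind */
DMA0->TCD[1].DADDR         = (uint32_t)&ADC0->SC1[0];
DMA0->TCD[1].DOFF          = 0;
DMA0->TCD[1].CITER_ELINKNO = NUM_CH;
DMA0->TCD[1].BITER_ELINKNO = NUM_CH;
DMA0->TCD[1].DLAST_SGA     = 0;
DMA0->TCD[1].CSR           = 0;

/* Re-point channel 0 (the result mover) at the per-scan results array
 * and minor-link it to channel 1 after each result it moves. */
DMA0->TCD[0].DADDR          = (uint32_t)&results[0];
DMA0->TCD[0].DLAST_SGA      = (uint32_t)-(int32_t)sizeof(results);
DMA0->TCD[0].CITER_ELINKYES = DMA_CITER_ELINKYES_ELINK_MASK |
                              DMA_CITER_ELINKYES_LINKCH(1) |
                              DMA_CITER_ELINKYES_CITER(NUM_CH);
DMA0->TCD[0].BITER_ELINKYES = DMA_BITER_ELINKYES_ELINK_MASK |
                              DMA_BITER_ELINKYES_LINKCH(1) |
                              DMA_BITER_ELINKYES_BITER(NUM_CH);
```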
This method should let you get sample periods well below 1 millisecond.