Hi
The TICK, defined in ms, is the period of the hardware timer interrupt that is used to drive the software timers.
A value of 50ms is typically used since it gives software timer resolution on a 50ms raster, which is adequate for many projects where software delays are not very short and the resolution of the shortest delay (50..100ms) is acceptable.
If the software only requires longer delays the value could be increased, but if the software requires shorter delays, with higher resolution, it can also be reduced - down to 1. A value of 1 means that software delays are defined in ms with a resolution of +0/+1ms, so the shortest SW delay would be 1ms (with an accuracy of 1..2ms, assuming the code that started the timer was not synchronised to the TICK; if the code is synchronised to the TICK - eg. a software timer that fires is used to start another SW timer delay - the accuracy is correspondingly better). A minimal sketch of the mechanism follows.
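To illustrate this, here is a minimal sketch of a TICK-driven software timer - this is not the project's actual timer code, and TICK_MS, tick_isr(), sw_timer_start() and sw_timer_expired() are names invented just for the example:

#include <stdint.h>
#include <stdbool.h>

#define TICK_MS  50                          /* TICK period in ms (eg. 50, or 1 for finer resolution) */

static volatile uint32_t tick_count = 0;     /* incremented by the HW timer interrupt */

void tick_isr(void)                          /* called on each TICK interrupt */
{
    tick_count++;
}

/* Convert a delay in ms to an absolute timeout in TICK counts.
** Rounding up, plus one extra TICK, guarantees that the delay is never
** shorter than requested when the caller is not synchronised to the TICK */
uint32_t sw_timer_start(uint32_t delay_ms)
{
    return (tick_count + ((delay_ms + (TICK_MS - 1)) / TICK_MS) + 1);
}

bool sw_timer_expired(uint32_t timeout)
{
    return ((int32_t)(tick_count - timeout) >= 0); /* wrap-safe comparison */
}

With TICK_MS at 50 a requested 60ms delay is rounded up to whole TICKs and fires 100..150ms after the call; with TICK_MS at 1 the same request fires after 60..61ms. The extra TICK covers the unknown phase between the call and the next interrupt, which is why code that is itself synchronised to the TICK sees better accuracy.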
The trade-off is that there is an interrupt at the TICK rate - higher accuracy requires a faster interrupt rate, which means a little more CPU loading and also more wake-ups from low power mode (if LP mode is used).
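As a rough feel for the loading (illustrative numbers only): if servicing the TICK interrupt takes 5µs, a 50ms TICK costs 0.01% of the CPU whereas a 1ms TICK costs 0.5% - still small, but with 50 times as many wake-ups from low power mode.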
The software timers are based on a 32 bit value, which means that the maximum delay reduces as the TICK period reduces: about 4.2 million seconds is the maximum at a 1ms TICK, and longer delays are possible with larger TICK values - although this is rarely something that needs to be considered.
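The arithmetic behind that figure: a 32 bit counter holds 2^32 = 4 294 967 296 TICKs, so at a 1ms TICK the limit is about 4.29 million seconds (roughly 49.7 days); at the default 50ms TICK it is 50 times longer, around 6.8 years.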
A PIT timer is used as the TICK source for the Coldfire V2 project and the comment in the code is just stating that it would be possible to have a TICK of up to 70000ms with this timer, due to the fact that it is a 32 bit timer. In comparison, some 16 bit HW timers will limit the maximum TICK to just several hundred ms at high clock rates. It is therefore just stating that the PIT source is a very good one, without any practical upper limit in this case.
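To show where a figure like 70000ms comes from (assuming, purely for illustration, a 60MHz timer clock - the real limit depends on the derivative and its clock configuration): a 32 bit counter overflows after 2^32 / 60 000 000 ≈ 71.6s, ie. a maximum TICK of roughly 70000ms. A 16 bit counter at the same clock overflows after only about 1ms and has to rely on its prescaler to reach even a few hundred ms, which is the limitation mentioned above.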
Regards
Mark