Author Topic: Software timer for 100-200ms intervals?  (Read 11030 times)

Offline akorud

  • Newbie
  • *
  • Posts: 31
    • View Profile
Software timer for 100-200ms intervals?
« on: September 21, 2008, 11:04:32 AM »
Hi,
I've just noticed that
Code: [Select]
uTaskerGlobalMonoTimer( (UTASK_TASK)(OWN_TASK | HARDWARE_TIMER), (DELAY_LIMIT)(200 * MILLISEC), E_TIMER_NODE_TIMEOUT );
works fine, but
Code: [Select]
uTaskerGlobalMonoTimer( (UTASK_TASK)(OWN_TASK), (DELAY_LIMIT)(200 * MILLISEC), E_TIMER_NODE_TIMEOUT );
never fires. Is this a normal situation?

Best regards,
--
Andriy Korud

Offline mark

  • Global Moderator
  • Hero Member
  • *****
  • Posts: 3236
    • View Profile
    • uTasker
Re: Software timer for 100-200ms intervals?
« Reply #1 on: September 21, 2008, 10:11:18 PM »
Hi Andriy

When using a hardware timer the delay is specified in ms. The value is converted to a hardware timer count value, which is clock and timer dependent (and so differs between processors).

When using a software timer the delay is specified in SEC - this is derived from the TICK (not hardware dependent).

200 * MILLISEC will give a value of 0x3d8 when using the demo project configuration - this is used to control the hardware timer to generate a match after 200ms. If the same value is used for the software timer it will mean 0x3d8 TICKs - this is about 190s when using a 50ms TICK. Therefore, if you wait three minutes the timer will in fact fire...

Thus, when working with software timers you need to specify the delay in SEC, as follows:
(DELAY_LIMIT)(0.2*SEC)

You will then get the 200ms delay.
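
Putting the two together, the 200ms timeout looks like this for each timer type (a short recap using the task and event names from the original post):
Code: [Select]
// hardware timer: delay given in MILLISEC (converted to a hardware count value)
uTaskerGlobalMonoTimer( (UTASK_TASK)(OWN_TASK | HARDWARE_TIMER), (DELAY_LIMIT)(200 * MILLISEC), E_TIMER_NODE_TIMEOUT );

// software timer: delay given in SEC (derived from the TICK)
uTaskerGlobalMonoTimer( (UTASK_TASK)(OWN_TASK), (DELAY_LIMIT)(0.2 * SEC), E_TIMER_NODE_TIMEOUT );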

Note that the reason for this is efficiency of the calculations, but it does mean you need to be a little careful to use the correct units with the timer type.

Regards

Mark


Offline akorud

  • Newbie
  • *
  • Posts: 31
    • View Profile
Re: Software timer for 100-200ms intervals?
« Reply #2 on: September 22, 2008, 11:14:02 AM »
Thanks, everything is clear now.
BTW, are there any suggestions for choosing between hardware and software timers? Is there any disadvantage to using hardware timers widely?

best regards,
--
Andriy

Offline mark

  • Global Moderator
  • Hero Member
  • *****
  • Posts: 3236
    • View Profile
    • uTasker
Re: Software timer for 100-200ms intervals?
« Reply #3 on: September 22, 2008, 12:30:24 PM »
Hi Andriy

Hardware timers have better resolution than SW timers. The SW timers use the TICK as their resolution, whereas the HW timers use a counter value, which can be in the ns range. Both use a central task for managing the events, and so the delay in waking the calling task (the latency) depends on the time taken to first wake the timer task and then search through the timer queues.

There have been some improvements made to the operation of the hardware timers recently, but there seem to be some instances where problems could still occur (cases where certain combinations result in a timer firing too soon) - the exact behaviour is hardware timer dependent and so this may be restricted to certain types. Therefore I would still avoid HW timers where possible. SW timers, however, have been used successfully for a long time with no known issues.

The M5223X also supports PIT1 and 4 DMA timers - giving 5 very high accuracy timers with callback support (see application.c for a demo of their use). These are often adequate for specific high-accuracy use, since they also avoid the extra management overhead and latency of the HW timers in the global timer support.
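
For reference, a minimal sketch of setting up a periodic PIT interrupt through uTasker's fnConfigureInterrupt() might look roughly like the following - the PIT_SETUP fields, the PIT_US_DELAY / PIT_PERIODIC / PIT1_INTERRUPT_PRIORITY names and the TASK_SAMPLER task are assumptions on my part, so check application.c in the M5223X project for the exact interface:
Code: [Select]
static void fnSampleISR(void)                           // called on each PIT period
{
    uTaskerStateChange(TASK_SAMPLER, UTASKER_ACTIVATE); // wake the sampling task
}

static void fnStartSampling(void)
{
    PIT_SETUP pit_setup;                                // interrupt configuration parameters
    pit_setup.int_type = PIT_INTERRUPT;
    pit_setup.int_handler = fnSampleISR;                // user callback
    pit_setup.int_priority = PIT1_INTERRUPT_PRIORITY;
    pit_setup.count_delay = PIT_US_DELAY(1000);         // 1ms period (example value)
    pit_setup.mode = PIT_PERIODIC;                      // repeat until stopped
    fnConfigureInterrupt((void *)&pit_setup);           // enter the configuration
}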

Regards

Mark

Offline akorud

  • Newbie
  • *
  • Posts: 31
    • View Profile
Re: Software timer for 100-200ms intervals?
« Reply #4 on: September 22, 2008, 03:08:21 PM »
Thanks,
it seems I should look at them for signal sampling for further DSP processing...

regards,
--
Andriy

Offline akorud

  • Newbie
  • *
  • Posts: 31
    • View Profile
Re: Software timer for 100-200ms intervals?
« Reply #5 on: September 22, 2008, 07:11:26 PM »
Thanks, Mark - PIT1 works like a charm and perfectly fits my needs.
BTW, does uTasker itself use the hardware timer? I wonder whether I can safely undefine GLOBAL_HARDWARE_TIMER if my application does not use it?

regards,
--
Andriy

Offline mark

  • Global Moderator
  • Hero Member
  • *****
  • Posts: 3236
    • View Profile
    • uTasker
Re: Software timer for 100-200ms intervals?
« Reply #6 on: September 23, 2008, 12:26:36 AM »
Hi Andriy

The PIT1 and the DMA Timers are very easy to use (the PIT1 actually automatically adjusts the prescaler it works with to get the best resolution for the particular delay) and the DMA timers can be used to generate very long and highly accurate delays (as well as generating DMA transfers or output frequencies).

GLOBAL_HARDWARE_TIMER can be safely undefined, as can GLOBAL_TIMER_TASK if you don't need multiple mono-stable timers for a single task.
The only time that GLOBAL_TIMER_TASK is automatically enabled is when SUPPORT_DISTRIBUTED_NODES is enabled - this is used in systems with distributed software. This is something which I use a lot in my own projects (where machines have an internal network running the complete software on a number of different hardware nodes in a distributed fashion), but I don't think it is otherwise used by many people.
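
In practice this just means commenting out the defines in the project configuration (config.h in the demo project - verify the location in your own project):
Code: [Select]
//#define GLOBAL_HARDWARE_TIMER      // not needed - PIT1 is used directly instead
//#define GLOBAL_TIMER_TASK          // only needed for multiple mono-stable timers per task (or distributed nodes)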

Regards

Mark