Timer function for Pic

Discussion in 'Programmer's Corner' started by djsfantasi, Sep 22, 2013.

  1. djsfantasi

    Thread Starter AAC Fanatic!

    Apr 11, 2010
    2,804
    833
    I am interested in porting a Basic program to a Pic using C. One of the functions I depend on is a TIMER function, which returns the number of milliseconds since some base time. This base time is irrelevant; what I need is the elapsed time since this base time, such as when the program started.

    Currently, I am using a FreeBASIC function TIMER which uses Windows' system calls to determine the time since midnight.

    How can this be implemented in C on a Pic platform?
     
  2. t06afre

    AAC Fanatic!

    May 11, 2009
    5,939
    1,222
    Can you elaborate a little more on what your program will be doing? For example, how long will the intervals between calls to this function be? Off the top of my head, I would use a timer-generated interrupt and a counter for those events.
     
  3. djsfantasi

    Thread Starter AAC Fanatic!

    Apr 11, 2010
    2,804
    833
    The program will be performing a multitude of tasks. The timer will be used to determine when to execute the next step in the program.

    It is a program that implements an interpretive language to control servo motors in an animatronic. Since the servos take milliseconds to move and the program loops in microseconds, multiple servo commands are executed in a time-shared manner.

    The program is described in this article http://back2basic.phatcode.net/?Issue_#6:BASIC_programming_for_Multi-Tasking_Control It is a FreeBASIC program that runs on a Windows PC that I want to port to C running on a Pic to embed it within the animatronic.

    So for example, I want to move a servo from right to left and I know it will take 300 ms. I don't want to send another move command until after that time has expired. However, I can move other servos while I am waiting, so I continue processing other commands for 300 ms before moving the first servo again. The timer function will help me keep track of when it is OK to send a command to the first servo. The program is more complex than this example, but I provide it as the base concept.
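
    The pattern above can be sketched in plain C (a simulation, not the actual program: `now_ms()`, `ready_at`, and `try_move()` are names I made up, and the millisecond clock is faked so it runs on a PC):

    ```c
    #include <stdio.h>

    #define NUM_SERVOS 3

    /* Fake millisecond clock; on the PIC this would read the
       interrupt-driven tick counter discussed later in this thread. */
    static unsigned long fake_clock_ms = 0;
    static unsigned long now_ms(void) { return fake_clock_ms; }

    /* One "ready at" timestamp per servo. */
    static unsigned long ready_at[NUM_SERVOS];

    /* Issue a move only if the servo's previous move has finished. */
    static int try_move(int servo, unsigned long duration_ms)
    {
        if (now_ms() < ready_at[servo])
            return 0;                      /* still busy, skip for now */
        /* ...send the actual move command here... */
        ready_at[servo] = now_ms() + duration_ms;
        return 1;
    }

    int main(void)
    {
        int started = try_move(0, 300);    /* move takes 300 ms */
        int blocked = try_move(0, 300);    /* refused: still moving */
        fake_clock_ms = 300;               /* 300 ms later... */
        int again   = try_move(0, 300);    /* allowed again */
        printf("%d %d %d\n", started, blocked, again);
        return 0;
    }
    ```

    While a servo is "busy" the main loop is free to call `try_move()` on other servos, which is the time-sharing described above.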

    Please ask for additional clarification if necessary.
     
  4. ErnieM

    AAC Fanatic!

    Apr 24, 2011
    7,388
    1,605
    I've done this several times over. Briefly, I set up a timer (Timer 2, if I recall correctly) to generate an interrupt every millisecond, and use that interrupt to increment a global variable. Reading that variable gives you your TIMER function.

    Sorry my code is not on this computer or I'd share some.
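
    The shape of the scheme is roughly this (my sketch, plain C; the ISR is simulated as an ordinary function here, since the real Timer 2 setup and interrupt syntax are chip- and compiler-specific):

    ```c
    #include <stdio.h>

    /* Shared tick counter; on a real PIC this is 'volatile' and
       incremented from the Timer 2 ISR once per millisecond. */
    static volatile unsigned long ms_ticks = 0;

    /* Stand-in for the 1 ms timer interrupt. */
    static void timer2_isr(void)
    {
        /* on hardware: clear the TMR2 interrupt flag here */
        ms_ticks++;
    }

    /* FreeBASIC-style TIMER: milliseconds since startup. */
    static unsigned long timer(void)
    {
        return ms_ticks;
    }

    int main(void)
    {
        for (int i = 0; i < 1500; i++)  /* simulate 1.5 s of interrupts */
            timer2_isr();
        printf("%lu\n", timer());
        return 0;
    }
    ```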
     
  5. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,908
    2,168
    If it's an 8-bit controller and you're using a 32-bit global time variable, just be sure that when you read the time variable outside the ISR you temporarily disable the interrupt during the read, so the value won't get mangled by byte rollovers in the ISR increment code.
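
    A sketch of the guarded read (plain C; `DISABLE_INTERRUPTS`/`ENABLE_INTERRUPTS` are hypothetical stand-ins for the chip's global interrupt enable bit, e.g. GIE on a PIC):

    ```c
    #include <stdio.h>

    /* Simulated interrupt-enable flag so this compiles on a PC. */
    static int interrupts_enabled = 1;
    #define DISABLE_INTERRUPTS() (interrupts_enabled = 0)
    #define ENABLE_INTERRUPTS()  (interrupts_enabled = 1)

    static volatile unsigned long ms_ticks = 0;  /* updated by the ISR */

    /* Safe multi-byte read on an 8-bit core: briefly mask the tick
       interrupt so the byte-by-byte copy cannot straddle an increment. */
    static unsigned long read_ticks(void)
    {
        unsigned long t;
        DISABLE_INTERRUPTS();
        t = ms_ticks;
        ENABLE_INTERRUPTS();
        return t;
    }

    int main(void)
    {
        ms_ticks = 0x000100FFUL;   /* bytes roll over on the next tick */
        printf("%lu %d\n", read_ticks(), interrupts_enabled);
        return 0;
    }
    ```

    The window with interrupts masked is only a few instructions long, so no ticks are lost as long as the timer's hardware flag is still set when interrupts come back on.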
     
  6. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    I do this a lot. Basically, you set up a timer interrupt to occur every 1 ms.

    I often prefer hundredths of a second (interrupt every 10 ms); that's better, as a single byte can time up to 2.55 seconds. Hundredths are good for button debouncing, LED flash timing, user screen delays, etc. Most of my projects have a timer interrupt at 1 ms or 10 ms.

    You can do it in a couple of ways; TMR2 is easy:

    Code (C):

    // global vars needed
    unsigned char count_mS1;
    unsigned char count_mS2;

    // TMR2 interrupt, occurs every 1 ms
    interrupt()
    {
      PIR1.TMR2IF = 0;      // clear overflow flag

      count_mS1++;          // inc any number of counters, each ms
      count_mS2++;
    }

    // setup stuff in main()
    // PIC at 4 MHz = timers at 1 MHz,
    // set TMR2 to 4:1 prescaler, TMR2 is now running at 250 kHz
    TMR2 = blah;          // (setup to suit your PIC)
    PR2 = (250-1);        // TMR2 now rolls every 250 ticks (every 1 ms)

    Using TMR2 may interfere with PWM operation etc., depending on what else your PIC is doing.

    This next way uses TMR0 and a nifty math trick to make 1 ms periods from ANY xtal speed.

    Code (C):

    // global vars needed
    unsigned int bres;
    unsigned char count_mS1;
    unsigned char count_mS2;

    // TMR0 interrupt, occurs every 512 ticks
    interrupt()
    {
      TMR0IF = 0;         // clear overflow flag
      // see if we reached 1 ms yet!
      bres += 512;        // add 1 int (512 ticks)
      if(bres >= 2500)    // if >= 1 ms (2.5M / 1000)
      {
        bres -= 2500;     // subtract 1 ms, keep remainder for next time
        // gets here every 1 ms!
        count_mS1++;      // inc any number of counters, each ms
        count_mS2++;
      }
    }

    // setup stuff in main()
    // PIC at 20 MHz = timers at 5 MHz,
    // set TMR0 to 2:1 prescaler, TMR0 is now running at 2.5 MHz
    TMR0 = blah;          // (setup to suit your PIC)

    26.  
    Both these systems will give zero accumulated error over time (i.e. "clock" accuracy). The second system can be better, as it can be tuned by changing some numbers in code, places no demands on TMR2, and can be used with any crystal value your project might have.

    Using a single-byte variable for the user variable (count_mS1) means that you can read it anytime without affecting the interrupt. That's a good reason to work in hundredths (10 ms), not ms, as a single byte gives you the ability to time up to 2.55 secs.
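
    The zero-drift claim is easy to check numerically: because the remainder in `bres` is carried forward instead of thrown away, the millisecond count tracks real time exactly over any run length. A quick PC-side simulation (my illustration, timer at 2.5 MHz, one interrupt per 512 ticks, matching the numbers above):

    ```c
    #include <stdio.h>

    int main(void)
    {
        unsigned int  bres = 0;
        unsigned long ms   = 0;
        unsigned long interrupts = 1000000UL;   /* simulate a long run */

        for (unsigned long i = 0; i < interrupts; i++) {
            bres += 512;           /* 512 timer ticks per interrupt   */
            if (bres >= 2500) {    /* 2.5 MHz / 1000 = 2500 ticks/ms  */
                bres -= 2500;
                ms++;
            }
        }

        /* Exact elapsed time: interrupts * 512 ticks / 2500 ticks-per-ms */
        unsigned long exact_ms = (interrupts * 512UL) / 2500UL;
        printf("%lu %lu\n", ms, exact_ms);
        return 0;
    }
    ```

    The counted milliseconds and the exact figure agree, which is the "clock accuracy" property.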
     
  7. djsfantasi

    Thread Starter AAC Fanatic!

    Apr 11, 2010
    2,804
    833
    What about longer time periods, such as hours instead of minutes? My current timer variables are double precision. In theory, and for commercial use, a long int would be necessary; in practice, for testing/hobby use, an unsigned integer would be the minimum. So, nsaspook, what did you mean about the ISR mangling the value?

    Hundredths would work, if I were to modify my current a-code shows (referring to the previous link) to change the times specified from ms to hundredths. I never specify a time less than 10 ms in the scripts.

    I saw your second example on your web site some time ago, but lost the link.

    Thanks guys!
     
  8. t06afre

    AAC Fanatic!

    May 11, 2009
    5,939
    1,222
    Do you really need millisecond resolution? I think 1/25 or 1/50 second resolution should be more than good enough.
     
  9. ErnieM

    AAC Fanatic!

    Apr 24, 2011
    7,388
    1,605
    What he means is that it is quite common to work with quantities larger than the intrinsic size of the processor, for example using a 32-bit quantity when your arithmetic is done 8 bits at a time.

    When you do this there is a finite probability you will attempt to read the current tick count while that same tick count is being updated.

    This can lead to disastrous results.

    Turning off interrupts works, as does several other methods.
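
    One of those other methods (my addition, not spelled out in the post) is the double-read: read the multi-byte counter twice and retry until two consecutive reads agree, which avoids touching the interrupt enable bit at all.

    ```c
    #include <stdio.h>

    static volatile unsigned long ms_ticks = 0;  /* incremented in the ISR */

    /* Read the counter until two consecutive reads match.  If an
       interrupt lands in the middle of the byte-by-byte copy, the
       second read will differ and we simply try again. */
    static unsigned long read_ticks(void)
    {
        unsigned long a, b;
        do {
            a = ms_ticks;
            b = ms_ticks;
        } while (a != b);
        return a;
    }

    int main(void)
    {
        ms_ticks = 123456UL;
        printf("%lu\n", read_ticks());
        return 0;
    }
    ```

    The retry loop almost always runs exactly once; it only repeats in the rare case where a tick interrupted the first copy.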
     
  10. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,908
    2,168
    If you need 16-bit atomic read/writes on a PIC18, you can also use a spare 16-bit timer register: the TMR#H buffer holds the upper 8 bits, so the 16-bit read/write effectively happens in a single instruction.
     
  11. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    I incorporate real clocks into that system like this;

    Code (C):

    // TMR0 interrupt, occurs every 512 ticks
    interrupt()
    {
      TMR0IF = 0;         // clear overflow flag
      // see if we reached 10 ms yet!
      bres += 512;        // add 1 int (512 ticks)
      if(bres >= 25000)   // if >= 10 ms (2.5M / 100)
      {
        bres -= 25000;    // subtract 10 ms, keep remainder for next time
        // gets here every 10 ms!
        count1++;         // inc any number of counters, each 10 ms
        count2++;
        if(count2 >= 100)
        {
          count2 = 0;
          secflag = 1;
        }
      }
    }

    You can see in the interrupt, the global var secflag is SET every second. It is like your own personal "1 second interrupt flag".

    Then you handle the real time clock in main() using simple code like this;

    Code (C):

    // real time clock in the main() loop;
    // (this code must be run at least once per second;
    // normally it is part of display code which is run
    // many times a second.)
    if(secflag == 1)
    {
      secflag = 0;
      secs++;
    }
    if(secs >= 60)
    {
      secs = 0;
      mins++;
    }
    if(mins >= 60)
    {
      mins = 0;
      hours++;
    }
    if(hours >= 24)
    {
      hours = 0;
    }
    Display_Time();

     
    Last edited: Sep 24, 2013
  12. djsfantasi

    Thread Starter AAC Fanatic!

    Apr 11, 2010
    2,804
    833
    I also assume that I can define count_mS1 as an unsigned short long or unsigned long instead of a char, to get longer maximum time periods?
     
  13. ErnieM

    AAC Fanatic!

    Apr 24, 2011
    7,388
    1,605
    Yes, as long as you abide by the cautions we have previously mentioned.
     
  14. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    How long are the time periods you need?

    If you look at the way the secflag is handled in the interrupt, that section of code occurs every second.

    It's very easy to inc another variable every second at the same time;
    secflag = 1; // used for real-time clock
    my_seconds++; // used for any timing task 0-255 seconds

    Then if you want to make a long delay (like 20 seconds) you just clear the global variable my_seconds in your code, then wait until my_seconds >= 20. So you can time a delay from 1-255 seconds using the variable my_seconds. I'm sure you get the idea. :)
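
    As a minimal sketch of that idea (plain C; the once-per-second part of the interrupt is simulated as an ordinary function):

    ```c
    #include <stdio.h>

    static volatile unsigned char my_seconds = 0;  /* ++ once per second in ISR */

    /* Stand-in for the once-per-second section of the interrupt. */
    static void one_second_tick(void) { my_seconds++; }

    int main(void)
    {
        my_seconds = 0;                 /* start a 20 second delay */
        for (int i = 0; i < 20; i++)    /* simulate 20 seconds passing */
            one_second_tick();
        printf("%s\n", my_seconds >= 20 ? "delay done" : "waiting");
        return 0;
    }
    ```

    In the real program the main loop would keep servicing other tasks and just test `my_seconds >= 20` each pass, rather than spinning.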

    I prefer to use byte-sized variables wherever possible when they are modified in the interrupt; it avoids the problems with reading longer vars that other people have mentioned. Provided a granularity of 0-255 is fine, there is little need to use 16-bit or 32-bit variables in your timing!

    And of course if you need LONG timed periods you have a perfectly accurate real-time clock in HH:MM:SS:hh (down to hundredths of a second) that can be used for real world timing tasks, like turning the heater on at 7:45 etc.
     
  15. djsfantasi

    Thread Starter AAC Fanatic!

    Apr 11, 2010
    2,804
    833
    The timed periods can be hundredths of a second to tens of seconds. But they represent timing of multiple parallel tasks, which need a common relative base time. Hence, if the animatronic puppet is to run for four hours, it will need to time over an elapsed period of 1,440,000 hundredths of a second. Am I being clear?
     
  16. ErnieM

    AAC Fanatic!

    Apr 24, 2011
    7,388
    1,605
    Since log(2, 1440000) = 20.5 (about), any variable type that can hold at least 21 bits will contain your maximum quantity.
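
    The same figure can be checked by counting bits directly (my illustration, plain C):

    ```c
    #include <stdio.h>

    int main(void)
    {
        unsigned long max_count = 1440000UL;  /* 4 hours in hundredths */
        int bits = 0;
        unsigned long v = max_count;
        while (v) {          /* bits needed = floor(log2(n)) + 1 */
            bits++;
            v >>= 1;
        }
        printf("%d\n", bits);
        return 0;
    }
    ```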
     
  17. djsfantasi

    Thread Starter AAC Fanatic!

    Apr 11, 2010
    2,804
    833
    So it looks like the unsigned short long integer type is the appropriate data type in the Microchip C18 compiler.
    • Since negative numbers are not needed, an unsigned type is indicated.
    • Anything less than a short long (24 bits) is too small (a 16-bit unsigned only reaches 65,535 hundredths, or about 11 minutes).
    • An unsigned long (32 bits) is overkill - it would last for about 497 days.
    • An unsigned short long would last for 46 hours and I don't expect the animatronic to have to run for that long :D
    Thanks guys, and ErnieM - I'll watch out for your cautions!
     
  18. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    So this is for an animatronic puppet playback? It would be good if you outlined some specs or procedure. Is there a particular format the puppet movement playback data is in?

    For simplicity you could use a real-time system of seconds and hundredths, stored as a single variable using a 32bit or 24bit var;
    123400 = 1234 secs 00 hundredths

    Then the start and end time of each movement can be stored with one simple number for each.

    For implementation, I would run the main loop at one loop per hundredth of a second (ie 100 Hz), and every loop check if a motor needs to be turned on/off etc according to your playback data.

    It REALLY helps if your playback data is already time-sorted, so you can go through it in sequence. So it would be good to decide how you are going to create and edit animatronic playback files for multi-hour playback, then write the code later to do the actual playback.
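
    A minimal sketch of that 100 Hz playback loop with time-sorted data (my illustration; the event format and the sample values are made up):

    ```c
    #include <stdio.h>

    /* Hypothetical playback record: time in seconds*100 + hundredths
       form (as suggested above), servo number, target position. */
    struct event {
        unsigned long when;   /* e.g. 123400 = 1234 s, 00 hundredths */
        int servo;
        int position;
    };

    /* Time-sorted playback data (made-up values). */
    static const struct event script[] = {
        {  50, 1,  90 },
        { 120, 5,  45 },
        { 120, 11, 10 },
    };
    #define NUM_EVENTS (sizeof script / sizeof script[0])

    int main(void)
    {
        unsigned long next = 0;        /* index of next unplayed event */
        /* Main loop: one pass per hundredth of a second (100 Hz). */
        for (unsigned long t = 0; t <= 130; t++) {
            while (next < NUM_EVENTS && script[next].when <= t) {
                printf("t=%lu servo %d -> %d\n",
                       t, script[next].servo, script[next].position);
                next++;
            }
        }
        return 0;
    }
    ```

    Because the data is sorted, each pass only has to compare the clock against the next record, not scan the whole script.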
     
  19. djsfantasi

    Thread Starter AAC Fanatic!

    Apr 11, 2010
    2,804
    833
    Playback data is random. The code is non-deterministic, so it cannot be predicted when it will start. The data simply states for example "move servos 1,5 & 11 to their positions and wait 20 hundredths of a second before performing the next step". This may be in a definition of a subroutine, which may be called at several different random times in the script. This is all done during runtime.

    The main loop executes other tasks while multiple routines are executing. It is really a mini multi-tasking OS: synchronizing the mouth to audio, detecting if someone comes close, determining whether commands are coming over the internet, which scripts to run, and which servos to move.

    Hence the requirement to base timing on the same common base time, e.g. when the program starts.
     