PIC Timers and Delays

Discussion in 'Embedded Systems and Microcontrollers' started by CEDEng, Nov 22, 2010.

  1. CEDEng

    Thread Starter New Member

    Jul 17, 2007
    6
    0
    If I were programming a PIC16 to cycle an output on and off slowwwwly, say, every 10 seconds, should I stick with a "nested delay" - or try to incorporate one of the timers?

    Pros and Cons?

    Thanks in advance for your help!

    Regards...
     
  2. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    6,357
    718
    If the only task the PIC will be doing is changing the output, delays can be used.

    If the PIC needs to do other things, such as updating an LCD, the timing can be done with an interrupt: on each interrupt, increment a counter variable, and when the count reaches the limit, change the output.

    SourceBoost C makes this relatively easy either way, through the delay_s() (seconds) function or through interrupts and a long integer counter. The free version can handle this basic project.
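
    Roughly like this for the delay-only version (untested sketch, register names as in the BoostC headers, and the port/pin choice is just an example):

        #include <system.h>

        #pragma CLOCK_FREQ 4000000      // tell the delay routines the osc speed (adjust to yours)

        void main(void)
        {
            trisc = 0x00;               // example: make PORTC all outputs
            portc = 0x00;

            while (1)
            {
                portc = 0x01;           // output on (RC0 here, pick your own pin)
                delay_s(10);            // blocking 10 second delay
                portc = 0x00;           // output off
                delay_s(10);
            }
        }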

    You will also want to use a very slow but stable clock to reduce power, unless the PIC needs to do a good deal of processing in addition to changing the pin output.

    Which PIC were you thinking of using?
     
  3. rajbex

    New Member

    Nov 2, 2009
    22
    0
  4. MMcLaren

    Well-Known Member

    Feb 14, 2010
    759
    116
    It's a shame about the loss of program formatting on that site...
     
  5. retched

    AAC Fanatic!

    Dec 5, 2009
    5,201
    312
    Agreed.

    The designer should pop open the CSS file and straighten that out.

    Well, unless that is the way his code is actually formatted.
     
  6. CEDEng

    Thread Starter New Member

    Jul 17, 2007
    6
    0
    Thanks for the quick response.

    It's a PIC16F690 - the PICkit Play Chip!

    Anyway - the application will check various things throughout the day until a "Bad Event" occurs. On detection of the Bad Event, a Red Alarm Light will blink until the CLEAR Event is seen.

    So - other than watching each loop to see if the CLEAR Event has happened, the processor is doing nothing but circles - a few seconds ON, a few seconds OFF, repeat.

    Knowing this, TMRxxx or Loop? I'm not opposed to doing a TMR and Interrupt routine - but I AM opposed to adding complexity where none is necessary.

    Any thoughts? Thanks again...
     
  7. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    6,357
    718
    I posted a sample of using interrupts with BoostC in This Thread

    It isn't extremely complex. In your case, in the interrupt() function you test whether the interrupt was caused by the timer or by an external pin input, and set the variables accordingly. I usually don't do any processing in the interrupt routine, just set global flags.

    Then in the main while(1) loop, test for the timer flag, test for the pin interrupt flag, do the other work, and repeat. That way you don't have counted delay loops getting "confused" when an interrupt hits in the middle of one.
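
    Something along these lines, using a Timer0 overflow as the tick (untested sketch; the tick count for 10 seconds depends on your clock and prescaler, so the constant below is only a ballpark for a 4 MHz clock, and the bit/register names follow the BoostC 16F690 header):

        #include <system.h>

        #define TICKS_PER_10S 153           // ~10 s at 4 MHz, 1:256 prescaler (65.5 ms per overflow)

        volatile unsigned char ticks = 0;
        volatile unsigned char ten_sec_flag = 0;

        void interrupt(void)
        {
            if (test_bit(intcon, T0IF))     // was it the Timer0 overflow?
            {
                clear_bit(intcon, T0IF);    // clear the hardware flag
                ticks++;
                if (ticks >= TICKS_PER_10S)
                {
                    ticks = 0;
                    ten_sec_flag = 1;       // only set a flag, no real work in the ISR
                }
            }
        }

        void main(void)
        {
            option_reg = 0x07;              // Timer0 on instruction clock, 1:256 prescaler
            intcon = 0xA0;                  // GIE + T0IE: enable Timer0 overflow interrupt

            while (1)
            {
                if (ten_sec_flag)
                {
                    ten_sec_flag = 0;
                    // toggle the alarm output here
                }
                // check for the CLEAR event, update the LCD, etc.
            }
        }

    Since the ISR only sets flags, the main loop stays free to watch for your CLEAR event the whole time.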
     
  8. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    I like to use a real one-second generator somewhere in the loop; it can be used to sequence slow events like time-outs on the user interface, etc. Of course, once you have an accurate 1-second timer built into your program, you can use it for a heap of stuff.

    There are lots of code examples here: http://www.romanblack.com/one_sec.htm
    that let you generate the 1-second period from any xtal frequency and any interrupt frequency.
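
    The core idea (my rough paraphrase, not the exact code from the page) is a Bresenham-style accumulator: on every timer interrupt you add the number of instruction cycles that elapsed, and when the total reaches one second's worth you subtract it rather than resetting, so the remainder carries over and no error builds up. Something like this for a 4 MHz xtal with Timer0 overflowing every 256 cycles:

        #define CYCLES_PER_SEC 1000000ul    // 4 MHz xtal / 4 = 1 MHz instruction clock
        #define CYCLES_PER_INT 256ul        // instruction cycles per Timer0 overflow (1:1 prescaler)

        volatile unsigned long bres = 0;    // Bresenham-style accumulator
        volatile unsigned char one_sec_flag = 0;

        // call this from the timer interrupt on every Timer0 overflow
        void one_sec_tick(void)
        {
            bres += CYCLES_PER_INT;
            if (bres >= CYCLES_PER_SEC)
            {
                bres -= CYCLES_PER_SEC;     // keep the remainder, so no long-term error
                one_sec_flag = 1;           // main loop consumes this once per second
            }
        }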
     
    thatoneguy likes this.
  9. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    6,357
    718
    I read that a while back. I really like the ZEZJ algorithm!
     
  10. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    Thank you. :)

    That ZEZJ is a powerful little thing, from just a few lines of code.
     