PIC Timers and Delays

Thread Starter

CEDEng

Joined Jul 17, 2007
6
If I were programming a PIC16 to cycle an output on and off slowwwwly, say, every 10 seconds, should I stick with a "nested delay" - or try to incorporate one of the timers?

Pros and Cons?

Thanks in advance for your help!

Regards...
 

thatoneguy

Joined Feb 19, 2009
6,359
If the only task the PIC will be doing is changing the output, delays can be used.

If the PIC needs to do other things, such as updating an LCD, the timing can be done with an interrupt. On each interrupt, increment a counter variable; when the count reaches the limit, change the output.

SourceBoost C makes this relatively easy either way: through the delay_s() (seconds) function, or with interrupts and a long integer counter. The free version can handle this basic project.

You will also want to use a very slow but stable clock to reduce power, unless the chip needs to do a good deal of processing in addition to changing the pin output.

Which PIC were you thinking of using?
 

Thread Starter

CEDEng

Joined Jul 17, 2007
6
Thanks for the quick response.

It's a PIC16F690 - the PICKIT Play Chip!

Anyway - the application will do various checking of things throughout the day, until a "Bad Event" occurs. On detection of the Bad Event, a Red Alarm Light will blink until the CLEAR Event is seen.

So - other than watching each loop to see if the CLEAR Event has happened, the processor is doing nothing but circles: a few seconds ON, a few seconds off, repeat.

Knowing this, TMRxxx or Loop? I'm not opposed to doing a TMR and Interrupt routine - but I AM opposed to adding complexity where none is necessary.

Any thoughts? Thanks again...
 

thatoneguy

Joined Feb 19, 2009
6,359
I posted a sample of using interrupts with Boost C in This Thread

It isn't extremely complex. In your example, in the interrupt() function you test whether the interrupt was caused by the timer or by an external pin input, and set the variables accordingly. I usually don't do any processing in the interrupt routine, just set global flags.

Then in the main while(1) loop, test for timer flag, test for pin interrupt flag, do other stuff, and repeat. That way counting delay loops aren't "confused" when an interrupt hits inside one.
 

THE_RB

Joined Feb 11, 2008
5,438
I like to use a real one-second generator somewhere in the loop, and it can be used to sequence slow events like time-outs on the user interface etc. Of course once you have an accurate 1 second timer built into your program you can use it for a heap of stuff.

There are lots of code examples here: http://www.romanblack.com/one_sec.htm
that let you generate the 1 second period from any xtal freq and any interrupt frequency.
 

thatoneguy

Joined Feb 19, 2009
6,359
THE_RB said:
> I like to use a real one-second generator somewhere in the loop, and it can be used to sequence slow events like time-outs on the user interface etc. Of course once you have an accurate 1 second timer built into your program you can use it for a heap of stuff.
>
> There's lots of code examples here: http://www.romanblack.com/one_sec.htm that let you generate the 1 second period from any xtal freq and any interrupt frequency.
I read that a while back. I really like the ZEZJ algorithm!
 