Generating 1 second delay using timers in 8051

Discussion in 'Embedded Systems and Microcontrollers' started by bobparihar, Aug 13, 2014.

  1. bobparihar

    Thread Starter Member

    Jul 31, 2014
    93
    0
    I am generating a 1 second delay using timers in the 8051.
    My concept is like this:
    I am using an 11.0592 MHz crystal.

    Time required for 1 machine cycle is 1.085 microseconds.

    If the timer is set to 0000h, then the time required for the timer to overflow once is 65536 * 1.085 = 7.11 ms.

    Now 7.11 ms * 142 (times the loop runs) = 1000 ms = 1 sec.

    So here is my code:

    Code (C):

    #include <reg51.h>

    void delay(void);

    void main()
    {
        while (1)
        {
            P2 = 0x00;          // LEDs off
            delay();            // delay for 1 sec
            P2 = 0xFF;          // LEDs on
            delay();            // delay for 1 sec
        }
    }

    void delay(void)
    {
        int i;
        TMOD = 0x01;            // timer 0 in mode 1 (16-bit)
        for (i = 0; i < 142; i++)
        {
            TL0 = 0x00;         // starting value from 0
            TH0 = 0x00;
            TR0 = 1;            // start timer
            while (TF0 == 0);   // poll TF0 flag until it goes high
            TR0 = 0;            // stop timer
            TF0 = 0;            // clear flag TF0
        }
    }

    But I'm not getting a 1 second delay. Please correct me where I am wrong.
     
    Last edited by a moderator: Aug 13, 2014
  2. ErnieM

    AAC Fanatic!

    Apr 24, 2011
    7,386
    1,605
    "but im not getting any 1 second delay" tells us nothing.

    Telling us "but I see a delay of X" is useful.

    Note:
    65536 * 1.085us = 71.11ms
    71.11ms * 142 = 10.0978 sec

    Also, polling the timer in a loop will introduce some error too.

    How exact do you want the delay?
     
  3. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    Read the compiler's help file for "delay functions" like
    Delay_mS(1000)
    which makes a 1 second delay.
     
  4. bobparihar

    Thread Starter Member

    Jul 31, 2014
    93
    0
    OK, I got it: I have to make a loop which runs 14 times for a 1 second delay.
    But the problem is still where it was. According to the code, the delay should be about 10 seconds long, but I was getting a delay of milliseconds when I burned the code to the hardware.
     
  5. bobparihar

    Thread Starter Member

    Jul 31, 2014
    93
    0

    Thanks RB, but I want to do it with timers. I don't want an alternative at this time; my task is with timers.
     
  6. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    Then your code of starting and stopping the timer is a bad choice.

    Just let the timer free-run, and count the number of overflows. Usually there is a timer overflow flag you can test for, without stopping the timer.

    After enough overflows you have timed 1 second.
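
    A minimal sketch of that approach (Keil C51 style, assuming the same 11.0592 MHz crystal, so one overflow is about 71.11 ms and 14 overflows is roughly 1 second):

```c
#include <reg51.h>

/* Free-running timer 0: start it once and never stop it.
   Each overflow of the 16-bit count takes 65536 * 1.085us ~= 71.11 ms,
   so 14 overflows is roughly 1 second (about 0.45% short). */
void delay_1s(void)
{
    unsigned char overflows = 0;

    while (overflows < 14)
    {
        while (TF0 == 0);   /* wait for the next overflow      */
        TF0 = 0;            /* clear flag; timer keeps running */
        overflows++;
    }
}

void main(void)
{
    TMOD = 0x01;            /* timer 0, mode 1 (16-bit)        */
    TR0  = 1;               /* start once, let it free-run     */
    while (1)
    {
        P2 = 0x00;          /* LEDs off */
        delay_1s();
        P2 = 0xFF;          /* LEDs on  */
        delay_1s();
    }
}
```
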

    Now I have to ask WHY do you need to time 1 second? Is this for some type of clock?
     
  7. bobparihar

    Thread Starter Member

    Jul 31, 2014
    93
    0

    YES YES! You got it, bravo :)
     
  8. ErnieM

    AAC Fanatic!

    Apr 24, 2011
    7,386
    1,605
    Oops... if you are making a clock you need a one second tick to be exactly one second. Yours is about 0.44% off.

    Each day that is 384 seconds of error.
     
  9. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    If you need to make a clock the "seconds" must be accurate over a long time with no accumulated error from your code.

    Try some of the examples on this page;
    http://www.romanblack.com/one_sec.htm
    :)
     