Delay routine function in C code

Thread Starter

Parth786

Joined Jun 19, 2017
642
I am looking for help understanding the C code on this site: http://www.embeddedmarket.com/products/89V51RD2-Starter-Kit/Buzzer-Interfacing-8051-Microcontroller/ I am stuck on the delay routine program.
I understand this delay function in C:
Code:
void Delay(int i)
{
   for(i=0;i<100;i++);   /* empty statement: the loop just burns cycles */
}
Here the variable name is i and it is initialized in the for loop.

I don't understand the function below. Why is there an n, and how does this function work in the main program?
Code:
void Delay(int n)
{
    int i;
    int j;
    for(i=0;i<n;i++)
    {
        for(j=0;j<1000;j++)
        {
        }
    }
}
 

AlbertHall

Joined Jun 4, 2014
12,619
The value passed as n determines the length of the time delay: a larger n gives more loops in the program and hence a longer delay time.
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
The value passed as n determines the length of the time delay: a larger n gives more loops in the program and hence a longer delay time.
I have searched on Google and found a sample program, but I actually have no idea how the program works. Can you provide some more information? Assume I need four delays: 10 ms, 100 ms, 500 ms and 1000 ms. How do I apply the concept?
 

AlbertHall

Joined Jun 4, 2014
12,619
The inner loop (lines 7, 8, 9) provides a fixed delay. The actual delay time for this loop depends on the processor speed and on how the compiler translates the C into machine code.
The outer loop (lines 5, 6, 10) runs the inner loop n times.
As n is declared as an int, it probably allows values up to 32767 (compiler dependent).
So the total delay is n times the fixed delay of the inner loop.

To determine the actual delay time, you can try values and measure the delays you get. If the delay times only need to be approximate, this method works and is relatively easy.

To determine the delay time accurately, you need to look at the compiled machine code and work out the time each instruction takes on your processor.
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
The inner loop (lines 7, 8, 9) provides a fixed delay. The actual delay time for this loop depends on the processor speed and on how the compiler translates the C into machine code.
The outer loop (lines 5, 6, 10) runs the inner loop n times.
As n is declared as an int, it probably allows values up to 32767 (compiler dependent).
So the total delay is n times the fixed delay of the inner loop.

To determine the actual delay time, you can try values and measure the delays you get. If the delay times only need to be approximate, this method works and is relatively easy.

To determine the delay time accurately, you need to look at the compiled machine code and work out the time each instruction takes on your processor.
I appreciate your explanation, but I have some doubts. Look at my code:
Code:
void Delay(unsigned int n)
{
    unsigned i,j ;
    for(i=0;i<n;i++) 
    for(j=0;j<1200;j++);
}
void main (void)
{               
    while(1)             
    {
        BUZZER = 1;
        Delay(20);
        BUZZER = 0;
        Delay(500);            
        BUZZER = 1;
        Delay(700);
        BUZZER = 0;
        Delay(1000);            
     
    }
}
As per my knowledge, if I want many different delays then I have to write one delay function and call it from main with different arguments, as I did in the example above.

How do we decide how many variables we need in this type of delay function?
Code:
void Delay(unsigned int n)
{
    unsigned i,j,k;
    for(i=0;i<n;i++) 
    for(j=0;j<500;j++)
    for(k=0;k<1200;k++);
}
In this function I have declared four variables, n, i, j and k, and I am using three nested loops. How do we decide how many variables we need in this type of delay function, and what will this function do?
 

AlbertHall

Joined Jun 4, 2014
12,619
It depends on how long a delay you need. To decide, you will have to either experiment and see what delay you get, or do it the hard way by working through the machine code.

The C compiler for Microchip includes routines that let you specify delays in milliseconds or microseconds, but I don't know whether your 8051 compiler does this for you.
 

spinnaker

Joined Oct 29, 2009
7,830
I appreciate your explanation, but I have some doubts. Look at my code:
Code:
void Delay(unsigned int n)
{
    unsigned i,j ;
    for(i=0;i<n;i++)
    for(j=0;j<1200;j++);
}
void main (void)
{             
    while(1)           
    {
        BUZZER = 1;
        Delay(20);
        BUZZER = 0;
        Delay(500);          
        BUZZER = 1;
        Delay(700);
        BUZZER = 0;
        Delay(1000);          
   
    }
}
As per my knowledge, if I want many different delays then I have to write one delay function and call it from main with different arguments, as I did in the example above.

How do we decide how many variables we need in this type of delay function?
Code:
void Delay(unsigned int n)
{
    unsigned i,j,k;
    for(i=0;i<n;i++)
    for(j=0;j<500;j++)
    for(k=0;k<1200;k++);
}
In my opinion, this is HORRIBLE practice

void Delay(unsigned int n)
{
unsigned i,j,k;
for(i=0;i<n;i++)
for(j=0;j<500;j++)
for(k=0;k<1200;k++);
}

Awfully confusing.

For many compilers the max value of an unsigned int is 65,535. Just pass values up to that max, and call Delay multiple times if you need a longer delay.


But if you really want nested loops, note that reusing one variable does not do what you might expect:


void Delay(unsigned int n)
{
unsigned i;
for(i=0;i<n;i++)
for(i=0;i<500;i++)
for(i=0;i<1200;i++);
}

This does NOT work the same as the version with separate variables: every loop shares the same i, so after the innermost loop finishes, i is 1200, which immediately ends the middle and outer loops. The loops no longer multiply, so you do not get a longer delay.

Also


void Delay(unsigned int n)
{

for(unsigned i=0;i<n;i++)
for(unsigned i=0;i<500;i++)
for(unsigned i=0;i<1200;i++);
}


should work for your compiler. In this case, i is instantiated and initialized 3 times; basically there are 3 different i variables, one per loop.


You REALLY need to take a class. This is very, very basic stuff that you learn in the first day of class. Learning C one question at a time through a forum is not the way to learn.
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
In my opinion, this is HORRIBLE practice

But if you really want nested loops, note that reusing one variable does not do what you might expect.
The loops no longer multiply, so you do not get a longer delay.

You REALLY need to take a class. This is very, very basic stuff that you learn in the first day of class. Learning C one question at a time through a forum is not the way to learn.
I am taking classes, but the teacher has to complete the course within one month. During the 45-minute class many questions run through my mind, so I ask only a few of them. Sometimes I feel the other students are being disturbed because of me, so sometimes I don't ask at all, and the teacher cannot answer all of my doubts because there are so many students to manage. Today I was going to ask about another topic, "interfacing a keypad", but I have postponed it and will ask another day. I think I am asking so many questions that people here feel like those class students.

This question came to my mind when I was searching for code to display a message on an LCD. I did a Google search and went through the top 15 results, but I didn't find the reason, so I asked here. Have you seen my signature? "Don't give up! The beginning is always the hardest"
 

MrSoftware

Joined Oct 29, 2013
2,273
Disclaimer: I'm not familiar with the specific processor you have chosen, so this might turn out to be bad advice. ;)

Using loops for delay works, but it is not always best. It ties up the processor (called busy waiting), and it makes your delay dependent on the clock speed of your processor, which makes the code less portable. Also, if the compiler is set to optimize the code, it may remove loops that do nothing, effectively removing your delay.

So in short: if you want to delay in specific units (seconds, milliseconds, etc.), first look for a library function that does this already (sleep(), delay(), etc.). Plan B: see if the processor has a timer that you can read, and hang out until the timer has incremented the appropriate number of ticks. Or, even better but a little more advanced, set the timer to trigger an interrupt when the time has expired. That way your processor can continue doing other things until it is interrupted by the timer. This assumes your chosen processor has timer and interrupt features. Good luck!
 

Picbuster

Joined Dec 2, 2013
1,057
I agree with MrSoftware.
Best practice is to use an interrupt and count down (from a given value) until zero.
This has no effect on main or the other parts of the program.

Setting the start value lets you create different delays.
The call from the main loop could look like this (pseudocode, assuming the timer interrupt decrements the counter and Delay() returns the remaining count):

if (!Delay(1234))
{
    // take action when the count has reached zero
}
// otherwise continue the main loop

or wait until zero:

micro_sec = 4567;
while (Delay(micro_sec))
    ;  // block here until the countdown expires

Picbuster
 