efficient delay function

Thread Starter

Parth786

Joined Jun 19, 2017
642
When I need delays of milliseconds and microseconds, I simply write the following program:
C:
int main(void)
{
    delay(1000); /* 1000 microseconds */
    delay(1);    /* 1 millisecond */
}

void delay(unsigned int wait)
{
    unsigned int i;

    for (i = 0; i < wait; i++)
    {
    }
}
I work with the 8051. Does this method give a fast delay on the 8051? If not, how do I make an efficient delay?
 

MrChips

Joined Oct 2, 2009
30,823
What do you mean by efficient?
A 1ms delay is 1ms.

Any software delay is a blocking delay, i.e. the code blocks the CPU from executing any other instructions.

If you want tighter code, you can use

while (--wait);

But that means that wait must be increased for the same time delay.
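
As a minimal sketch of what I mean, with the caveat that LOOPS_PER_MS below is only a placeholder, not a calibrated value; it has to be measured for your compiler, optimization setting and crystal (for example, toggle a pin and watch it on a scope):
C:
/* Sketch only: LOOPS_PER_MS is a placeholder and must be calibrated. */
#define LOOPS_PER_MS  100u

void delay_ms(unsigned int ms)
{
    while (ms--)
    {
        unsigned int wait = LOOPS_PER_MS;

        while (--wait)
            ;                        /* burns LOOPS_PER_MS - 1 iterations */
    }
}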
 

spinnaker

Joined Oct 29, 2009
7,830
When I need delays of milliseconds and microseconds, I simply write the following program:
C:
int main(void)
{
    delay(1000); /* 1000 microseconds */
    delay(1);    /* 1 millisecond */
}

void delay(unsigned int wait)
{
    unsigned int i;

    for (i = 0; i < wait; i++)
    {
    }
}
I work with the 8051. Does this method give a fast delay on the 8051? If not, how do I make an efficient delay?
If you mean accurate, then the only way to get an "accurate" delay is to write it in assembler, and even then it still won't be 100% accurate.
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
What do you mean by efficient?
A 1ms delay is 1ms.
I think when the loop runs 1000 times it will take about 1,085 microseconds, because the 8051 takes 1.085 µs to execute one machine cycle. So when the loop runs 1000 times it takes 1,085 µs, or about 1.085 ms.
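
Spelling that arithmetic out (this assumes a classic 12-clock 8051 core and an 11.0592 MHz crystal, which I have not actually stated, and it pretends each loop pass costs exactly one machine cycle, which a compiled C loop never does):
C:
#include <stdio.h>

int main(void)
{
    double f_osc   = 11059200.0;    /* assumed crystal frequency, in Hz  */
    double t_cycle = 12.0 / f_osc;  /* one machine cycle, about 1.085 us */

    printf("machine cycle = %.3f us\n", t_cycle * 1e6);
    printf("1000 idealized loop passes = %.1f us\n", 1000.0 * t_cycle * 1e6);

    /* prints about 1.085 us and 1085.1 us; the real loop takes several
       machine cycles per pass, so the actual delay is a few times longer */
    return 0;
}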

If you want tighter code, you can use
while (--wait);
But that means that wait must be increased for the same time delay.
Delay program using a while loop:
C:
int main(void)
{
    delayUs(1000);

    return 0;
}

void delayUs(unsigned int wait)
{
    while (wait--)
        ;
}
What do you think about the above program?
 

MrChips

Joined Oct 2, 2009
30,823
Not bad, but change the data type from unsigned int to unsigned long.
This will allow for longer delays.

It doesn't matter how long the delay takes. At some point you have to calibrate it if you want exact 1 ms or 1000 ms delays, for example.
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
Not bad, but change the data type from unsigned int to unsigned long.
This will allow for longer delays.

It doesn't matter how long the delay takes. At some point you have to calibrate it if you want exact 1 ms or 1000 ms delays, for example.
Thanks. I am stuck on the following functions:
C:
/* functions for delay */

void delayUs(unsigned int wait)
{
    wait >>= 3;

    while (wait--)
        ;
}

void delayMs(unsigned int wait)
{
    while (wait--)
        delayUs(1000);
}
wait >>= 3 is the same as wait = wait >> 3.

For example:
wait = 47; /* 47 is 0010 1111 in binary */
then wait >> 3 = 0000 0101, which is 5.

And wait-- means the value in the variable decrements by one.

I understand the individual operators, but what is actually happening in these two functions?
 

John P

Joined Oct 14, 2008
2,026
There's something a little Zen-like about a function that "does nothing, efficiently".

Do any of the experienced people here use a delay() function in any form? I've always had the instinct that it's really bad programming, to tie up the processor just to kill time. If the delay is more than a tiny amount, I'd always use a timer and an interrupt, and just about every project I do has a timer running and interrupting constantly. Generally it's the only interrupt I use, with everything else being polled. I say a fixed clock is the way to know exactly when everything may occur.

There's one type of delay I use, and that's when inserting a few assembly-language NOPs is enough to do the job. For instance, I recently had to bit-bang a serial port running at 115.2 kbaud, and the loop that sent out the successive bits produced almost the right rate, so I put in a NOP here and there (partly to ensure a 0 and a 1 took the same amount of time), and I felt comfortable with that.

But then again, if "efficiency" is an objective, you'd want a processor that runs a useful task 100% of the time if it's running at all, with no wasted cycles. We don't often achieve that!
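
For what it's worth, here is a minimal sketch of that timer-plus-interrupt approach for an 8051. It assumes the Keil C51 dialect (<reg51.h> and its "interrupt" keyword) and an 11.0592 MHz crystal on a classic 12-clock part, so the reload value is only right under those assumptions, and the 16-bit tick counter would need to be guarded in real code:
C:
#include <reg51.h>                   /* Keil C51 SFR definitions (assumed) */

static volatile unsigned int ms_ticks = 0;

void timer0_isr(void) interrupt 1    /* Timer 0 overflow vector */
{
    TH0 = 0xFC;                      /* reload for roughly 1 ms:           */
    TL0 = 0x66;                      /* 65536 - 922 machine cycles         */

    if (ms_ticks)
        ms_ticks--;
}

void timer_init(void)
{
    TMOD = (TMOD & 0xF0) | 0x01;     /* Timer 0, 16-bit mode 1             */
    TH0  = 0xFC;
    TL0  = 0x66;
    ET0  = 1;                        /* enable Timer 0 interrupt           */
    EA   = 1;                        /* global interrupt enable            */
    TR0  = 1;                        /* start Timer 0                      */
}

void delay_ms(unsigned int ms)
{
    ms_ticks = ms;                   /* 16-bit write is not atomic on an
                                        8051; real code should disable EA
                                        briefly around this                */
    while (ms_ticks)
        ;                            /* or go do useful work and poll      */
}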
 

MrChips

Joined Oct 2, 2009
30,823
There's something a little Zen-like about a function that "does nothing, efficiently".

Do any of the experienced people here use a delay() function in any form? I've always had the instinct that it's really bad programming, to tie up the processor just to kill time. If the delay is more than a tiny amount, I'd always use a timer and an interrupt, and just about every project I do has a timer running and interrupting constantly. Generally it's the only interrupt I use, with everything else being polled. I say a fixed clock is the way to know exactly when everything may occur.

There's one type of delay I use, and that's when inserting a few assembly-language NOPs is enough to do the job. For instance, I recently had to bit-bang a serial port running at 115.2 kbaud, and the loop that sent out the successive bits produced almost the right rate, so I put in a NOP here and there (partly to ensure a 0 and a 1 took the same amount of time), and I felt comfortable with that.

But then again, if "efficiency" is an objective, you'd want a processor that runs a useful task 100% of the time if it's running at all, with no wasted cycles. We don't often achieve that!
It depends on what you are trying to do.
If you just want to demonstrate a "blinky" LED, software delay is OK.
If you are running real-time process control, then software delay would be a poor choice.

I wrote a phase-encoded one-wire digital transmission system for Microchip PICs and had to use clock-cycle tallying and NOPs in order to get maximum throughput and consistent timing. You do what you gotta do.
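
For very short, cycle-accurate pauses the NOP approach looks like this; a sketch assuming the Keil C51 toolchain, whose <intrins.h> provides _nop_() to emit one NOP instruction (one machine cycle, about 1.085 us at 11.0592 MHz on a 12-clock core). Other compilers spell the intrinsic differently:
C:
#include <intrins.h>    /* Keil C51 header providing _nop_() (assumed) */

void delay_4_cycles(void)
{
    _nop_();
    _nop_();
    _nop_();
    _nop_();
    /* the LCALL/RET overhead of calling this function also costs cycles
       and has to be counted if the timing is critical */
}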
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
What's wrong in this program? I want to print the value of delayUs() from the main function.
C:
#include <stdio.h>

void delayUs(unsigned int wait)

int main(void)
{
    delayUs(10);

    printf("delayUs %d \n",delayUs());

    return 0;
}

void delayUs(unsigned int wait)
   {
     while(wait--);
   }
 

be80be

Joined Jul 5, 2008
2,072
With delayUs() you would never see it.
It's too fast; the number would scroll off the screen faster than you could read it, most likely.

Ian was trying to give you a better delay with
wait >>= 3;

To get dead-on delays you have to look at the generated asm and count the cycles each line takes. Most delays don't need to be dead on.
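
Roughly what that pair of routines is doing, spelled out. The divide by 8 is only a guess at the cost of one loop pass and has to be tuned for your compiler and crystal; it is not a property of the 8051 itself:
C:
void delayUs(unsigned int wait)
{
    wait >>= 3;            /* wait = wait / 8, e.g. 47 >> 3 == 5, so a
                              request for 47 "us" spins only 5 times     */
    while (wait--)         /* test, then decrement, until wait hits 0    */
        ;
}

void delayMs(unsigned int wait)
{
    while (wait--)
        delayUs(1000);     /* one "millisecond" = one delayUs(1000) call */
}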
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
Two things wrong:

delayUs( ) needs an input parameter.
What do you mean by input parameters?

delayUs( ) does not return a value.
What about this?
C:
#include <stdio.h>
void delayUs(unsigned int wait)
int main(void)
{
    delayUs(10);
    printf("delayUs %d \n",delayUs());
    return 0;
}
void delayUs(unsigned int wait)
   {
     while(wait--);
     return wait;
   }
Does it make any sense?

With delayUs() you would never see it.
It's too fast; the number would scroll off the screen faster than you could read it, most likely.
Ian was trying to give you a better delay with
wait >>= 3;
That's why I am trying to understand the basics. I have stopped thinking about the LCD and now I am focusing only on an efficient delay.
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
You still have not explained what you mean by efficient delay.
What are you trying to do?
I was trying to learn programming for the UART. I was trying to send characters from a PC terminal to the 8051 over the UART and then display them on a 16x02 LCD. That is where I need delays of milliseconds and microseconds.
This was my program:
C:
int main(void)
{
    delay(1000); /* 1000 microseconds */
    delay(1);    /* 1 millisecond */
}

void delay(unsigned int wait)
{
    unsigned int i;

    for (i = 0; i < wait; i++)
    {
    }
}
This was the program for a faster delay:
C:
/* functions for delay */

void delayUs(unsigned int wait)
{
    wait >>= 3;

    while (wait--)
        ;
}

void delayMs(unsigned int wait)
{
    while (wait--)
        delayUs(1000);
}
I know the while loop, the decrement operator, and the shift operator, but I can't understand how this code is faster than mine.
 

MrChips

Joined Oct 2, 2009
30,823
No, you do not need millisecond or microsecond delays to be very accurate.

You need to slow down your requests sufficiently so that you are not overwhelming the controller on the LCD.
What does efficiency have to do with the problem?

If you look at the timing specifications for the Hitachi HD44780 LCD Controller you will see the recommended delays required for each type of command. The longest delay will be required for CLEAR and HOME commands.
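
A hedged sketch of how those datasheet figures are normally used, built on the delayUs()/delayMs() routines from earlier in the thread. lcd_write_command() is a hypothetical placeholder for whatever routine actually drives the data and control pins, and the delay values are the typical HD44780 execution times (about 37 us for most instructions, about 1.52 ms for Clear Display and Return Home at the nominal internal clock); confirm them against the datasheet for your module, or poll the busy flag instead of delaying:
C:
void lcd_write_command(unsigned char cmd);   /* hypothetical low-level write */

void lcd_command(unsigned char cmd)
{
    lcd_write_command(cmd);

    if (cmd == 0x01 || cmd == 0x02)   /* Clear Display / Return Home     */
        delayMs(2);                   /* needs at least about 1.52 ms    */
    else
        delayUs(40);                  /* about 37 us for everything else */
}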
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
No, you do not need millisecond or microsecond delays to be very accurate.
Thanks. I think I have to study the datasheet several times because there is so much information, so I am reading it now.

But now I want to clear up some doubts:

1: As you can see, we can create a delay using either a for loop or a while loop. When you need to create a delay, which do you use, for or while?

2: Please look at my program in post #12. Why is the program not working?
 

WBahn

Joined Mar 31, 2012
30,076
Thanks. I think I have to study the datasheet several times because there is so much information, so I am reading it now.

But now I want to clear up some doubts:

1: As you can see, we can create a delay using either a for loop or a while loop. When you need to create a delay, which do you use, for or while?
What does it matter???

A for() loop is, in many compilers, syntactic sugar for a while() loop. Furthermore, the compiler will often translate either into a loop structure that can't be written in C (except by using goto statements) because the entry point will be in the middle of the loop. But so what?
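
For what it's worth, the usual equivalence looks like this (continue statements are the one difference: in a for loop they jump to the increment, in this while form they jump straight to the test):
C:
#include <stdio.h>

int main(void)
{
    unsigned int i, a = 0, b = 0;

    /* a for loop ... */
    for (i = 0; i < 1000; i++)
    {
        a++;
    }

    /* ... and the while loop it is shorthand for */
    i = 0;
    while (i < 1000)
    {
        b++;
        i++;
    }

    printf("%u %u\n", a, b);   /* both count to 1000 */
    return 0;
}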

2: Please look at my program in post #12. Why is the program not working?
How is it not working? What is it doing that it shouldn't or not doing that it should?

We are NOT mind readers!!!

One problem your code has is that your delay function is of type void (meaning you are promising the compiler that you won't return ANY value from the function) and then you proceed to return a value from it. Decent compilers will produce an error or at least a warning. With less decent compilers the result is undefined behavior.
 

WBahn

Joined Mar 31, 2012
30,076
I was trying to learn programming for the UART. I was trying to send characters from a PC terminal to the 8051 over the UART and then display them on a 16x02 LCD. That is where I need delays of milliseconds and microseconds.
This was my program:
C:
int main(void)
{
    delay(1000); /* 1000 microseconds */
    delay(1);    /* 1 millisecond */
}

void delay(unsigned int wait)
{
    unsigned int i;

    for (i = 0; i < wait; i++)
    {
    }
}
This was the program for a faster delay:
C:
/* functions for delay */

void delayUs(unsigned int wait)
{
    wait >>= 3;

    while (wait--)
        ;
}

void delayMs(unsigned int wait)
{
    while (wait--)
        delayUs(1000);
}
I know the while loop, the decrement operator, and the shift operator, but I can't understand how this code is faster than mine.
Why is your delayUs() function dividing the input value by 8?
 

Thread Starter

Parth786

Joined Jun 19, 2017
642
What does it matter???
Exactly, that is what I wanted to know. So I found out that if I need a delay I can use for or while; it depends on which I want to use.
How is it not working? What is it doing that it shouldn't or not doing that it should?
C:
#include <stdio.h>

int main(void)
{
    int wait = 10;

    while (wait)
    {
        wait--;

        printf("delay %d \n", wait);
    }

    return 0;
}
delay 9
delay 8
delay 7
delay 6
delay 5
delay 4
delay 3
delay 2
delay 1

I am trying to print the same output from the main function:
C:
#include <stdio.h>
void delayUs(unsigned int wait)
int main(void)
{
    delayUs(10);
    printf("delayUs %d \n",delayUs());
    return 0;
}
int delayUs(unsigned int wait)
   {
     while(wait--);
     return wait;
   }
The compiler is showing this error:
error: expected '=', ',', ';', 'asm' or '__attribute__' before '{' token
{
 

WBahn

Joined Mar 31, 2012
30,076
Even if you get the code to compile and run, you are not going to get the same output. How could you? In the top program your printf() statement is contained in a loop. In the second program, it is just a single statement in a sequence of two statements and so will only be executed once.

What line is that compiler error for? I'm not aware of any compiler that doesn't give SOME indication of WHERE in the program it encountered the error.

If it is complaining about the line with the printf() statement, consider that you are calling a function in that statement without any arguments, but the contract you made with the compiler was that you would pass it a single argument of type unsigned int.

If it is complaining about the line with your prototype, then what has to be at the end of a function prototype to terminate the statement?

What is the only possible value that delayUs() can return? Hint: What has to be true regarding the value of the variable 'wait' in order for the function code to reach the return statement?
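
To make that concrete, here is a version that at least compiles cleanly; it is a sketch for a PC compiler, and the return value is left in so you can see what it ends up being:
C:
#include <stdio.h>

unsigned int delayUs(unsigned int wait);   /* prototype ends with a semicolon */

int main(void)
{
    unsigned int result = delayUs(10);     /* the argument must be supplied   */

    printf("delayUs returned %u\n", result);
    return 0;
}

unsigned int delayUs(unsigned int wait)
{
    while (wait--)
        ;          /* the loop only exits after wait tests as 0 ...           */

    /* ... and the post-decrement still happens on that final test, so wait
       has wrapped around to UINT_MAX (65535 or 4294967295, depending on the
       platform); that is the only value this function can ever return       */
    return wait;
}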
 