[Solved] How to save processor time

Thread Starter

rt694157

Joined Dec 15, 2019
78
When a processor is running at 32 MHz, how much time does it take to execute one instruction? I think it would be in the nanoseconds, so the processor responds very fast compared to a mechanical device such as a relay or a switch. Let's say device X takes 50 ms to operate and device Y takes 60 ms to respond; if the processor controls device X, then it is wasting 50 ms.

Can it happen that the processor first sends the instruction to device X, then immediately sends the instruction to device Y, and then sends another instruction, all before device X has responded?

What I mean is that whatever time a mechanical device takes to respond, the processor shouldn't waste that time; it must use it. When there are many tasks, can the processor do this without affecting the other tasks?
 

MrChips

Joined Oct 2, 2009
30,795
What processor?
You have not told us the part number of the processor.

The processor does not waste any time if the code is written properly.
The program can send an instruction to device X, do something else, send an instruction to device Y, then do something different.
In an RTOS (Real-Time Operating System), all tasks are executed on a schedule so that the processor is always doing something. This is called multitasking.
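For illustration, here is a minimal C sketch of that idea with a plain main loop and no RTOS. millis(), start_device_x(), and start_device_y() are invented placeholders for whatever tick source and I/O the eventual part provides:

    /* Kick off X and Y, then keep working while both complete.
       millis() is assumed to return elapsed milliseconds. */
    #include <stdint.h>
    #include <stdbool.h>

    extern uint32_t millis(void);        /* placeholder 1 ms tick counter */
    extern void start_device_x(void);    /* placeholder: e.g. set a pin high */
    extern void start_device_y(void);

    void run(void)
    {
        start_device_x();                /* start X, do not wait for it */
        uint32_t x_started = millis();

        start_device_y();                /* start Y immediately afterwards */
        uint32_t y_started = millis();

        bool x_done = false, y_done = false;
        while (!x_done || !y_done) {
            if (!x_done && millis() - x_started >= 50) {
                x_done = true;           /* X has had its 50 ms; handle it */
            }
            if (!y_done && millis() - y_started >= 60) {
                y_done = true;           /* Y has had its 60 ms; handle it */
            }
            /* ...do other useful work here instead of busy-waiting... */
        }
    }

The subtraction millis() - x_started stays correct even when the tick counter wraps around, which is why it is written that way instead of comparing absolute times.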
 

Thread Starter

rt694157

Joined Dec 15, 2019
78
What processor?
You have not told us the part number of the processor.
Haven't thought of it yet. I was just considering one for the example.

The processor does not waste any time if the code is written properly.
The program can send an instruction to device X, do something else, send an instruction to device Y, then do something different.
In an RTOS (Real-Time Operating System), all tasks are executed on a schedule so that the processor is always doing something. This is called multitasking.
If the processor sends the instruction "turn on device X" but device X takes 50 ms to turn on, then the processor should send the instruction "turn on" to device Y within that 50 ms. When the 50 ms is complete, it should check device X and, if it's on, do something.

I am trying to come up with a useful example but I am not able to make one myself. How does the processor execute all of its tasks without affecting any other task?
 

MrChips

Joined Oct 2, 2009
30,795
A well designed OS (Operating System) will make use of interrupts.

Processor sends instruction to device X "TURN ON".
Processor does not wait for device X.

Device X sends interrupt to processor "I am turned on and READY to accept next instruction".
Processor is interrupted from whatever task it is doing, sends message to device X "OK", and makes a note to itself that device X is READY.
(All of this will take less than one microsecond.)

When processor is free to continue with using device X in its list of scheduled tasks, processor sends new instruction to device X.
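As a rough illustration (not the syntax of any particular part), the handshake could look like this in C. The ISR name and the acknowledge step are placeholders; the real hookup depends on the compiler and the processor chosen:

    #include <stdint.h>

    volatile uint8_t device_x_ready = 0;  /* set by the ISR, read by main */

    void device_x_isr(void)               /* runs when device X interrupts */
    {
        /* acknowledge the interrupt here if the hardware requires it */
        device_x_ready = 1;               /* note to self: X is READY */
    }

    void main_loop(void)
    {
        for (;;) {
            if (device_x_ready) {
                device_x_ready = 0;
                /* send the next scheduled instruction to device X */
            }
            /* ...carry on with the other scheduled tasks... */
        }
    }

The flag is declared volatile so the main loop actually re-reads it on each pass instead of letting the compiler cache it in a register.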
 

BobTPH

Joined Jun 5, 2013
8,943
I am not disagreeing with anything anyone else has answered here, but, in reality, most micro applications are wasting nearly all of their time. This is not because of bad coding; it is because there is nothing for them to do most of the time!

Take, for example the project I just completed using two micros. The first one is controlling a strip of RGB LEDs in a cabinet. The other is a remote control for it using an RF link.

The remote runs on a battery, so it puts itself to sleep until a button is pressed, keeping battery usage minimal. It then wakes up, sends a serial message to the light controller, and goes back to sleep.

The light controller does even worse. It uses an AC adapter, so I don't bother putting it to sleep. It spends most of its time waiting to see a long mark on the serial receiver. It then reads the message, validates it, and does what it says. If the light pattern is a changing one, it also uses a timer interrupt to trigger the next change.

Lots of wasted time, but PIC micros are unlikely to be useful in bitcoin mining, so what else would I do with the time?

Bob
 

Thread Starter

rt694157

Joined Dec 15, 2019
78
Lots of wasted time, but PIC micros are unlikely to be useful in bitcoin mining, so what else would I do with the time?
In some cases we want to save processor time, for example when we are executing 1000 tasks with about 300 different priorities. I think a linear sequence will not be useful.
 

MrChips

Joined Oct 2, 2009
30,795
...in reality, most micro applications are wasting nearly all of their time.
Yes, that may be true depending on the application. However, are they really "wasting time"?
In your first example, the processor goes to sleep.

In the second example, it could be scanning a video camera, playing music, projecting the time on a wall, whatever its application.
 

Thread Starter

rt694157

Joined Dec 15, 2019
78
A well designed OS (Operating System) will make use of interrupts.
Device X takes 50 ms, Device Y takes 60 ms, and Device Z takes 70 ms.

Loop start
Device X;
Device Y;
Device Z;
Loop End

Interrupt 1: control Device X
Interrupt 2: control Device Y
Interrupt 3: control Device Z

If I want to schedule all three tasks, will the first interrupt be generated at 50 ms, the second interrupt at 60 ms, and the third interrupt at 70 ms?
 

BobaMosfet

Joined Jul 1, 2009
2,113
When a processor is running at 32 MHz, how much time does it take to execute one instruction? I think it would be in the nanoseconds, so the processor responds very fast compared to a mechanical device such as a relay or a switch. Let's say device X takes 50 ms to operate and device Y takes 60 ms to respond; if the processor controls device X, then it is wasting 50 ms.

Can it happen that the processor first sends the instruction to device X, then immediately sends the instruction to device Y, and then sends another instruction, all before device X has responded?

What I mean is that whatever time a mechanical device takes to respond, the processor shouldn't waste that time; it must use it. When there are many tasks, can the processor do this without affecting the other tasks?
Assuming one processor (one core, not multithreaded): if you want to know how much time a processor takes per instruction, you have to know a little bit about the processor. If it uses pipelining or prefetching, it is possible for it to execute two or more assembly-language instructions within a single clock cycle. A clock cycle is defined as:

1 / CPU clock frequency.

1/32,768,000 = 0.00000003051 seconds per heartbeat, aka clock cycle, or about 30.5 ns (that's 32.768 MHz, a common crystal frequency close to your 32 MHz; at exactly 32 MHz it is 31.25 ns). So, depending on architecture, you're going to average slightly fewer or slightly more instruction executions per second than clock cycles, as a very crude rule of thumb. It's important to understand this because it will help you hone your routines to work comfortably in interrupt-driven environments without hogging time.
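If you want to check the arithmetic yourself, here is the same calculation in C:

    #include <stdio.h>

    int main(void)
    {
        const double f_hz = 32768000.0;        /* 32.768 MHz, as above */
        const double t_cycle = 1.0 / f_hz;     /* seconds per clock cycle */
        printf("%.2f ns per cycle\n", t_cycle * 1e9);  /* prints 30.52 */
        return 0;
    }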

A processor, in your example, doesn't waste time controlling a relay. It simply sets a pin high and then moves on. It can come back later (using a feedback loop or, worst case, a simple timer) to determine when to turn that signal off.

You need to understand the difference between what a PROCESSOR is doing and what PROGRAM LOGIC is doing. Processors are interrupt-driven devices. This is why good program logic doesn't interfere with that process and takes advantage of it, to give the illusion that a single processor is doing several things simultaneously.
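Here is a sketch of that "set the pin and move on" pattern, with the simple-timer fallback. RELAY_PORT, RELAY_PIN, and millis() are invented stand-ins for whatever the actual processor provides:

    #include <stdint.h>

    #define RELAY_PIN (1u << 3)             /* placeholder pin assignment */
    extern volatile uint8_t RELAY_PORT;     /* placeholder output register */
    extern uint32_t millis(void);           /* placeholder 1 ms tick counter */

    static uint32_t relay_on_at;

    void relay_on(void)
    {
        RELAY_PORT |= RELAY_PIN;            /* one write, then move on */
        relay_on_at = millis();
    }

    void relay_poll(void)                   /* called from the main loop */
    {
        if ((RELAY_PORT & RELAY_PIN) && millis() - relay_on_at >= 50) {
            RELAY_PORT &= (uint8_t)~RELAY_PIN;  /* 50 ms elapsed: release it */
        }
    }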
 

sagor

Joined Mar 10, 2019
909
Even without external interrupts, you can still time events within the processor. Say you know for a fact that device X takes 50 ms to do its job. Then your code could be like this:
Zero the ms counter for device X
Start a 1 ms timer (optional interrupt routine that is independent of the devices)
Start X
Loop:
    Check if the X timer count >= 50 (or check the timer status and increment the counter)
    If so, process whatever you want to do with X
    If not, start Y
Loop back to some logic that checks either the X timer or some other status.

The 1 ms interrupt routine simply adds 1 to the counter after every trigger; the main program clears the counter or stops the timer when necessary.
Timer interrupts are not necessary either: one can also check the timer status within the main program loop to see if the timer has reached 1 ms, then reset the timer counter and increment the X count. Depending on the timer design, you may be able to get away with a longer period, like 10 ms, or be limited to shorter ones; it all depends on the design of the microprocessor.
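One possible C rendering of that outline, assuming a 1 ms timer interrupt is available; how the ISR gets hooked up, and the start_x()/start_y()/process_x() calls, are placeholders:

    #include <stdint.h>

    volatile uint32_t x_ms = 0;           /* ms elapsed since X was started */

    void timer_1ms_isr(void)              /* assumed to fire every 1 ms */
    {
        x_ms++;
    }

    extern void start_x(void);            /* placeholder device actions */
    extern void start_y(void);
    extern void process_x(void);

    void schedule(void)
    {
        x_ms = 0;                         /* zero the ms counter for device X */
        start_x();
        uint8_t y_started = 0;

        for (;;) {
            if (x_ms >= 50) {             /* X has had its 50 ms */
                process_x();
                break;
            }
            if (!y_started) {             /* meanwhile, get Y going once */
                start_y();
                y_started = 1;
            }
            /* ...check other timers or other status here... */
        }
    }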
 