> I can use an RTC, but the problem is how to prove to him that the 10-hour delay is being produced. The evaluation might be completed in 1.5 hours, so how can I fast-forward the time?

Are you allowed to use an RTC? If yes, you can use a DS1307:
https://www.theengineeringprojects.com/2019/04/introduction-to-ds1307.html
If no, you will have to configure the timers of the 8051.
> Yes, I am thinking of using a seven-segment display for that.

Hi Cyfer,
Does the timer delay you have already made have a display which shows the state of the counter?
You can't fast-forward time, but you can speed it up. Write the delay for a very slow clock rate, then run the clock ten times faster for the demo.
#include <reg51.h> // 8051 SFR definitions (Keil C51)

// Segment patterns for 0-9 on a common-cathode 7-segment display
const unsigned char segmentCodes[10] = {
    0x3F, // 0
    0x06, // 1
    0x5B, // 2
    0x4F, // 3
    0x66, // 4
    0x6D, // 5
    0x7D, // 6
    0x07, // 7
    0x7F, // 8
    0x6F  // 9
};

// Busy-wait for one full Timer 1 overflow: 65536 machine cycles,
// about 71.1 ms with an 11.0592 MHz crystal (12 clocks per machine cycle)
void timer1Delay(void) {
    TH1 = 0x00;       // Start the 16-bit count from zero
    TL1 = 0x00;
    TF1 = 0;          // Clear the overflow flag
    TR1 = 1;          // Start Timer 1
    while (TF1 == 0); // Wait for the overflow
    TR1 = 0;          // Stop Timer 1
    TF1 = 0;          // Clear the flag for the next call
}

// One multiplexing pass over a 4-digit 7-segment display.
// Note: the digits are lit only while this function runs, so as written
// the display is dark between updates; for a steady display, multiplex
// continuously (e.g. from a timer interrupt).
void displayProgress(int count) {
    int digit1, digit2, digit3, digit4;
    int i;
    volatile int delay; // volatile so the hold loop is not optimized away

    // Extract each decimal digit
    digit1 = count % 10;          // Units
    digit2 = (count / 10) % 10;   // Tens
    digit3 = (count / 100) % 10;  // Hundreds
    digit4 = (count / 1000) % 10; // Thousands

    // Drive each digit briefly in turn
    for (i = 0; i < 4; i++) {
        switch (i) {
        case 0:
            P2 = segmentCodes[digit1]; // Units pattern
            P3 = 0x01;                 // Enable digit 1
            break;
        case 1:
            P2 = segmentCodes[digit2]; // Tens pattern
            P3 = 0x02;                 // Enable digit 2
            break;
        case 2:
            P2 = segmentCodes[digit3]; // Hundreds pattern
            P3 = 0x04;                 // Enable digit 3
            break;
        case 3:
            P2 = segmentCodes[digit4]; // Thousands pattern
            P3 = 0x08;                 // Enable digit 4
            break;
        }
        // Brief hold so the digit is visible
        for (delay = 0; delay < 1000; delay++);
        P3 = 0x00; // Blank before switching to the next digit
    }
}

void main(void) {
    int overflowCount = 0;   // Timer 1 overflows since the last tick
    int progressCounter = 0; // One count per ~60 s

    TMOD = 0x10; // Timer 1, Mode 1 (16-bit)

    while (1) {
        timer1Delay(); // ~71.1 ms per overflow
        overflowCount++;

        // 844 overflows x 71.1 ms ~= 60 s
        // (the original 1000 gave ~71.1 s per tick, not one minute)
        if (overflowCount >= 844) {
            overflowCount = 0;
            progressCounter++;                // Minutes elapsed
            displayProgress(progressCounter); // Refresh the display once

            // 600 minute-ticks ~= 10 hours
            if (progressCounter >= 600) {
                progressCounter = 0; // Reset, or signal completion here
            }
        }
    }
}
> I have written the code just above your reply. Can you look at it and tell me if it is correct?

You don't need to spend 10 hours in the lab - all you need to spend is the time representing the error in your timer. For example, if you are confident it will time 10 hours with a maximum error of +/- 10 minutes, you can set it going and come back in 9 hours 50 minutes; you then only need to spend a maximum of 20 minutes waiting for it to time out to determine the actual error.
No I can't, but it is probably incorrect, as it appears to be written in C, a language innately prone to inadvertent obfuscation.
> My prof gave me this project. I can produce the delay, but how can I verify it? I don't have 10 hours to spend in the lab. Can you please help me with this?

> The XTAL frequency is 11.0592 MHz on an 8051 microcontroller, and I have posted the code for my program just above in the thread. If possible, can you take a look and point out any errors?

A couple of questions for you first:
1. How far off can it be and still be considered correct?
2. What kind of IDE are you working with?
I ask about the IDE because some IDEs let you simulate the working of the controller. That allows you to jump around long code sections and still see the time accumulated on the timer in the IDE. In other words, if you have code that takes 1 minute to run normally, you may jump to the instruction just after that 1-minute block of code; the IDE timer would still show 60 seconds even though the jump took less than a millisecond. You can do this with longer code too. Maybe you have a 1-hour code block: once you jump over it, the IDE timer would still show 3600 seconds (1 hour) even though the jump took less than a second.
They don't call it a jump, though; it might just be running to a breakpoint.
You can set the clock speed to be faster too, then just multiply by the same amount. So if the IDE timer says 60 seconds and you increased the clock speed by 10 times, it would really have progressed by 600 seconds.
Another way is to make a shorter delay and then just call it a lot of times. If you make a delay that lasts for 60 seconds and you call it 60 times, that's 3600 seconds or 1 hour. If you change that delay to just 1 second, it would execute in 60 seconds, which would be a scaled version of your actual needed time length.
When you do this, though, you have to take into account the time it takes to call and return from the delay routine, unless you do not need that much accuracy. Ideally you can get the total time accurate to whatever the controller's crystal spec allows.
It depends on how accurate you want that 10 hours to be.