Hi All
I have a project I'm working on at the moment that uses a PC application to send what are essentially 'bit sequences' (via a serial link) to a PIC16F877A microcontroller (running at 10 MHz).
Currently I have the project to the point where the microcontroller receives my serial string and sets a corresponding bit pattern on its outputs.
The PC application is set up so that it sends, say, 0001001, then waits (say) 1 second, as instructed, before sending the next bit pattern in its sequence.
This works great; however, I was asked to add another capability to the setup.
When any bit is turned on at the receiving microcontroller end, it should remain on for approximately 1 second after its 'turn on' instruction.
Simple, you say? Yes, but my problem arises when another sequence arrives only, say, 0.3 s later while the other bit is still timing.
My idea is to run an internal timer/counter that ticks approximately every 0.1 second. When a bit is instructed to turn on, the corresponding element of an array is set to 10; on each timer tick every element is decremented, and the microcontroller drives an output bit high wherever its element is still greater than zero. A rough sketch of what I mean is below.
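For illustration, here's a minimal sketch of that countdown scheme in C (XC8-style register names for the PIC16F877A are assumed; I've put the outputs on PORTB and left the port-direction setup out, and the names tick/apply_pattern are just mine):

#include <xc.h>
#include <stdint.h>

#define NUM_BITS 7   /* one array element per output bit          */
#define ON_TICKS 10  /* 10 ticks x ~0.1 s = ~1 s of on-time       */

volatile uint8_t hold[NUM_BITS];  /* countdown per output bit     */

/* Called roughly every 0.1 s from the timer tick:
   decrement each live countdown, then rebuild the output port
   with a bit high wherever the countdown is still non-zero. */
void tick(void)
{
    uint8_t mask = 0;
    for (uint8_t i = 0; i < NUM_BITS; i++) {
        if (hold[i] > 0)
            hold[i]--;
        if (hold[i] > 0)
            mask |= (uint8_t)(1u << i);
    }
    PORTB = mask;  /* assuming the outputs live on PORTB */
}

/* Called when a new serial pattern arrives:
   restart the 1 s window for every bit that is '1', without
   disturbing bits that are still timing from an earlier pattern. */
void apply_pattern(uint8_t pattern)
{
    for (uint8_t i = 0; i < NUM_BITS; i++) {
        if (pattern & (1u << i))
            hold[i] = ON_TICKS;
    }
}

Because apply_pattern only reloads the bits named in the new pattern, a sequence arriving 0.3 s into another bit's window leaves that bit's countdown running untouched.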
I guess my biggest query is: if you are listening for serial data with a simple polling loop and also doing an operation driven by a hardware timer/counter, will you miss some/many incoming serial sequences? (Because a hardware timer/counter acts like an interrupt, correct?)
My application doesn't require the 1-second 'on time' to be precise; however, the serial strings may be only 0.1 second apart (never less, and most often more). The overall structure I'm imagining is sketched below.
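Here is roughly the structure I have in mind, with both the UART receive and Timer0 handled in the one interrupt vector and the real work done in the main loop (XC8 syntax and PIC16F877A register names assumed; baud-rate, config-bit, and TRIS setup omitted):

#include <xc.h>
#include <stdint.h>

volatile uint8_t rx_byte;
volatile uint8_t rx_ready;    /* flag: new serial byte waiting    */
volatile uint8_t t0_ovf;      /* Timer0 overflows since last tick */

/* The PIC16F877A has a single interrupt vector, so the ISR
   checks each enabled source in turn and keeps the work short. */
void __interrupt() isr(void)
{
    if (PIR1bits.RCIF) {      /* UART byte received               */
        rx_byte = RCREG;      /* reading RCREG clears the flag    */
        rx_ready = 1;
    }
    if (INTCONbits.T0IF) {    /* Timer0 overflow, ~26 ms at 10 MHz */
        INTCONbits.T0IF = 0;  /* with a 1:256 prescaler           */
        t0_ovf++;
    }
}

void main(void)
{
    OPTION_REGbits.T0CS = 0;  /* Timer0 clocked from Fosc/4       */
    OPTION_REGbits.PSA = 0;   /* prescaler assigned to Timer0     */
    OPTION_REGbits.PS0 = 1;   /* 1:256 prescale -> ~26 ms/overflow */
    OPTION_REGbits.PS1 = 1;
    OPTION_REGbits.PS2 = 1;
    RCSTAbits.SPEN = 1;       /* enable the serial port           */
    RCSTAbits.CREN = 1;       /* continuous receive               */
    PIE1bits.RCIE = 1;        /* UART receive interrupt           */
    INTCONbits.T0IE = 1;      /* Timer0 overflow interrupt        */
    INTCONbits.PEIE = 1;
    INTCONbits.GIE = 1;

    while (1) {
        if (rx_ready) {       /* handle serial outside the ISR    */
            rx_ready = 0;
            /* parse the byte / apply_pattern() here */
        }
        if (t0_ovf >= 4) {    /* 4 overflows = ~0.1 s tick        */
            t0_ovf = 0;
            /* tick() here */
        }
    }
}

My understanding is that at typical baud rates a byte takes on the order of a millisecond to arrive, both interrupt handlers above are only a few instructions long, and the UART hardware buffers a couple of received bytes, so nothing should be lost, but I'd appreciate confirmation of that.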
Your ideas would be greatly welcomed; I have no experience as to whether hardware timers and serial comms play nicely together.
Regards
Bruce