I understand a few things about timers in microcontrollers, but I still don't fully understand them, and I would really appreciate some help. I have spent about four to five hours searching the internet, but things are still not clear to me.
So far I have understood that a microcontroller has timers which are used to measure a specific time; for example, if a 12 ms delay is required, a timer can be configured for it. Timers come in different sizes, e.g. 8-bit, 16-bit and 32-bit. A timer module has a timer control register and a timer (count) register that also need to be configured.
I am trying to understand how the specific time is calculated for a timer. What are the main points to remember when calculating a time period?
For example, does it make sense to calculate the time period for the following configuration?
Timer mode: 8-bit
Clock source: 5 MHz
Prescaler: 1/4
Auto-reload: Yes
Timer start value: 100
What time period will this configuration give? My own attempt at the calculation is below.
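Here is a minimal C sketch of how I think the calculation works, assuming the timer counts up from the start value and overflows when it wraps past 255 (8-bit), and that the 1/4 prescaler means the counter ticks at 5 MHz / 4 = 1.25 MHz, i.e. one tick every 0.8 µs. Please correct me if any of these assumptions are wrong:

```c
/* Rough sketch of the timer period calculation (my assumptions, not a
 * specific microcontroller's datasheet): the counter starts at 100,
 * increments once per prescaled clock tick, and overflows at 256. */
#include <stdio.h>

int main(void)
{
    const double clock_hz  = 5000000.0; /* 5 MHz clock source            */
    const double prescaler = 4.0;       /* 1/4 prescaler divides the clock */
    const double start     = 100.0;     /* timer start / reload value    */
    const double overflow  = 256.0;     /* 8-bit counter rolls over here */

    double tick_s   = prescaler / clock_hz;  /* time per timer tick: 0.8 us */
    double counts   = overflow - start;      /* ticks until overflow: 156   */
    double period_s = counts * tick_s;       /* time until overflow         */

    printf("tick   = %.3f us\n", tick_s * 1e6);   /* prints 0.800 us  */
    printf("period = %.1f us\n", period_s * 1e6); /* prints 124.8 us  */
    return 0;
}
```

If that reasoning is right, the timer would overflow every (256 − 100) × 0.8 µs = 124.8 µs, and since auto-reload is enabled it would reload 100 and repeat that period continuously. Is that the correct way to think about it?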