Hi,
I'm working on my project and have a little question.
I'm using a 16F* microcontroller
connected to some LEDs.
Everything runs in the program, driving a matrix, so the LEDs are switched on/off many times.
If I change the time a LED is on, will this give me a "dimmer" effect?
For example, right now I use 5 ms on for one group, 5 ms for the next group, and 5 ms for the last group, and then repeat again and again.
If I then use 2 ms on instead, will there be less "light" from the LEDs?
I know the loop will then run through faster, but how else could I do it?
It's for a clock I want to dim at night.
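To show what I mean, here is a rough sketch in C of the loop. set_group(), all_off(), and delay_ms() are just placeholder names for my own port writes and delay routine, not a real compiler's API, and padding the slot with off-time is only my guess at how to keep the refresh rate the same:

    /* Placeholder names for my own routines -- not a real compiler's API. */
    void set_group(unsigned char g);   /* drive one LED group's pins */
    void all_off(void);                /* blank the whole matrix */
    void delay_ms(unsigned char ms);   /* busy-wait for ms milliseconds */

    #define SLOT_MS 5                  /* total time per group, kept fixed */

    unsigned char on_ms = 5;           /* 5 = full brightness, 2 = dimmed */

    void show_frame(void)
    {
        unsigned char group;

        for (group = 0; group < 3; group++) {
            set_group(group);              /* light this group */
            delay_ms(on_ms);               /* lit part of the 5 ms slot */
            if (on_ms < SLOT_MS) {
                all_off();                 /* blank for the rest of the slot */
                delay_ms(SLOT_MS - on_ms); /* so the refresh rate stays the same */
            }
        }
    }

Then at night I would just set on_ms = 2 and each slot would still take 5 ms.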
rgds
Sten