Dimmer LED with shorter "on" time

Thread Starter

FroceMaster

Joined Jan 28, 2012
699
Hi,
I'm working on my project and have just a little question.

I use a 16F* microcontroller
connected to some LEDs.
Everything runs in the program as a matrix, so the LEDs are switched on/off many times.
If I shorten the time each LED is on, will this give me a "dimmer" effect?
For example, right now I use 5 ms on for one group, 5 ms for the next group and 5 ms for the last group, and then repeat again and again.

If I instead use 2 ms on, etc., will the LEDs then give less light?

I know the loop will then run faster, but how else should I do it?

It's for a clock I want to dim at night.

rgds
Sten
 

ElectricSpidey

Joined Dec 2, 2017
2,757
If you reduce the on time and leave the off time the same, you will dim the LED, but you will also increase the frequency.

If you wish to keep the frequency the same, increase the off time by the amount you reduce the on time.
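
In C that idea looks something like the sketch below. It's minimal and hedged: group_on(), group_off() and wait_ms() are hypothetical stand-ins for your own port writes and delay routine.

```c
extern void group_on(void);             /* your port writes and delay     */
extern void group_off(void);            /* routine, provided elsewhere    */
extern void wait_ms(unsigned char ms);

#define PERIOD_MS 10                    /* fixed period = fixed frequency */

void drive_group(unsigned char on_ms)   /* 0 .. PERIOD_MS */
{
    group_on();
    wait_ms(on_ms);                     /* shorter on time = dimmer LED   */
    group_off();
    wait_ms(PERIOD_MS - on_ms);         /* off time fills out the period  */
}
```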
 

Thread Starter

FroceMaster

Joined Jan 28, 2012
699
Have now cooked this up.
I connected my MCU to control a TL494.
The question is now: can I control the duty cycle by sending in 1-5 volts, and thereby fade the "load"?
Or do I need a trimmer instead?

Rgds.
tl494 test.png
 

Dodgydave

Joined Jun 22, 2012
11,284
You don't need a TL494. Just alter the ON/OFF times as you stated and use MOSFETs to drive the LEDs, OR, if your MCU has an internal PWM controller, use that.

Also, Pin 14 is a 5 V reference output, so you can't tie it to ground.
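
If your 16F part does have a CCP module, the hardware PWM route only takes a few register writes. A minimal sketch, assuming a mid-range device like the 16F877A and the XC8 compiler; register and pin names vary between devices, so check your datasheet:

```c
#include <xc.h>

/* Set up hardware PWM on CCP1 (pin RC2 on many mid-range 16F parts). */
void pwm_init(void)
{
    TRISC2 = 0;       /* CCP1 pin as output                           */
    PR2 = 0xFF;       /* PWM period = (PR2 + 1) * 4 * Tosc * prescale */
    CCP1CON = 0x0C;   /* CCP1 in PWM mode                             */
    CCPR1L = 0x00;    /* duty cycle 0: LED off                        */
    T2CON = 0x04;     /* Timer2 on, 1:1 prescale                      */
}

void pwm_set_duty(unsigned char duty)
{
    CCPR1L = duty;    /* 0x00 = off ... 0xFF = (nearly) full on       */
}
```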
 

djsfantasi

Joined Apr 11, 2010
9,156
  • Read the analog voltage on a pin. One way to generate a 0-5 volt signal is to wire a 10KΩ pot as a voltage divider and connect the wiper to the analog pin you're reading.
  • Add 0.5 to the measured value and take the integer of the result. This gives you a value between 0 and 5.
  • Decide on a period. A refresh rate of 50 Hz has a period of 20 ms.
  • Map the calculated input value to a ms value representing an on time. You should get six values between 0 and the period (20 ms in this example). This represents an on time of 0% to 100%.
  • Calculate an off time equal to the period minus the on time.
  • Turn on the LED for the on time.
  • Turn off the LED for the off time.
  • "Lather, rinse, repeat."
Use the microseconds (or milliseconds) function to determine the delay times, so you can do other work while you're waiting; a sketch of these steps follows below.
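
Put together in plain C, those steps might look like this. Only a sketch: read_adc_volts(), led_on(), led_off() and wait_ms() are hypothetical helpers standing in for whatever your ADC, port and timing routines are called.

```c
extern float read_adc_volts(void);           /* hypothetical helpers,     */
extern void led_on(void);                    /* provided by your own      */
extern void led_off(void);                   /* ADC/port/timer code       */
extern void wait_ms(int ms);

#define PERIOD_MS 20                         /* 50 Hz refresh rate        */

void dim_loop(void)
{
    for (;;) {
        float v = read_adc_volts();          /* 0..5 V from the pot wiper */
        int level = (int)(v + 0.5f);         /* round: gives 0..5         */
        int on_ms = level * PERIOD_MS / 5;   /* six steps: 0..20 ms       */
        int off_ms = PERIOD_MS - on_ms;      /* period minus on time      */

        led_on();
        wait_ms(on_ms);
        led_off();
        wait_ms(off_ms);
    }
}
```

Swapping the blocking wait_ms() calls for millis()-style timestamp comparisons is what frees the loop to do other work while it waits.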
 

Reloadron

Joined Jan 15, 2015
7,501
OK, so you have a working system; just modify your code. Saying 5 ms on time by itself says nothing, because without knowing the off time there is no way to calculate the duty cycle. It would be nice if you posted your code and a schematic, a complete schematic, of what you currently have. What you have works, so all you should need to do is reduce the duty cycle (dim) or increase the duty cycle (brighten).

If you want it dimmer at night and your uC has an analog input, just map the analog in to a PWM out, and use an LDR (Light Dependent Resistor) in a simple series circuit with a fixed resistor to detect ambient light and set your LED PWM accordingly.
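
A sketch of that mapping, with read_adc() and pwm_set_duty() as hypothetical stand-ins for your own 10-bit ADC read and PWM duty-cycle write, and assuming the LDR divider is wired so a brighter room reads as a higher voltage:

```c
extern unsigned int read_adc(void);      /* hypothetical: 0..1023 from the */
extern void pwm_set_duty(unsigned char); /* LDR divider; your PWM setter   */

/* Track ambient light: dim the display as the room gets darker. */
void auto_dim(void)
{
    unsigned int light = read_adc();     /* brighter room = higher reading */
    unsigned char duty = light >> 2;     /* scale 10-bit ADC to 8-bit PWM  */
    if (duty < 10)                       /* floor keeps the clock readable */
        duty = 10;                       /* in a fully dark room           */
    pwm_set_duty(duty);
}
```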

Ron
 