LED power consumption

Ya’akov

Joined Jan 27, 2019
9,165
I’m sorry, I got off on the wrong foot with this thread. I apologize for seeming so harsh; it’s my fault, partly because of some unwarranted assumptions I made and partly because your initial question described something impractical and you said, “I know there are problems with what I want to do but I am going to do it anyway,” without explaining why those problems didn’t matter.

Of course, you can connect and power your LEDs any way you want; they are your LEDs. It’s just that so many times people go through a bunch of detailed help and it turns out that they needed something completely different, which would have been clear from the outset had they described the problem rather than their proposed solution.

I sincerely apologize if I’ve offended you. I’ll just move on.
 

Thread Starter

christiannielsen

Joined Jun 30, 2019
387
I gave up on my questions long ago when someone started discussing what current my LEDs needed, putting words in my mouth, and how my eyes react to brightness.

So I started testing and learning more by making this setup with an actual power supply. I thought I would measure and calculate to learn, but then you wrote that my measurements are incidental.

So my question is: how can I make a realistic test setup to choose from if my measurements are incidental? I get that most LEDs are rated at 20 mA, but the ones I have apparently illuminate at 2.6 V and 0.34 mA. Or is that an incidental measurement?
 

Thread Starter

christiannielsen

Joined Jun 30, 2019
387
If your LEDs take 20mA at 3.0V, why are you using 20mA as the current at 2.6V? It might be as little as 2mA at 2.6V, which makes my calculation in post #8 way wrong. If you intend to run them at 2.6V, you need to measure that current and redo the calculation.

Bob
This is what I think I did with this setup, but someone tells me my measurement won't work?
9 LEDs with a Vf of 2.6 volts per LED and an If of 0.34 mA.
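[Editor's note: to illustrate Bob's point that LED current falls off steeply below the rated voltage, here is a minimal sketch using a simple exponential (Shockley-style) diode model. The 20 mA / 3.0 V reference point and the slope factor are illustrative assumptions, roughly fitted to the numbers in this thread, not values from any datasheet; a real LED has to be measured.]

```python
import math

# Minimal sketch of why LED current collapses at lower forward voltage.
# The 20 mA / 3.0 V reference point and the slope factor N_VT_EFF are
# ILLUSTRATIVE assumptions, not datasheet values.

N_VT_EFF = 0.1  # assumed effective slope: ideality factor * thermal voltage

def led_current(vf, i_ref=0.020, v_ref=3.0):
    """Estimate forward current at voltage vf, scaled from an assumed
    reference point of 20 mA at 3.0 V."""
    return i_ref * math.exp((vf - v_ref) / N_VT_EFF)

for vf in (3.0, 2.8, 2.6):
    print(f"Vf = {vf:.1f} V -> I ~ {led_current(vf) * 1000:.2f} mA")
```

With these assumed parameters, dropping 0.4 V below the rated point cuts the current by a factor of about fifty, which is why the 20 mA rating says nothing about the current at 2.6 V.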
 

Ya’akov

Joined Jan 27, 2019
9,165
I gave up on my questions long ago when someone started discussing what current my LEDs needed, putting words in my mouth, and how my eyes react to brightness.

So I started testing and learning more by making this setup with an actual power supply. I thought I would measure and calculate to learn, but then you wrote that my measurements are incidental.

So my question is: how can I make a realistic test setup to choose from if my measurements are incidental? I get that most LEDs are rated at 20 mA, but the ones I have apparently illuminate at 2.6 V and 0.34 mA. Or is that an incidental measurement?
OK, here’s the best help I can offer. I still have no idea what you will be using the LEDs for, so some part of this is speculative. In particular, you never have to run LEDs at their current rating. You can run them lower and extend their life or reduce their output. You can run them higher (within limits) and get more output for a shorter life.

To decide, you have to choose the light output you need. This will depend on the application. If I assume that your (literal) eyeballing of the output is correct, then…

Power consumption for a given voltage is simple. The measured current at the chosen voltage is simply multiplied by that voltage, and you have watts. But in your case, I think that for the number you want, you really just need to multiply the consumption of each LED at that voltage by the number of LEDs to get the total current.

That total current times the chosen forward voltage will give you the overall power requirement regardless of the ultimate supply voltage. There will be losses, so it will have to be somewhat higher in practice, but otherwise it will give you a good number for the design.
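[Editor's note: as a worked example of the calculation above, here is a short sketch using the figures from this thread (9 LEDs at 2.6 V drawing 0.34 mA each). The parallel arrangement at a common forward voltage is an assumption, and a real supply needs some headroom for the losses mentioned.]

```python
# Worked example of the calculation above, using this thread's figures.
# Assumes 9 LEDs in parallel, all at the same forward voltage.

num_leds = 9
vf = 2.6              # chosen forward voltage, volts
i_per_led = 0.34e-3   # measured current per LED at that voltage, amps

total_current = num_leds * i_per_led   # total supply current, amps
total_power = vf * total_current       # watts, before supply losses

print(f"Total current: {total_current * 1000:.2f} mA")  # 3.06 mA
print(f"Total power:   {total_power * 1000:.2f} mW")    # ~7.96 mW
```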
 

Audioguru

Joined Oct 21, 2019
6,692
Is your multimeter designed to test the forward voltage of LEDs at 20mA, or is it designed to see if an ordinary diode works at a very low current of maybe 0.3mA?
My multimeter uses a very low current on its "diode test", which barely lights an LED, so its forward voltage measurement is lower than at 20mA.
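[Editor's note: Audioguru's point can be sketched by inverting the same kind of exponential model used earlier. Again, the slope factor and the 20 mA / 3.0 V reference point are illustrative assumptions: the lower the test current, the lower the forward voltage the meter reads.]

```python
import math

# Minimal sketch of why a low-current "diode test" reads a lower Vf.
# Inverts the same illustrative exponential model as above; the slope
# factor and the 20 mA / 3.0 V reference point are assumptions.

N_VT_EFF = 0.1  # assumed effective slope (ideality factor * thermal voltage)

def vf_at(current, i_ref=0.020, v_ref=3.0):
    """Estimate the forward voltage an LED shows at a given test current."""
    return v_ref + N_VT_EFF * math.log(current / i_ref)

print(f"Vf at a 20 mA test current:   {vf_at(0.020):.2f} V")    # 3.00 V
print(f"Vf at a 0.34 mA test current: {vf_at(0.00034):.2f} V")  # ~2.59 V
```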
 

Thread Starter

christiannielsen

Joined Jun 30, 2019
387
I tried to straighten this out in #19, where you first started using 20mA instead of the actual measurement. I even tested to be sure.
Yeah, as I wrote, I was only trying to learn the calculations. The actual numbers weren't crucial for me. Also, you said the direct opposite of Audioguru, who didn't believe my LEDs would illuminate at anything under 3 volts and 20mA. That confused me. And when you started writing that my measurements were incidental (not valid?), I had no idea what to go with.

OK, here’s the best help I can offer. I still have no idea what you will be using the LEDs for, so some part of this is speculative. In particular, you never have to run LEDs at their current rating. You can run them lower and extend their life or reduce their output. You can run them higher (within limits) and get more output for a shorter life.

To decide, you have to choose the light output you need. This will depend on the application. If I assume that your (literal) eyeballing of the output is correct, then…

Power consumption for a given voltage is simple. The measured current at the chosen voltage is simply multiplied by that voltage, and you have watts. But in your case, I think that for the number you want, you really just need to multiply the consumption of each LED at that voltage by the number of LEDs to get the total current.

That total current times the chosen forward voltage will give you the overall power requirement regardless of the ultimate supply voltage. There will be losses, so it will have to be somewhat higher in practice, but otherwise it will give you a good number for the design.
Thanks for clearing that up for me. Now I know I can run LEDs at 2.6 volts at only 0.34 mA. Then why would I run them at 3 volts and 20 mA (or even 2 mA)? It would be a total waste of power, right?

There never needed to be a project for you to know about. If I could have made my questions understandable to you guys, then I simply needed a yes or no to my questions.
 

Thread Starter

christiannielsen

Joined Jun 30, 2019
387
Is your multimeter designed to test the forward voltage of LEDs at 20mA, or is it designed to see if an ordinary diode works at a very low current of maybe 0.3mA?
My multimeter uses a very low current on its "diode test", which barely lights an LED, so its forward voltage measurement is lower than at 20mA.
I didn't use a specific LED test mode. I measured the If in the circuit I mentioned in #29.
 

Ya’akov

Joined Jan 27, 2019
9,165
Yeah, as I wrote, I was only trying to learn the calculations. The actual numbers weren't crucial for me. Also, you said the direct opposite of Audioguru, who didn't believe my LEDs would illuminate at anything under 3 volts and 20mA. That confused me. And when you started writing that my measurements were incidental (not valid?), I had no idea what to go with.



Thanks for clearing that up for me. Now I know I can run LEDs at 2.6 volts at only 0.34 mA. Then why would I run them at 3 volts and 20 mA (or even 2 mA)? It would be a total waste of power, right?

There never needed to be a project for you to know about. If I could have made my questions understandable to you guys, then I simply needed a yes or no to my questions.
You need to scale the current to the light output requirements. There is no obvious reason to make tens of LEDs "just light up".

Anyway, if your question has been answered, that's great. But next time, consider explaining the problem you are trying to solve, so the people trying to help can understand what you are trying to do instead of making all sorts of guesses.

Knowing the problem being solved is important for people to help you, even if you don't think it is.
 

BobTPH

Joined Jun 5, 2013
8,967
If you are happy with the brightness at 2.6V and 0.34 mA then that is fine. I do not find it surprising that it would take only 2.6V and 0.34 mA to light visibly.

The reason people are confused is that you would not tell us what you were trying to do. Many assumed you were trying to use these as a light source, rather than just to be visible.

And we still don’t know under what conditions they need to be visible. I think you would find that they are not visible at all in bright sunlight.

This is why holding back information gets you bad answers.

Bob
 
I measured the current, but apparently that measurement wasn't valid, and I don't get why.
Is something wrong with my multimeter? Because my multimeter doesn't know what it says in the datasheet of my LEDs. My guess was that the multimeter measures how much power is running through it no matter what I hook it up to? I don't get it.

The DMM ammeter inserts a resistor. The voltage drop is USUALLY <0.6V because that's easy to protect to. If you use a DMM, you need two DVMs: one to look at voltage and the other at current. It's best to use a shunt resistor that you know won't affect the output. You probably don't own a feedback ammeter. I need a set of $400.00 cables for the one that will measure up to 1 amp. My other one measures up to 2 mA. The latter has a maximum 200uV voltage drop.
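[Editor's note: here is a minimal sketch of the shunt-resistor method described above. The shunt value and the voltmeter reading are illustrative assumptions, not measurements from this thread; they are chosen so the shunt's own drop stays far below the ~0.6 V a DMM ammeter range can insert.]

```python
# Minimal sketch of the shunt-resistor current measurement described
# above. The shunt value and voltmeter reading are illustrative
# assumptions, not measurements from this thread.

r_shunt = 100.0     # ohms; assumed small enough not to disturb a
                    # ~2.6 V / sub-mA LED circuit
v_shunt = 0.034     # volts read across the shunt with a DVM

current = v_shunt / r_shunt  # Ohm's law gives the circuit current
print(f"Current:        {current * 1000:.2f} mA")  # 0.34 mA

# The drop the shunt itself introduces -- only 34 mV here, far below
# the ~0.6 V a typical DMM ammeter range may insert:
print(f"Burden voltage: {v_shunt * 1000:.1f} mV")
```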
 