LED Camping Lantern - How to determine amps required

Thread Starter

Lumenosity

Joined Mar 1, 2017
614
Hello,
I have a 750-lumen, battery-powered camping lantern.

It is normally powered by four D-cell batteries.

But I sometimes power it from a larger LiPo battery, and I have a DC-DC buck converter to step the 12.2 V LiPo battery down to the 6 V required by the lantern. The maximum current capacity of the buck converter is 12 amps.

This buck converter has an adjustment for CURRENT as well as voltage. How do I determine the current needs (in amps) of this lantern at its maximum light output setting? Should I just set the buck converter to its maximum of 12 amps and let the lantern draw whatever it needs?

Thanks
 

MrChips

Joined Oct 2, 2009
30,801
The voltage and current adjustments on a power supply set its maximum limits.

Four D-cells would supply 4 × 1.5 V = 6 V.
Adjust the supply for a 6 V output. You may measure the current taken by the lantern at 6 V using an ammeter in series with the lantern, but this is only for your reference; you don't really need to know it.

Adjust the current limit downwards until the output starts to fall below 6 V, then adjust it back upwards until the voltage stays at 6 V.

Your current and voltage limits are now set properly for that lantern.
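
If you want a ballpark figure before measuring, here is a minimal sketch; the 100 lm/W efficacy is purely an assumed typical value for an LED lantern, not a spec for this one:

Code:
# Rough ballpark of the lantern's current draw at full brightness.
# The efficacy figure is an assumption for a typical LED lantern,
# not a measured spec -- use an ammeter for the real number.
lumens = 750             # rated light output
efficacy_lm_per_w = 100  # assumed LED efficacy in lm/W (guess only)
supply_v = 6.0           # four D cells in series

power_w = lumens / efficacy_lm_per_w  # ~7.5 W
current_a = power_w / supply_v        # ~1.25 A

print(f"Estimated power: {power_w:.1f} W")
print(f"Estimated current at {supply_v:.0f} V: {current_a:.2f} A")

Either way, an estimate on that order suggests the converter's 12 A limit has plenty of headroom.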
 
Hello,
I have a 750-lumen, battery-powered camping lantern.

It is normally powered by four D-cell batteries.

But I sometimes power it from a larger LiPo battery, and I have a DC-DC buck converter to step the 12.2 V LiPo battery down to the 6 V required by the lantern. The maximum current capacity of the buck converter is 12 amps.

This buck converter has an adjustment for CURRENT as well as voltage. How do I determine the current needs (in amps) of this lantern at its maximum light output setting? Should I just set the buck converter to its maximum of 12 amps and let the lantern draw whatever it needs?

Thanks
I think the easiest would be to turn on the lamp using the D-cell batteries and measure the current draw with a multimeter. That would be the most reliable approach, because if your LiPo oversupplies current it could blow up, or a lot of energy could be wasted.
 

ebeowulf17

Joined Aug 12, 2014
3,307
I think the easiest would be to turn on the lamp using the D-cell batteries and measure the current draw with a multimeter. That would be the most reliable approach, because if your LiPo oversupplies current it could blow up, or a lot of energy could be wasted.
As long as the voltage is stepped down to 6 V, the lamp will draw whatever current it needs. The LiPo can't "oversupply" amperage.
 
As long as the voltage is stepped down to 6 V, the lamp will draw whatever current it needs. The LiPo can't "oversupply" amperage.
In my experience I think it can, unless the lamp has a circuit to limit the current; otherwise the current just goes in and things heat up, especially the cable, and burn. That's why a phone charger that supplies 5 volts has a current limit for the phone: for example, 0.5 to 1 amp for phone charging and 1 to 2 amps for an iPad.

If the current goes higher than that, the phone will heat up and burn or blow up.
 

Thread Starter

Lumenosity

Joined Mar 1, 2017
614
As long as the voltage is stepped down to 6 V, the lamp will draw whatever current it needs. The LiPo can't "oversupply" amperage.
This was my understanding.
An example might be a car battery, which is "capable" of supplying a LOT of amps, but there are many bulbs and devices in the car that only use a few milliamps, FAR less than the battery can provide.

All these devices only draw what they need, not what the battery CAN supply.

Somewhat like a water pump in a pond: even if you add a LOT more water to the pond, the pump will still only move what it can... no more.
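
A minimal sketch of that idea, with made-up resistances for the loads:

Code:
# Each load draws a current set by its own resistance (Ohm's law),
# regardless of how much current the source could deliver.
# These resistances are illustrative values, not real car specs.
battery_v = 12.0
loads = {"dome light": 24.0, "headlight": 2.4, "dash LED": 600.0}  # ohms

for name, r_ohms in loads.items():
    i_a = battery_v / r_ohms
    print(f"{name}: {i_a * 1000:.0f} mA")

# A battery able to deliver hundreds of amps changes none of these numbers.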
 
This was my understanding.
An example might be a car battery, which is "capable" of supplying a LOT of amps, but there are many bulbs and devices in the car that only use a few milliamps, FAR less than the battery can provide.

All these devices only draw what they need, not what the battery CAN supply.

Somewhat like a water pump in a pond: even if you add a LOT more water to the pond, the pump will still only move what it can... no more.
I am quite skeptical of this; it may be logically true, but it seems practically uncertain, because in my experience, if you supply 5 volts to a phone and increase the current to 4 or 5 amps, I would guess it will just blow up as it heats up.

There is a reason they limit the phone charging current to around 0.5 to 1 amp; you can see it in the charger spec. If the statement above is true, why would the phone companies need to reduce the current on the charger and build electronic circuitry to limit it? Perhaps some expert in this forum can enlighten me if I am wrong.
 

ebeowulf17

Joined Aug 12, 2014
3,307
I am quite skeptical of this; it may be logically true, but it seems practically uncertain, because in my experience, if you supply 5 volts to a phone and increase the current to 4 or 5 amps, I would guess it will just blow up as it heats up.

There is a reason they limit the phone charging current to around 0.5 to 1 amp; you can see it in the charger spec. If the statement above is true, why would the phone companies need to reduce the current on the charger and build electronic circuitry to limit it? Perhaps some expert in this forum can enlighten me if I am wrong.
Battery-charging circuits are a special case, and a rather complicated one at that. Light bulbs have their own current-limiting resistance. LEDs need an external resistor or other current-limiting device to provide this, but that is still part of the typical LED circuit, not something that's defined by the power supply itself.
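
To illustrate that external current limiting, here is a minimal sketch; the forward voltage and target current are assumed example values, not specs from the lantern in this thread:

Code:
# Sizing the series resistor for a single LED -- the resistor,
# not the supply, sets the LED current.
supply_v = 6.0    # supply voltage
led_vf = 3.0      # assumed LED forward voltage drop
target_i = 0.020  # assumed target current, 20 mA

r_ohms = (supply_v - led_vf) / target_i
p_w = (supply_v - led_vf) * target_i
print(f"Series resistor: {r_ohms:.0f} ohms, dissipating {p_w * 1000:.0f} mW")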
 

BobTPH

Joined Jun 5, 2013
8,952
I am quite skeptical of this; it may be logically true, but it seems practically uncertain, because in my experience, if you supply 5 volts to a phone and increase the current to 4 or 5 amps, I would guess it will just blow up as it heats up.

There is a reason they limit the phone charging current to around 0.5 to 1 amp; you can see it in the charger spec. If the statement above is true, why would the phone companies need to reduce the current on the charger and build electronic circuitry to limit it? Perhaps some expert in this forum can enlighten me if I am wrong.
That may be your understanding, but it is incorrect.

If you have a power supply that can provide 100 A at 5 V, you can still use it to charge a phone without damaging the phone.

I think you are confusing this with what would happen if you put 5 V directly on the battery rather than going through the proper charging circuit that is built into the phone.

Bob
 
That may be your understanding, but it is incorrect.

If you have a power supply that can provide 100 A at 5 V, you can still use it to charge a phone without damaging the phone.

I think you are confusing this with what would happen if you put 5 V directly on the battery rather than going through the proper charging circuit that is built into the phone.

Bob
Does that mean that if you supply 100 A, 5 V power to a 5 V, 0.2 A light bulb, it won't heat up and blow out?
 

BobTPH

Joined Jun 5, 2013
8,952
What do you think you are doing when you plug a 30-watt bulb into the same AC socket that can power a 1500-watt space heater?

Bob
 
What do you think you are doing when you plug a 30-watt bulb into the same AC socket that can power a 1500-watt space heater?

Bob
That depends: if the bulb has a circuit inside to match its wattage, it will be fine; but if it's connected directly, I guess it will blow out.
 

BobTPH

Joined Jun 5, 2013
8,952
I am talking about an incandescent bulb. It consists of a tungsten wire between the two terminals. There is no circuit to limit the current; Ohm's law does that.

Electric and electronic devices, with a few exceptions, are designed to operate at a specific voltage. The amount of current the power supply can deliver must be greater than or equal to what the device uses. If the power source can supply more, that is not a problem. If it cannot supply enough, that is a problem and it might damage the power source. Lithium batteries, for example, might explode.
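
To put rough numbers on the wall-socket example, assuming a 120 V circuit:

Code:
# Each device on the same 120 V outlet draws a current set by its
# own power rating (I = P / V), not by what the outlet could supply.
line_v = 120.0
devices = {"30 W bulb": 30.0, "1500 W space heater": 1500.0}  # watts

for name, p_w in devices.items():
    print(f"{name}: {p_w / line_v:.2f} A")

# Both are safe on the same outlet; each draws only what it needs.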

Bob
 

Reloadron

Joined Jan 15, 2015
7,517
My house has 240 VAC, 60 Hz, 200 A service (split phase). My standard wall receptacles are on AWG 12 wire and 20 A circuit breakers providing 120 VAC, 60 Hz power. I guess we could say that each 120 VAC, 20 A branch circuit is capable of providing 120 × 20 = 2400 watts. When I plug in a 60, 75, or 100 watt incandescent lamp, it simply lights; it does not explode, burn up, or do anything else unexpected. My truck has a large 350 amp-hour battery, and my headlights do not explode or prematurely burn out; they only draw a few amps each, likewise for the radio and other accessories.

A lantern, or any light source, or for that matter anything like a radio or accessory, will draw its rated current at a given voltage. It doesn't matter what the current capability of the source is, as long as it exceeds the required current. Obviously, if my source is 6 volts and its current is limited to, for example, 2.0 amps, I can't place a 6-volt, 4.0-amp load on it. When demand exceeds what is available, something is going to give.
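
That last check is just a comparison, sketched here with the same example numbers:

Code:
# Does the source's current limit cover the load? Values match the
# illustrative 6 V example above.
source_limit_a = 2.0
load_current_a = 4.0  # a 6 V, 4.0 A load

if load_current_a <= source_limit_a:
    print("Enough headroom; the load draws its rated current.")
else:
    print("Demand exceeds the limit -- the voltage sags or something gives.")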

Ron
 

Thread Starter

Lumenosity

Joined Mar 1, 2017
614
Remember.....

Current provides the power.......but Voltage provides the "Push".....:)

"Don't over-push the power with too much voltage and you'll be ok"

Or so I've been told.....
 
Remember.....

Current provides the power.......but Voltage provides the "Push".....:)

"Don't over-push the power with too much voltage and you'll be ok"

Or so I've been told.....
OK, got it. Thanks, all, for the schooling. But what about battery charging? Why would they limit the amount of current?
 

ebeowulf17

Joined Aug 12, 2014
3,307
OK, got it. Thanks, all, for the schooling. But what about battery charging? Why would they limit the amount of current?
Batteries have relatively low internal resistance. So if you apply a high voltage to them, the current flowing into the battery is high, which can charge the battery too quickly and cause it to overheat. Because of this, proper battery charging systems limit the amount of current they'll provide when charging a battery.
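
A rough sketch of why that limiting matters; the internal resistance here is an assumed illustrative value, not a datasheet figure:

Code:
# A battery's internal resistance is so low that an unregulated
# source would push an enormous current into it.
charger_v = 5.0
battery_v = 3.7    # typical Li-ion cell voltage
r_internal = 0.05  # assumed internal resistance, ohms

i_unregulated = (charger_v - battery_v) / r_internal
print(f"Unregulated charge current: {i_unregulated:.0f} A")  # ~26 A

# A real charger regulates this down to a safe rate, e.g. around 1 A.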

When it comes to electronics projects, people often try to build their own simple charging systems, which can potentially be dangerous if they're not aware of all the rules and limitations.

Most professionally made charging systems have all the necessary safeguards and limits built in so that the user doesn't have to worry about anything.

As far as charging iPhones goes, it's a bit of a grey area. Apple refuses to say precisely which chargers work with which products, and there's a lot of mythology on the internet around these issues. The iPhone definitely has some internal circuitry to stop charging at the right state of charge, but it's not entirely clear to me whether the phone has internal current limiting for charging purposes or relies on the charger for that. This is way beyond my area of expertise, so I won't speculate.

To sum up, I'm not really sure if it matters which charger you use for your phone, because the phone may have all the regulatory circuitry built in. I definitely do know that *something* has to regulate the amount of current used to charge a battery, because the battery's own resistance wouldn't be enough to keep the current down at safe levels.

In case it's not clear already, charging batteries is a very, very different scenario than lighting a lamp, or powering most other electronics devices.
 