Mobile power supply

Thread Starter

kate225633

Joined Dec 26, 2008
3
Hi, I am currently in the process of doing a school project on circuits.

What I am about to do is very simple, but the only problem is I don't know how to go about doing it.

I need to light a 12V light bulb using a battery. However, the wire connecting the battery to the bulb has to be 100 metres of copper wire (1mm diameter).

How do I go about making this circuit, bearing in mind that it must work out in the field where there is no electricity supply (only batteries)?

Thanks.

Kate
 

m.majid

Joined May 28, 2008
53
If your light bulb is 1 or 2 watts (drawing roughly 100 to 200 mA at 12V), then no problem, just do it!
But if it consumes more power, 1mm diameter wire is not adequate over a 100 metre distance, because the resistance of the wire will drop too much of the voltage; you would need thicker wire.
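To put rough numbers on that, here is a quick Python sketch (the resistivity figure is the usual room-temperature value for copper; real wire varies a little with purity and temperature):

    import math

    # Resistance of 100 m of 1mm diameter copper wire, from first principles.
    resistivity = 1.7e-8                   # ohm-metres, copper at room temp
    area = math.pi * (1e-3 / 2) ** 2       # cross-section of a 1mm wire
    resistance = resistivity * 100 / area  # about 2.2 ohms per 100 m

    # Voltage lost per 100 m run at the currents mentioned above.
    for current in (0.1, 0.2):
        print(f"{current:.1f} A -> {current * resistance:.2f} V lost per 100 m")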
 

SgtWookie

Joined Jul 17, 2007
22,230
1mm copper wire has a resistance of 2.1 Ohms per 100 meters.
If your 12v lamp is to be 100 meters away from the battery, then you will need 200 meters of wire; one wire for the +12v supply, and one for the return.

This results in your total wire resistance being 4.2 Ohms.

Light bulbs are usually rated by voltage and power in Watts.
You can determine the current required by dividing the power by the voltage: I = P/E (current in Amperes = power in Watts / voltage in Volts), the power equation that works hand in hand with Ohm's Law.

If you had a 3 Watt bulb, then at 12v, your current through the bulb would be 0.25 Amperes.
However, you also have that wire resistance to think about, which will cause a voltage drop.
E=IR, or Voltage = Current * Resistance (another variation of Ohm's Law)
We know the current is 0.25A, and the wire resistance is 4.2 Ohms.
So, 0.25 * 4.2 = 1.05 Volts dropped in the wire. Your 3 Watt lamp, designed to give full brightness at 12v, will only be getting 10.95v.

The higher the wattage of the lamp, the worse the problem gets. As the lamp wattage increases, current in the circuit increases, which causes more and more voltage to be dropped across the 200 meters of wire.
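If it helps, here is a short Python sketch of that trend. It uses the 4.2 Ohm figure from above and a few illustrative wattages, and it treats each bulb as drawing its full rated current, which is a simplification (a real filament draws somewhat less as its voltage sags):

    # Voltage lost in 200 m of 1mm copper wire for bulbs of various wattages.
    wire_resistance = 4.2  # ohms, both conductors
    supply = 12.0          # volts at the battery
    for watts in (3, 6, 12, 24):
        current = watts / supply           # I = P/E
        drop = current * wire_resistance   # E = I*R
        print(f"{watts:2d} W bulb: {current:.2f} A, "
              f"{drop:.2f} V lost, {supply - drop:.2f} V at the bulb")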
 

jpanhalt

Joined Jan 18, 2008
11,087
The answers based on DC are a practical solution; however, since it is a school project, have you considered converting to a higher-voltage AC at the battery, then converting back down to low-voltage AC at the bulb? That way, you could maintain full brightness.
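The reason this works: for the same delivered power, raising the transmission voltage cuts the line current, and the wire loss falls with the square of the current. A rough Python illustration (the 4.2 Ohm figure and 3 W load are carried over from the post above; the voltages are just examples):

    wire_resistance = 4.2  # ohms, 200 m of 1mm copper, both conductors
    power = 3.0            # watts delivered to the load
    for volts in (12, 48, 120):
        current = power / volts                # line current
        loss = current ** 2 * wire_resistance  # I^2 * R lost in the wire
        print(f"{volts:3d} V -> {current * 1000:5.1f} mA, "
              f"{loss * 1000:6.1f} mW lost in the wire")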

John
 

Thread Starter

kate225633

Joined Dec 26, 2008
3
Hi guys, first of all thank you for replying to this post.

SgtWookie, I have decided to change the bulb I am using. I now have a choice of two bulb types: the first is 2.5V, 0.3A (0.75W), and the second is 2.4V, 0.5A (1.2W).

According to my calculations using the formula you provided, E = IR: for the first bulb E = 0.3 * 4.2 = 1.26V, and for the second it's 0.5 * 4.2 = 2.1V.

Therefore, am I right in believing that I need a 3.76V power source for the first bulb to work, and a 4.5V power source for the second? This much voltage is pretty easy to get from AA batteries in series.
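For what it's worth, the same check in Python (the 4.2 Ohm round-trip resistance is from SgtWookie's post; the bulb figures are the ratings above):

    wire_resistance = 4.2  # ohms, 200 m round trip
    bulbs = [("2.5V 0.3A", 2.5, 0.3),
             ("2.4V 0.5A", 2.4, 0.5)]
    for name, volts, amps in bulbs:
        drop = amps * wire_resistance
        print(f"{name} bulb: wire drops {drop:.2f} V, "
              f"so the supply needs {volts + drop:.2f} V")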

Thanks again for your help.
 