Mobile power supply

Discussion in 'General Electronics Chat' started by kate225633, Dec 26, 2008.

  1. kate225633

    Thread Starter New Member

    Dec 26, 2008
    Hi, I am currently in the process of doing a school project on circuits.

    What I am about to do is very simple, but the only problem is I don't know how to go about doing it.

    I need to light a 12V light bulb using a battery. However, the wire connecting the battery to the bulb has to be 100 metres of copper wire (diameter 1mm).

    How do I go about making this circuit, bearing in mind that it must work out in the field where there is no mains electricity supply (only batteries)?


  2. m.majid

    Active Member

    May 28, 2008
    If your light bulb is 1 or 2 watts (drawing roughly 100 to 200 mA), then no problem, just do it!
    But if it consumes more power, 1 mm dia wire is not adequate over a 100 metre run, because the resistance of the wire will drop too much of the voltage before it reaches the bulb, so you must use thicker wire.
    Last edited: Dec 26, 2008
  3. SgtWookie


    Jul 17, 2007
    1mm copper wire has a resistance of 2.1 Ohms per 100 meters.
    If your 12v lamp is to be 100 meters away from the battery, then you will need 200 meters of wire; one wire for the +12v supply, and one for the return.

    This results in your total wire resistance being 4.2 Ohms.
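    That round-trip figure can be sanity-checked from copper's resistivity. A minimal Python sketch, assuming annealed copper at roughly room temperature (the exact resistivity varies slightly with alloy and temperature):

    ```python
    import math

    RESISTIVITY_CU = 1.72e-8  # ohm-metres, annealed copper at ~20 C (assumed)
    DIAMETER = 1e-3           # 1 mm wire diameter, in metres
    LENGTH = 200.0            # 100 m out plus 100 m back

    area = math.pi * (DIAMETER / 2) ** 2          # cross-sectional area, m^2
    resistance = RESISTIVITY_CU * LENGTH / area   # R = rho * L / A

    print(f"Round-trip wire resistance: {resistance:.2f} ohms")
    ```

    This lands within a few percent of the 4.2 Ohms quoted above; the small difference comes down to the resistivity value used.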

    Light bulbs are usually rated by voltage and power in Watts.
    You can determine the current required by dividing Watts by Volts: I = P/E (Current in Amperes = Power in Watts / Voltage in Volts). This is the power formula, used hand-in-hand with Ohm's Law.

    If you had a 3 Watt bulb, then at 12v, your current through the bulb would be 0.25 Amperes.
    However, you also have that wire resistance to think about, which will cause a voltage drop.
    E=IR, or Voltage = Current * Resistance (another variation of Ohm's Law)
    We know the current is 0.25 A, and the wire resistance is 4.2 Ohms.
    So, 0.25 A * 4.2 Ohms = 1.05 Volts dropped across the wire. Your 3 Watt lamp, designed to give full brightness at 12 V, will only be getting 10.95 V.

    The higher the wattage of the lamp, the worse the problem gets. As the lamp wattage increases, current in the circuit increases, which causes more and more voltage to be dropped across the 200 meters of wire.
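    The trend described above can be sketched in a few lines of Python. This treats each lamp as drawing its full rated current, which is a simplification: in reality a filament's resistance changes with temperature, and the reduced voltage would pull the current down somewhat.

    ```python
    WIRE_RESISTANCE = 4.2  # ohms, 200 m round trip of 1 mm copper wire
    SUPPLY = 12.0          # volts at the battery

    for watts in (3, 6, 12, 24):
        current = watts / SUPPLY           # I = P / E at rated voltage
        drop = current * WIRE_RESISTANCE   # E = I * R, lost in the wire
        at_lamp = SUPPLY - drop            # what actually reaches the lamp
        print(f"{watts:>2} W lamp: {current:.2f} A, "
              f"{drop:.2f} V lost in wire, {at_lamp:.2f} V at lamp")
    ```

    A 3 W lamp loses about a volt in the wire, but a 24 W lamp would lose most of the supply to the wiring.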
  4. jpanhalt

    AAC Fanatic!

    Jan 18, 2008
    The answers based on DC are a practical solution; however, since it is a school project, have you considered converting to a higher-voltage AC at the battery, then converting back down to low-voltage AC at the bulb? Raising the voltage lowers the current in the long wire for the same power, so less is lost in it. That way, you could maintain full brightness.

  5. kate225633

    Thread Starter New Member

    Dec 26, 2008
    Hi guys, first of all thank you for replying to this post.

    SgtWookie, I have decided to change the bulb I am using. I now have a choice of two bulb types, the first one is 2.5V 0.3A and 0.75W and the second is 2.4V, 0.5A and 1.2W.

    SgtWookie, according to my calculations using the formula you provided, E = IR: for the first bulb E = 0.3 * 4.2 = 1.26 V, and for the second it's 0.5 * 4.2 = 2.1 V.

    Therefore, am I right in believing that I need a 3.76 V power source for the first bulb to work (2.5 V + 1.26 V) and a 4.5 V power source for the second bulb (2.4 V + 2.1 V)? This much voltage is pretty easy to get from AA batteries in series.
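    Those figures can be double-checked with a short Python sketch, again assuming each bulb draws its rated current (an approximation, since filament current varies with the voltage it actually sees):

    ```python
    WIRE_RESISTANCE = 4.2  # ohms, 200 m round trip of 1 mm copper wire

    bulbs = [
        ("2.5 V / 0.3 A bulb", 2.5, 0.3),
        ("2.4 V / 0.5 A bulb", 2.4, 0.5),
    ]

    for name, v_bulb, amps in bulbs:
        drop = amps * WIRE_RESISTANCE  # E = I * R, lost in the wire
        supply = v_bulb + drop         # battery voltage needed at the source
        print(f"{name}: wire drop {drop:.2f} V, supply needed {supply:.2f} V")
    ```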

    Thanks again for your help.
  6. Audioguru

    New Member

    Dec 20, 2007
    The voltage of each AA alkaline battery cell drops quickly at such high currents.
  7. kate225633

    Thread Starter New Member

    Dec 26, 2008
    I didn't realise that 0.3A was a high current! Also I only need it to work for about 5 minutes, so not a long time at all.