How do you output current?

Discussion in 'General Electronics Chat' started by Guest3123, Feb 7, 2016.

  1. Guest3123

    Thread Starter Member

    Oct 28, 2014
    312
    18
    That's current, not voltage. There's a difference.

    How do you output current? I know the cellphone will only take what it needs to charge, but let's say I wanted to charge batteries, for example. How would you output, let's say, 1 amp?

    So a 26650 or 18650 battery's fully charged voltage is 4.2 volts, or pretty damn close to it. Let's say I wanted to charge my high-drain, high-current 26650 batteries at, say, 1 amp. My Efest charger does this already; in fact, it can charge at 1/2 amp, 1 amp, and even 2 amps. I've owned the charger for well over two years.

    Is it just a resistor that does that? Kinda like how you limit the current with a resistor for an LED?

    At what voltage do you charge a 26650 battery?
    At what voltage do you charge a 12 V car battery?

    OK, I'm going to take a guess, be it wrong or correct, using Ohm's law: 5 V, 5 Ω, 1 A, 5 W. So I'd need 5 ohms to create 1 amp of current going to the battery? That is, if I'm supposed to use 5 volts to charge a 4.2 V 26650 battery. No, it's not 3.7 V; it's 4.2 V fully charged.

    Anyone care to help with these questions?
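
    A quick numeric check of that guess (a Python sketch; it assumes an ideal 5 V source and a cell already sitting at 4.2 V, so the resistor only drops the difference between the two):

    # Sanity check: do 5 V and 5 ohms really give 1 A into a 4.2 V cell?
    V_SOURCE = 5.0   # assumed supply voltage (V)
    V_BATT = 4.2     # fully charged cell voltage, per the post (V)
    I_TARGET = 1.0   # desired charging current (A)

    # 5 V across 5 ohms gives 1 A only into a dead short. With the battery
    # in the loop, the resistor sees just the difference in voltage:
    print((V_SOURCE - V_BATT) / 5.0)       # 0.16 A, not 1 A

    # To get 1 A, size the resistor for the 0.8 V it actually drops:
    print((V_SOURCE - V_BATT) / I_TARGET)  # 0.8 ohms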
     
  2. Dodgydave

    Distinguished Member

    Jun 22, 2012
    4,969
    744
    There are two types of chargers, constant current and constant voltage, and you can only use one at a time. For NiMH, NiCd, or 18650 batteries, constant current is best; for a car battery, constant voltage.

    You can use a dedicated regulator chip that is designed for that purpose, or an LM338 or LM317 regulator.
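
    For a rough idea of how those parts do it: the datasheet constant-current circuit puts a single resistor between the OUT and ADJ pins, and the regulator holds its roughly 1.25 V reference across that resistor, so the resistor sets the current. A minimal sizing sketch in Python (the 1.25 V figure is nominal; check the datasheet for your part):

    V_REF = 1.25  # nominal LM317/LM338 reference voltage, OUT to ADJ (V)

    def cc_program_resistor(i_out):
        """Resistor (ohms) and its dissipation (W) for a target current."""
        r = V_REF / i_out
        p = V_REF * i_out  # power burned in the program resistor
        return r, p

    for amps in (0.5, 1.0, 2.0):  # the three rates the Efest charger offers
        r, p = cc_program_resistor(amps)
        print(f"{amps} A -> R = {r:.2f} ohm, P = {p:.2f} W")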
     
  3. Guest3123

    Thread Starter Member

    Oct 28, 2014
    312
    18
    So constant current is best...

    You would have to use a chip (LM338 or LM317)?
    LM338 5-Amp Adjustable Regulator
    http://www.ti.com/lit/ds/symlink/lm338.pdf
    Hunted000629.jpg

    So it's not like this then?
    Hunted000628.jpg
     
    Last edited: Feb 7, 2016
  4. MaxHeadRoom

    Expert

    Jul 18, 2013
    10,509
    2,369
    Guest3123 likes this.
  5. ISB123

    Well-Known Member

    May 21, 2014
    1,239
    527
    Using a resistor still provides a constant current as long as the original conditions don't change. If the voltage spikes or drops, the current will increase or decrease with it, and the resistance itself changes with temperature.
     
  6. AnalogKid

    Distinguished Member

    Aug 1, 2013
    4,516
    1,246
    It can be, but that is not optimal. For a constant voltage source Vs and whatever the battery terminal voltage Vb happens to be, the current through the resistor is (Vs - Vb)/R. So you can set the charging current for any particular set of conditions, but as the battery charges up, its terminal voltage increases and thus the charging current decreases. So a fixed resistor can set the peak charging current, but not hold that current throughout the charge cycle.

    A constant current source is the name of a large class of regulator circuits. As the name says, the primary feature is maintaining a constant current through what could be a varying load. It does this by adjusting the output voltage such that I = E/R is held constant as R varies. Depending on the requirements, it can be very simple. If you have a 1000 Vdc source, a 1 megohm resistor draws 1 mA of current. If you increase or decrease the resistance by 1000 ohms, the resulting current varies by only 0.1%. So a 1 kV source with a 1 megohm series resistor makes a pretty good constant current source. Constant enough? It depends.
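
    To put numbers on both points, a short Python sketch (the 5 V / 1 Ω values in the first part are illustrative; the 1 kV / 1 MΩ values match the example above):

    def charge_current(v_source, v_batt, r):
        """I = (Vs - Vb) / R for a resistor between source and battery."""
        return (v_source - v_batt) / r

    # Fixed resistor: the current sags as the cell's terminal voltage rises.
    for v_batt in (3.7, 4.0, 4.2):
        print(f"Vb = {v_batt} V -> I = {charge_current(5.0, v_batt, 1.0):.2f} A")

    # Big source, big resistor: a 1 kohm load change barely moves the current.
    i_nominal = 1000.0 / 1_000_000
    i_shifted = 1000.0 / 1_001_000
    print(f"shift = {100 * (1 - i_shifted / i_nominal):.2f} %")  # ~0.1 %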

    All LM3xx datasheets have a constant current circuit using the part as both the pass element and the current-sensing regulator.

    ak
     
  7. #12

    Expert

    Nov 30, 2010
    16,257
    6,757
    You need to go peek at batteryuniversity.com

    Now, for every constant voltage or current, there is the question: "How constant?"
    A regulator chip can do very well at either of those, but I can make a constant current controller or a constant voltage controller out of a couple of transistors.

    In the olden days, it was said that a very large voltage in series with a resistor made a constant current.
    For example, your battery might sit at 3.7 volts or 4.2 volts. If I started with 100 volts and a resistor of 100 ohms, your battery would receive 0.958 amps to 0.963 amps.
    Those two numbers match each other within about 1/2%. That looks pretty constant to me.
    Same goes for a voltage regulator chip. You have to look up a good one to get a promise of 1/2% accuracy.
    It seems everything is a compromise. I can do "dead on," but why bother? The battery doesn't care about half a percent.
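
    The arithmetic, for anyone checking along (Python):

    # 100 V source, 100 ohm resistor, battery between 3.7 V and 4.2 V.
    i_low = (100 - 4.2) / 100    # 0.958 A at full charge
    i_high = (100 - 3.7) / 100   # 0.963 A when discharged
    print(f"spread = {100 * (i_high - i_low) / i_high:.2f} %")  # ~0.5 %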

    Edit: Came in second to AK. :mad:
     
  8. Lestraveled

    Well-Known Member

    May 19, 2014
    1,957
    1,215
    @Guest3123
    There are circuits that "kind of" give you a current output, like just putting a resistor in series with a voltage source. The current will not be constant, and the voltage will not be constant either. In other words, it is somewhere in between a constant voltage source and a constant current source. You generally get what you pay for: put a 50 cent resistor in series with your power supply and you get a battery charger worth about 50 cents.

    The larger the battery and the more complex the battery chemistry, the more complex the charger needs to be. The opposite is also true. For instance, a cordless hand vac that uses nickel-metal hydride batteries can use a 50 cent charger circuit, because the batteries are very overcharge tolerant and don't require fast charging. On the other hand, a 10 Ah LiPo battery pack that needs to be recharged quickly is likely to rupture if it is overcharged.

    All batteries have a "charging profile" or a charging procedure that produces the highest performance and longest life out of that battery. They are normally a mixture of constant current and constant voltage modes.

    Google how to charge the kind of battery you are using and get as close to that profile as you can.
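
    For lithium-ion, that profile is typically constant current up to 4.2 V per cell, then constant voltage at 4.2 V while the current tapers. A toy Python loop showing the shape of it (the cell model and every number here are made-up assumptions, not a charger design):

    I_CC = 1.0      # constant-current phase: 1 A
    V_CV = 4.2      # constant-voltage phase: hold 4.2 V
    I_TERM = 0.05   # terminate when the tapering current drops below this

    v_batt, r_int = 3.0, 0.1   # crude cell model: EMF plus internal resistance
    charge_ah, dt = 0.0, 0.01  # time step of 0.01 h

    while True:
        # CC mode until the terminal voltage hits V_CV, then CV mode,
        # where the current tapers off on its own.
        i = min(I_CC, (V_CV - v_batt) / r_int)
        if i < I_TERM:
            break
        charge_ah += i * dt
        v_batt += 0.004 * i    # cell EMF creeps up as charge goes in
    print(f"done: {charge_ah:.2f} Ah delivered")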
     
  9. hp1729

    Well-Known Member

    Nov 23, 2015
    1,944
    219
    ??? Current is not pushed out by choice. It is drawn out, determined by the resistance of the load. You can make more current available, but you can't make it flow out of the power supply.
     
  10. wayneh

    Expert

    Sep 9, 2010
    12,093
    3,032
    Well, you sort of can with a constant current supply. It has a max voltage and can't drive the desired current if that max voltage isn't high enough, for instance if there is no load. But if there's a load, the supply is in control of current, not the load.
     
  11. hp1729

    Well-Known Member

    Nov 23, 2015
    1,944
    219
    Does it really? If we have a constant current source of 100 mA and a 100 kΩ load, we won't get 100 mA unless we have the voltage available to do that.
     
  12. wayneh

    Expert

    Sep 9, 2010
    12,093
    3,032
    It will keep trying until the max voltage is reached. If the load resistance is too high, the current falls off.

    I was just pointing out that, while operating in range, the supply is in control.
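
    To put numbers on that: the supply holds its set current only while the voltage it needs stays under its maximum (compliance) voltage. A Python sketch (the 30 V limit is an assumed figure):

    def cc_supply_current(i_set, v_max, r_load):
        """Current a real CC supply delivers into a resistive load."""
        if i_set * r_load <= v_max:
            return i_set           # in range: the supply is in control
        return v_max / r_load      # out of range: the load takes over

    # 100 mA supply with a 30 V compliance limit, per the example above:
    for r in (100, 300, 100_000):
        print(f"{r} ohm -> {1000 * cc_supply_current(0.1, 30.0, r):.1f} mA")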
     
  13. hp1729

    Well-Known Member

    Nov 23, 2015
    1,944
    219
    True.
     
  14. AnalogKid

    Distinguished Member

    Aug 1, 2013
    4,516
    1,246
    <Obvious rejoinder spared>

    My example came from a Tektronix recruiting question back in the 70's.

    ak
     