newbie question

Discussion in 'General Electronics Chat' started by Senz_90, Sep 26, 2013.

  1. Senz_90

    Thread Starter Member

    Jul 11, 2013
    70
    0
    I was just reading an analogy for Ohm's Law about current and voltage, but I still don't clearly understand "With resistance steady, current follows voltage (an increase in voltage means an increase in current, and vice versa)". It says that when the voltage increases, the current increases too, with the resistance held steady.

    I mean in practice. For example, I have a PSU with dual (maybe triple) outputs: 12V = 5A, 5V = 10A, 3V = 20A.

    Does this mean that when the voltage decreases, the current just increases? Or are those just specifications?
     
  2. Dodgydave

    Distinguished Member

    Jun 22, 2012
    4,969
    744
    The current depends on the voltage applied across the load, so you can have 5V across 1 ohm and draw 5 amps, or 20V across 4 ohms and draw 5 amps. The PSU gives out only what is needed, up to its maximum.
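
    A quick numeric sketch of that in Python (the two resistor values are just the ones from this example, assuming a purely resistive load):

        # Ohm's law for a purely resistive load: I = V / R
        def current(volts, ohms):
            return volts / ohms

        print(current(5, 1))   # 5 V across 1 ohm   -> 5.0 A
        print(current(20, 4))  # 20 V across 4 ohms -> 5.0 A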
     
  3. shortbus

    AAC Fanatic!

    Sep 30, 2009
    4,004
    1,523
    I'm assuming from the voltages that this power supply is from a computer. This is a bad example for your question. Each of those voltages is generated by a separate circuit within the PSU, not one circuit at different levels.
     
  4. Brownout

    Well-Known Member

    Jan 10, 2012
    2,375
    998
    That means the 12V output is capable of supplying 5A, the 5V can supply 10A, and the 3V can supply 20A. It has nothing whatsoever to do with Ohm's law.

    But let's look at the power output for each:

    12V*5A=60W
    5V*10A=50W
    3V*20A=60W.

    See where this is going? The three supplies output about the same power. So, in order for that to happen, the current goes up as the voltage goes down.
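
    The same arithmetic as a short Python check (just the three ratings listed above, using P = V * I):

        # Power delivered by each PSU rail at its rated current: P = V * I
        rails = [(12, 5), (5, 10), (3, 20)]   # (volts, rated amps) from the PSU label
        for volts, amps in rails:
            print(f"{volts} V rail: {volts * amps} W")
        # All three land near 60 W, so the current rating climbs as the voltage drops.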
     
  5. Senz_90

    Thread Starter Member

    Jul 11, 2013
    70
    0
    OK, I got it. Maybe this example works too: if I have an adjustable power supply with a 500mA transformer (ignoring the math for the diode voltage drops), does the current increase when I increase the voltage? Say it uses an LM317 IC with a voltage range of 1.2-30V, for example. When the voltage is 3V, how can I calculate the current?
     
  6. DerStrom8

    Well-Known Member

    Feb 20, 2011
    2,428
    1,328
    In order to calculate the current from the voltage, you need to know the resistance.

    Use Ohm's law:

    V = I * R

    Or in your case,

    I = V / R

    Notice I just rearranged the equation. I is current, V is voltage, and R is resistance. So in order to find the current, simply divide the voltage (3V) by the resistance in the circuit powered by the LM317, and you'll get the current.
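
    For instance, a minimal sketch assuming a hypothetical 100 ohm load on the LM317 output (that load value is made up purely for illustration):

        # I = V / R: current drawn from the 3 V output into an assumed 100 ohm load
        v_out = 3.0     # volts, the regulator setting from the question
        r_load = 100.0  # ohms, made-up example load
        print(f"{(v_out / r_load) * 1000:.0f} mA")   # -> 30 mA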

    Matt
     
  7. Brownout

    Well-Known Member

    Jan 10, 2012
    2,375
    998
    If you're using a 500mA transformer and a linear regulator to knock down the voltage, you can draw a maximum of 500mA from your supply, regardless of the final voltage from the regulator. Also, the maximum power dissipation of your regulator may impose an even tighter restriction on your output current.
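
    As a rough sketch of how that second limit can bite (the 32 V input and the 2 W no-heatsink dissipation figure are assumptions, not values from the supply in question):

        # Which limit is hit first: the transformer rating or the regulator's dissipation?
        v_in = 32.0          # volts into the linear regulator (assumed)
        v_out = 3.0          # regulator output setting
        p_diss_max = 2.0     # watts the package can shed without a heatsink (assumed)
        i_transformer = 0.5  # amps, the transformer rating

        i_thermal = p_diss_max / (v_in - v_out)   # current at which dissipation hits the limit
        i_max = min(i_transformer, i_thermal)
        print(f"thermal limit {i_thermal*1000:.0f} mA, usable maximum {i_max*1000:.0f} mA")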
     
  8. WBahn

    Moderator

    Mar 31, 2012
    17,716
    4,788
    Keep in mind that Ohm's law is a very specific relationship. It describes the relationship between the voltage across a resistor and the current through that same resistor. That's it. Nothing more. A power supply is not a resistor, so Ohm's law doesn't apply to it.
     
    anhnha likes this.
  9. Senz_90

    Thread Starter Member

    Jul 11, 2013
    70
    0
    OK, I got it. So it's only useful when we measure a circuit that has components like resistors?
     
  10. Senz_90

    Thread Starter Member

    Jul 11, 2013
    70
    0
    OK, so it only applies when we have a resistance to measure. I got it, thank you.

    So the current stays at 500mA whatever the output voltage, is that what you mean? And the power the regulator must dissipate is higher when I set my linear adjustment to a higher voltage? Right?
     
  11. DerStrom8

    Well-Known Member

    Feb 20, 2011
    2,428
    1,328
    It's affected when you have a complete circuit. When you have a complete circuit, you have a voltage difference across an element, and that allows current to flow. The element can be an actual resistor, or just a wire (which will have a very low resistance, but is a resistor nonetheless). You generally want to avoid extremely low resistances (short circuits), as they will try to draw too much current from the source and possibly burn it out.
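
    To put numbers on that last point (the resistances and the 12 V source are made-up figures):

        # Even a plain wire is a resistor; a very small R demands a very large I for a given V.
        v = 12.0
        for r in (100.0, 1.0, 0.1):   # ohms: normal load, heavy load, near-short (made up)
            print(f"R = {r:>5} ohm -> I = {v / r:.1f} A")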
     
  12. Brownout

    Well-Known Member

    Jan 10, 2012
    2,375
    998
    No, I'm saying that if you have a 500mA transformer, the most current you can use is 500mA, no matter what voltage you have at the output of your regulator. The transformer's voltage AND current ratings stay the same.
     
  13. Senz_90

    Thread Starter Member

    Jul 11, 2013
    70
    0
    OK, I understand. Yeah, I know a wire has very little resistance. When we put our PSU's high current across that little resistance, of course it makes everything hot and burns, just like a method for finding a short circuit :D

    Ohh, I got it. So no matter what voltage, it has a 500mA current flowing through it. Thank you.
     
  14. DerStrom8

    Well-Known Member

    Feb 20, 2011
    2,428
    1,328
    No, that is not what he's saying. If you have a constant resistance, increased voltage means increased current. Just look at Ohm's Law.

    V = IR

    Set R to, say, 1Ω. It's very obvious that:

    V = I*1
    V = I

    As V increases, so does I.

    In order to keep a constant current, your V and R would both have to increase or decrease proportionally.

    If you have a 500mA voltage source, then that means it can only supply UP TO 500mA (though it will be less in most cases).

    Current sources, on the other hand, are a different matter. If you have a 500mA current source, it will maintain a constant current and the voltage will change according to the resistance, all based on the relationship in Ohm's Law.
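
    Here is a small sketch of that contrast, assuming ideal sources and purely resistive loads (the load values are made up):

        # Ideal constant-voltage vs constant-current source across a few loads (made-up values).
        loads = [2.0, 10.0, 50.0]   # ohms

        v_set = 5.0    # constant-voltage source: V is fixed, I = V / R follows the load
        i_set = 0.5    # constant-current source: I is fixed, V = I * R follows the load

        for r in loads:
            print(f"R={r:>4} ohm | CV source: I = {v_set / r:.2f} A | CC source: V = {i_set * r:.1f} V")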

    I suggest reading up on Constant Voltage Sources and Constant Current Sources.

    Matt
     
  15. WBahn

    Moderator

    Mar 31, 2012
    17,716
    4,788
    The 500mA is not how much current WILL flow in the transformer; it is only how much current the transformer is rated for, to prevent damaging it.

    How much current will actually flow depends on the circuit it is powering. Clearly if the transformer output is open, then no current is flowing in the output and only minimal current will be flowing in the input windings.

    If you are using an LM317 configured as a constant voltage source, then it will supply as much current as it can while maintaining the 3V output. The limit can come from several sources. The chip itself has a maximum current that it is rated for. The chip has a maximum power dissipation and you are likely to hit that pretty quickly if you are running it from 32+V (which you would need to in order to have your supply be adjustable up to 30V). The circuit powering it will have a maximum current it can provide without dropping the regulator voltage below its dropout limit.

    Getting back to the notion of a constant current source. Imagine taking your variable voltage supply and connecting it to a load. Now measure the current and adjust the supply voltage up or down until you get the current you want. Now change the load and readjust the supply up or down until you get back to that same current. Now change the load again and repeat this process. That is all that a constant current source is doing -- it is constantly adjusting the output voltage in order to make the output current that it is sensing be equal to the target value. And, just like you doing things manually, there are limits. If you put in too large a resistor, your supply may not be able to put out enough voltage to drive the desired current through it. The same is true with a constant current source.
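
    That adjust-and-remeasure loop can be sketched in a few lines (the target current, loads, supply limit and step size are all made-up values):

        # Toy model of a constant-current source: nudge the output voltage up until the
        # measured current matches the target, within the supply's voltage limit.
        target_i = 0.1     # amps we want to hold (assumed)
        v_max = 30.0       # highest voltage the supply can produce (assumed)

        for r_load in (47.0, 150.0, 470.0):          # ohms, made-up loads
            v = 0.0
            while v < v_max and v / r_load < target_i:
                v += 0.01                            # "turn the knob" a little
            i = v / r_load
            status = "ok" if abs(i - target_i) < 0.001 else "hit the voltage limit"
            print(f"R={r_load:>5} ohm -> V={v:5.2f} V, I={i*1000:5.1f} mA ({status})")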
     
  16. Senz_90

    Thread Starter Member

    Jul 11, 2013
    70
    0
    Ha, I understand now. So the MAX current I can draw is about 500mA, but in fact I know it will be a little below that. I have to increase or decrease V and R proportionally to get a constant current. Hmm, OK, got it. I'll get to reading what you suggested soon. Thank you.


    I understand your statement. I just gave the LM317 as an example. I know the chip would need to dissipate a lot of power (maybe enough to destroy it) if I fed it 32+V in order to get 30V out. I just forgot the max current and voltage from the datasheet, so the example I gave was only rough. Yeah, I'm a "learn by doing" person, so I want to read up on it and make some measurements with my adjustable power supply to get it into my head. By the way, thank you for your advice when I wanted to convert my high-range voltmeter to 30 FSD. Now I have the adjustable PSU with that voltmeter :D
    I just don't want anything hurt or burned around my house, so I won't try to make an ammeter until I've read all the basics carefully.
     
    Last edited: Oct 4, 2013
  17. Senz_90

    Thread Starter Member

    Jul 11, 2013
    70
    0
    Okay, it looks like a very simple thing when you only learn it from theory without practice, so I did an experiment with my two VOMs and a simple series LED circuit. As I increase the voltage with the resistance at the same value, the current increases too. As the voltage decreases with the resistance steady, the current decreases too. So when I want a steady current or a steady voltage, as with a constant voltage or constant current source, I have to change the voltage and resistance proportionally. Thank you for the useful advice.
    Now I really understand this relationship!
     
  18. crutschow

    Expert

    Mar 14, 2008
    12,990
    3,226
    No. The power dissipated in the regulator is the current through it times the voltage across it [Pdiss = I * (Vin - Vout)], so for a given load current, the higher you set the regulator output voltage, the lower its dissipation. The current, in turn, depends upon the output voltage and the output load resistance (I = Vout / Rout).
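
    Plugging numbers into that first formula, for the same load current at a few output settings (the 32 V input and the 100 mA current are assumptions, not values from this thread):

        # Pdiss = I * (Vin - Vout): for the same load current, a higher Vout means less
        # voltage dropped across the regulator and therefore less heat in it.
        v_in = 32.0   # volts into the regulator (assumed, echoing the earlier LM317 example)
        i = 0.1       # amps of load current (held constant for the comparison)

        for v_out in (3.0, 15.0, 30.0):
            p_diss = i * (v_in - v_out)
            print(f"Vout = {v_out:>4} V -> regulator dissipates {p_diss:.1f} W")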
     
  19. MrChips

    Moderator

    Oct 2, 2009
    12,421
    3,356
    Careful!!!

    Ohm's Law is I = V/R

    LEDs are non-linear and therefore do not obey Ohm's Law.

    That is, the resistance of the LED is not constant.

    To avoid destroying your LED you must put a resistor in series with the LED. Ohm's Law applies to the resistor because the resistance is a constant.

    Measure the series current and the voltage across the resistor and apply Ohm's Law.
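
    A small sketch of that measurement, assuming a hypothetical 5 V supply, a 2 V LED forward drop and a 330 ohm series resistor (none of these values come from the thread):

        # Ohm's law applied to the series resistor only: I = V_resistor / R.
        # The LED's forward drop is roughly constant, so the resistor sets the current.
        v_supply = 5.0    # volts (assumed)
        v_led = 2.0       # approximate LED forward drop (assumed)
        r_series = 330.0  # ohms (assumed)

        v_resistor = v_supply - v_led      # what you would measure across the resistor
        i = v_resistor / r_series          # the series current through resistor and LED
        print(f"I = {i * 1000:.1f} mA")    # about 9 mA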
     
  20. Senz_90

    Thread Starter Member

    Jul 11, 2013
    70
    0
    OK, I got it. So when I set a higher output voltage, there is lower power dissipation. Thanks for the formula; I worked through an example with it and it's helping me make progress :D By the way, does "output load resistance" mean the resistance that we apply the output voltage to? I mean, when I give 3V to a 1K ohm series circuit, the current is 3mA, right? I have tried it before with my ammeter.

    Thank you for the advice.

    Of course I put a resistor in series and watched what happens when I change the resistor value and the voltage. I also calculated it beforehand from the LED's max current, rather than blindly putting a resistor in. With the resistor in place, when the voltage decreases the LED dims, and vice versa. I also measured the voltage drop across the resistor.
    I have made circuits before, but my basics for calculating volts, amps, RMS, etc. in complex circuits aren't good, so I want to learn them again to make sure I make good progress and know the math for each component regarding voltage, frequency, current, etc.
    Thank you for the advice.
     