Series resistor question

Discussion in 'General Electronics Chat' started by DIYSteve, Jun 8, 2012.

  1. DIYSteve

    Thread Starter New Member

    Oct 27, 2011
    24
    0
    I find references all over the web explaining how to measure a resistor connected in series with a power supply by measuring the voltage drop across it with a voltmeter and doing the calculations. But I haven't yet found what a voltmeter would read if it were connected in series with the resistor and the power source.

    If I have a regulated 12 V source and I connect a 3 ohm resistor to one terminal, then connect one voltmeter lead to the other end of the resistor and the other voltmeter lead to the other terminal of the source, what does the voltmeter read?

    Does this depend on the voltmeter's resistance, and/or the internal resistance of the power source? For practical purposes, where the voltmeter resistance is very high and the power source internal resistance is low, what would the voltmeter read?

    Thanks. This seems like a very simple question that I ought to know the answer to, or at least be able to find online, so I'm a little embarrassed to ask it. But when I look up "measure resistance with voltmeter" I get all the parallel-connected meter answers, or even sites discussing multimeter ohms settings and how to use them.
     
    Last edited: Jun 8, 2012
  2. WBahn

    Moderator

    Mar 31, 2012
    17,718
    4,786
    Do the following:

    Model your supply as an ideal voltage source, V, in series with a source resistor, Rs.

    Model your meter as a resistor, Rm, and assume that the meter will display whatever voltage appears across Rm.

    Model your other resistor as simply R.

    Now connect up your circuit however you want and analyze it to determine the voltage across Rm.

    You can then answer questions like the ones you mentioned.

    BTW: The connection as you describe it is how you would measure a circuit's Thevenin equivalent voltage. There are some caveats if you want to do this in real life, but for the kinds of circuits that most people learning about Thevenin equivalents encounter, it will work fine.
     
  3. K7GUH

    Member

    Jan 28, 2011
    191
    23
    That's a good way to ruin a voltmeter. If you believe in Ohm's law, a three ohm resistor across 12 volts will draw 4 amps. More than enough to let all the blue smoke out of the meter.
     
  4. WBahn

    Moderator

    Mar 31, 2012
    17,718
    4,786
    Keep in mind that it is a voltmeter being asked about.
     
  5. mcgyvr

    AAC Fanatic!

    Oct 15, 2009
    4,769
    969
    You place a multimeter in series with a load to measure the current through the circuit (but this requires moving the red lead to a different terminal on the multimeter). Those ports of the meter are fused (typically one for milliamp readings and one for currents under 10 amps).
     
  6. WBahn

    Moderator

    Mar 31, 2012
    17,718
    4,786
    I'm pretty sure he asked the question he intended to ask (though the first sentence is a bit hard to decipher, although I think it is also extraneous).

    He can correct me if I am wrong, but I think he is asking the following:

    "I know how to measure the voltage across a resistor by placing the meter in parallel with it. But I'm curious what the voltmeter would read if I were to place it in series with the resistor, instead."

    It's a worthwhile question because it then lends itself to asking how you could turn a voltmeter into an ammeter or how you could turn an ammeter into a voltmeter, or how you create ranges on either that the meter didn't come with, and other useful topics.
     
  7. DIYSteve

    Thread Starter New Member

    Oct 27, 2011
    24
    0
    Thank you all for such quick answers.

    Yes, I meant the voltmeter was in series with the resistor and the power supply. That wouldn't hurt the voltmeter any more than eliminating the resistor and connecting the leads directly to the power supply, in which case you would read the supply voltage on the meter.

    My question is what the voltmeter reads if we insert a resistor between the voltmeter and supply.
     
  8. dataman19

    Member

    Dec 26, 2009
    136
    29
    The voltmeter would measure the voltage across the load (the supply voltage minus the drop across the series resistor).
    The amperage would be the main concern; the voltmeter would have to be one with overload protection and/or a shunt.
     
  9. DIYSteve

    Thread Starter New Member

    Oct 27, 2011
    24
    0
    Thank you WBahn, this makes sense. However, I don't know the source resistance (it's the 12V rail of a used regulated computer PSU) or the meter's internal resistance.

    I was wondering: if the source resistance is effectively very low, and the meter's internal resistance is effectively very high, is there a reasonable practical answer for the case of a 3 ohm resistor and a 12 volt source?
     
  10. DIYSteve

    Thread Starter New Member

    Oct 27, 2011
    24
    0
    Again, connecting a voltmeter directly across the terminals of a power supply is an everyday practice, and the current is extremely low. Adding a resistor in series here would reduce it even further.

    Remember, this is a voltmeter, not an ammeter.
     
  11. #12

    Expert

    Nov 30, 2010
    16,246
    6,733
    The 3 ohm resistor and the resistance of the meter form a voltage divider.
    For a 1 million ohm voltmeter, the current will be 12V/1,000,003 ohms.
    The voltage shown by the voltmeter will be 12 × 1,000,000/1,000,003.
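    The divider arithmetic above can be checked with a short script (the 1 MΩ meter resistance is the example value from this post, not a measured figure):

    ```python
    # Voltage divider formed by the 3 ohm series resistor and the meter's
    # internal resistance (1 Mohm assumed, as in the example above).
    V_supply = 12.0      # volts
    R_series = 3.0       # ohms
    R_meter = 1_000_000  # ohms

    current = V_supply / (R_series + R_meter)              # ~12 microamps
    V_reading = V_supply * R_meter / (R_series + R_meter)  # what the meter shows

    print(f"current = {current * 1e6:.3f} uA")
    print(f"reading = {V_reading:.6f} V")  # indistinguishable from 12 V in practice
    ```

    The 3 ohm resistor drops only about 36 µV, which is why the series connection can't usefully measure its resistance.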
     
  12. DIYSteve

    Thread Starter New Member

    Oct 27, 2011
    24
    0
    I think in clarifying my questions I'm realizing the practical answer or at least the trend of it.

    1.) A regulated power supply will act like an almost infinitesimal supply resistance, up to the point where it can no longer regulate well.

    2.) A very low-ohm resistor added in series with the meter's very high internal resistance (say 1 megohm/volt) will cause only a tiny reduction in current, and a correspondingly and proportionately tiny drop in the meter's voltage reading. For practical purposes, this may be unreadable.


    EDIT: whoops wrote this before I saw the above answer!
     
  13. DIYSteve

    Thread Starter New Member

    Oct 27, 2011
    24
    0
    Okay so that won't work.

    So here's my practical problem:

    I want to test the resistance of the 3 ohm resistor. It is a stainless steel wire calculated to be 3 ohms at powered-up temperature with a 12V supply.

    But I want to check that, without the possibility of destroying my supply if the calculation is wrong.

    I'd like to generalize this procedure in case an ohmmeter isn't available; in fact, an ohmmeter won't heat the wire and so won't show a representative resistance anyway.

    Any suggestions?
     
    Last edited: Jun 8, 2012
  14. #12

    Expert

    Nov 30, 2010
    16,246
    6,733
    First impression: expect 4 amps to flow through the 3 ohm resistor that is connected to the 12 volt supply. Measure the current.

    I have a feeling that I missed an important point, like, are you trying to measure the effects of temperature on a stainless steel wire?
     
  15. WBahn

    Moderator

    Mar 31, 2012
    17,718
    4,786
    You have now provided a couple of items of additional information that are very useful, in more ways than one.

    Using a supply from a computer needs to be done with some caution, because such supplies are designed with the assumption that whenever they are turned on, at least some minimum amount of current is being drawn from them. If you aren't drawing at least that much current, the supply will not regulate the output properly. I think, at least with some older supplies, that powering them up without a load would actually damage them after a while, but I don't know if I'm recalling that correctly or not.

    In order to calculate the resistance, you need to know the voltage across the resistor and the current through the resistor.

    To measure the voltage across a resistor, you typically want to put a high value known resistance in parallel with it and then measure the current in that metering resistor. That's essentially what a voltmeter does, though the details vary depending on the type of meter.

    To measure the current through the resistor, you typically want to put a low value known resistance in series with it and then measure the voltage across that current sense resistor. That's essentially what an ammeter does, though again the details vary depending on the type of meter.

    The easiest thing to do would be to get a second meter, set up as an ammeter on something like a 10A range, and put it in series with the wire resistor. However, you need to look at the meter specs because this might be getting down into a region that the meter wasn't really designed to operate in and it might be affecting the measurement significantly. One way to tell is to hook it up and use the voltmeter to measure the voltage across the ammeter; if it is more than about 600mV (5% of the supply voltage), you are probably affecting the circuit more than you would like to.

    If you don't have such an ammeter or would just like to know of an alternative, try to find a 0.1Ω resistor rated at 1W or better to use as your current sense resistor. Then put it in series with your wire resistor and power the whole thing up. Then measure the voltage across the wire and across the current sense resistor. You can either use two voltmeters or just use one voltmeter and take the measurements one at a time.

    Now, anytime you take a measurement you affect what is being measured. In this case, your current is being reduced from nominally 4A to 3.87A, so your wire is going to be a bit cooler than you would like. The meters will also affect the measurement, but you are using them in the way they were designed and their resistance will be high enough that the disturbance will be undetectable.

    The big thing you need to worry about is that your measurement will only be as accurate as your knowledge of the value of that 0.1Ω resistor at the time that the measurement is made. There are two issues. First, just because the resistor claims to be 0.1Ω doesn't mean it is. But you can deal with that by measuring its resistance beforehand. Just using an ohmmeter probably isn't going to be good enough, because most are not designed to measure resistances this small, though some are. You could also measure its resistance the same way that you're going to measure the wire's. This isn't as circular as it sounds. You can use a lower current (and hence use an ammeter with a lower range) or use a precision current sense resistor of a much higher value. So, for instance, you could use a 1.2V battery and a 2Ω 1% 1W resistor. The current will be about 600mA, so you should measure about 60mV across the 0.1Ω resistor, which should be sufficient if your voltmeter is halfway decent.

    The other problem is that resistors change value as they change temperature (whether you would like them to or not). That can be addressed here, to the degree needed, by simply taking the measurements immediately after getting the wire hot enough. If it is going to take a while to get the wire to temp, then short the current sense resistor with something that is much less than 0.1Ω and remove the short just before you are ready to take the measurement.
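    The arithmetic behind the 0.1Ω current-sense method above can be sketched as follows. The wire is assumed to really be 3 ohms just to illustrate how the two voltmeter readings recover its resistance; in practice V_sense and V_wire would come from the meter:

    ```python
    # Sketch of the 0.1 ohm current-sense method described above.
    R_sense = 0.1    # ohms, known current-sense resistor
    V_supply = 12.0  # volts
    R_wire_true = 3.0  # assumed for illustration; this is the unknown in practice

    # With the sense resistor in circuit, current drops from the nominal 4 A:
    I = V_supply / (R_wire_true + R_sense)   # ~3.87 A, the slight disturbance noted above

    # The two voltmeter readings:
    V_sense = I * R_sense                    # across the 0.1 ohm sense resistor
    V_wire = V_supply - V_sense              # across the hot wire

    # Recover the wire's resistance from the two readings:
    R_wire = V_wire / (V_sense / R_sense)
    print(f"I = {I:.3f} A, R_wire = {R_wire:.3f} ohms")
    ```

    The same calculation works for the battery calibration step: 1.2 V across roughly 2.1 Ω gives about 570 mA, hence roughly 57 mV across the sense resistor.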
     
  16. DIYSteve

    Thread Starter New Member

    Oct 27, 2011
    24
    0
    Thanks #12.

    No it's actually a hot wire for foam cutting, and I just didn't want to blow a power supply.

    The problem with expecting 4 amps through the circuit and checking that with an ammeter, is what if it isn't 4 amps, but 8 or 12 due to a mistake in calculation of the wire resistance?

    But I've come to realize now that I can just put a 5 amp fuse inline with the ammeter, which should be fused anyway, and then check the amps.

    The only fly in the ointment would be if the resistance is low enough with the wire cold to blow the fuse before the wire heats up. Not sure about that.
     
  17. WBahn

    Moderator

    Mar 31, 2012
    17,718
    4,786
    Reasonable concern. Put a suitable switch in parallel with the meter and close it before powering up the circuit. Then after the wire is at temp, open the switch and make your measurement.
     
  18. #12

    Expert

    Nov 30, 2010
    16,246
    6,733
    Puzzled that you can't use an ohm meter to measure the alleged 3 ohm piece of wire but...

    You can do this experiment: Use a 24 volt supply and place a known good 3 ohm resistor in series with the alleged 3 ohm piece of wire. The absolute max current (even if the wire turns out to be a dead short) will be 8 amps, which is within the range of a typical ammeter, even the $4 type.

    Of course you can use any voltage greater than 12 with a proportional series resistor, or any voltage less than 12 with no added resistance, and measure the current. This is just one way to set up the experiment. I think it is enough information for you to make progress.
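    A quick worst-case check of the series-resistor test above (assuming the known resistor is exactly 3 ohms):

    ```python
    # Safety check for the series-resistor test: even if the wire's resistance
    # is wildly off (all the way down to a dead short), the known 3 ohm
    # resistor limits the current.
    V = 24.0       # volts, test supply
    R_known = 3.0  # ohms, known good series resistor

    def worst_case_current(r_wire):
        """Current drawn for a given wire resistance."""
        return V / (R_known + r_wire)

    print(worst_case_current(3.0))  # expected case: 4.0 A
    print(worst_case_current(0.0))  # dead short: 8.0 A max
    ```

    This is why the added resistor makes the test safe: the current can never exceed V/R_known no matter how wrong the wire calculation is.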
     
  19. #12

    Expert

    Nov 30, 2010
    16,246
    6,733
    WBahn just did a good setup with the switch idea in post #17

    I've done that myself, but I didn't think of it first today.
     
  20. WBahn

    Moderator

    Mar 31, 2012
    17,718
    4,786
    Because he needs to measure the resistance while the wire is at the temperature it reaches when being powered by the 12V supply. You could rig up a SPDT switch arrangement to switch out the 12V and switch in the other lead of the ohmmeter. These wires cool very quickly, though, so the resistance might change enough during the course of the measurement to give very skewed results.
     