Help with Shunts for meter....

Thread Starter

atldave

Joined Jul 26, 2010
13
OK, I have a PM128 (Data sheet).
PM128: 1 mA and an impedance of 10 MΩ

The adjustable (volts and amps) power supply is rated to give 18 V at 1 A. I couldn't get the right transformer, so I have a 12.6 V, 1.2 A transformer feeding the PS. (It really should have a 14-16 V center-tapped 1 A transformer, but I couldn't find one at the time.)

When I use the method here, I get an answer of 18.01 Ω and an 18 W rating for the resistor.


When I use an algebraic method (s = shunt, m = meter):
Vshunt = Vmeter
Is·Rs = Im·Rm
Is = Itotal − Im = 0.999 A
Rs = (Im·Rm)/Is = (0.001 A × 10,000,000 Ω)/0.999 A ≈ 10,010 Ω :confused:

Now, considering that the impedance for the PM128 is so high, I would think that it could be ignored?
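For what it's worth, the arithmetic above can be checked with a quick sketch (this assumes, as I did, that the 1 mA and 10 MΩ figures describe a current meter, i.e. the classic galvanometer-shunt formula):

```python
# Classic galvanometer-shunt calculation: the shunt sees the same
# voltage as the meter, so Is * Rs = Im * Rm.
Im = 0.001        # assumed meter full-scale current, A
Rm = 10_000_000   # assumed meter input impedance, ohms
I_total = 1.0     # desired full-scale current, A

Is = I_total - Im            # current that must flow through the shunt
Rs = (Im * Rm) / Is          # shunt resistance
print(f"Rs = {Rs:.2f} ohms") # ~10010.01 ohms, matching the algebra above
```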

But wait! There's more!
I have the PS connected to terminals.
I have the PM connected to those same terminals.
And I have a switch that will put a 1 Ω - 10 Watt resistor as a shunt to the PM.

Here's where it gets weird.... I attach a MM to the terminals to see if the PM is accurate.

Voltage is accurate.
Current, on the other hand: with the current dial maxed out, the PM shows 0.91 A. BUT between 0 and 0.91 A, the PM shows about twice the current that the external MM shows. AND when I switch the MM to measure volts, it reads the same thing as the PM.

Now I understand that when taking current (or any other) measurements of a circuit, you're adding to the circuit. So maybe I'm adding a couple of shunts to the circuit, and that's why I'm getting such different readings?

Or, is it a design issue?

Any input would be appreciated.
P.S. For the record, I think I'm misunderstanding something really basic here and it's not getting through the concrete here (taps head).

Thank you!
 

bertus

Joined Apr 5, 2008
22,270
Hello,

The shown meter is a voltmeter; it has a range of 200 mV.
The 1 mA stated is the power-supply current.

The current through the meter is almost nil, as the input resistance is 1 MOhm.
You can calculate a resistor for your current.
A 1 Ω resistor will give you a range of 200 mV / 1 Ω = 200 mA.
A 0.1 Ω resistor will give you a range of 200 mV / 0.1 Ω = 2000 mA = 2 A.
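That calculation is just Ohm's law rearranged; here is a small sketch of it (the 200 mV full scale is taken from the datasheet, the candidate shunt values are illustrative):

```python
# Sizing a current shunt for a 200 mV full-scale panel voltmeter:
# the shunt converts the measured current into a voltage the meter reads.
V_FULL_SCALE = 0.200  # meter full-scale voltage, V

def full_scale_current(r_shunt):
    """Full-scale current (A) for a given shunt resistance (ohms)."""
    return V_FULL_SCALE / r_shunt

for r in (1.0, 0.2, 0.1):
    print(f"{r} ohm shunt -> {full_scale_current(r)} A full scale")
# 1.0 ohm -> 0.2 A, 0.2 ohm -> 1.0 A, 0.1 ohm -> 2.0 A
```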

Bertus
 

Thread Starter

atldave

Joined Jul 26, 2010
13
Ah, I see. I was mistaken in treating it as a galvanometer, and I didn't know what the current rating meant.

So a 0.2Ω resistor for a 1A range.

I'm trying to figure out why it was showing "0.91" when it was maxed out, when it should have shown "0.20". The "0.91" was the voltage going to the posts, as confirmed by the MM.
 

bertus

Joined Apr 5, 2008
22,270
Hello,

When you take a 0.1 Ω resistor, a reading of 200 mV corresponds to 2 A of current.
When you take a 0.2 Ω resistor, a reading of 200 mV corresponds to 1 A of current (the value on the display is twice as high).
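A small sketch of why the reading doubles: the meter simply displays the voltage across the shunt, so doubling the shunt resistance doubles the displayed value at the same actual current (the current values here are illustrative):

```python
# The panel meter only sees V = I * R_shunt, so the displayed value
# scales directly with the shunt resistance at a given current.
def shunt_voltage(current_a, r_shunt_ohms):
    """Voltage across the shunt (what the 200 mV meter displays)."""
    return current_a * r_shunt_ohms

i = 0.5  # actual current, A (illustrative)
v_01 = shunt_voltage(i, 0.1)  # 0.050 V with a 0.1 ohm shunt
v_02 = shunt_voltage(i, 0.2)  # 0.100 V with a 0.2 ohm shunt
print(v_01, v_02)  # the 0.2 ohm shunt reads twice as high
```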

Bertus
 