impedance problem

Discussion in 'General Electronics Chat' started by ingram010, Oct 31, 2012.

  1. ingram010

    Thread Starter New Member

    May 2, 2007
    6
    0
Hi, I need to read the voltage from a power supply. The datasheet for the power supply states that any digital measuring device needs an input impedance of at least 10 MΩ; unfortunately, the digital measuring device I have states that its maximum input impedance is 1 MΩ. What options do I have? Is there a circuit that can increase the input impedance of my measuring device?

    Regards

    John
     
  2. t06afre

    AAC Fanatic!

    May 11, 2009
    5,939
    1,222
    What kind of power supply is this? I think you can go ahead and try your current digital measuring device, provided the measuring range fits the power supply output.
     
  3. ingram010

    Thread Starter New Member

    May 2, 2007
    6
    0
    Hi Thanks for your reply

    Here is the extract from the datasheet; it is a 1 kV Glassman MK series.

    Voltage monitor. J1-4
    A 0-10 V signal, positive with respect to common and in direct proportion to the output voltage, is available at this pin. A 10 kΩ limiting impedance protects the internal circuitry, so a digital voltmeter with greater than 10 megohms input impedance should be used to monitor this output. It is also acceptable to use a 1 mA DC full-scale instrument (i.e. an analog meter) for monitoring purposes.

    I am not using a multimeter to monitor the voltage; I am using a LabJack U3-HV.

    I have a very tight budget so I am stuck with what I have already got.

    Regards

    John
     
  4. t06afre

    AAC Fanatic!

    May 11, 2009
    5,939
    1,222
    Yes, I see now. In your case it will be a question of accuracy. You can calculate the error with the voltage divider formula here:
    http://en.wikipedia.org/wiki/Voltage_divider
    And since all the factors are known in this case, you can also calculate the actual voltage from the measured voltage. Is your supply a 1 kV supply? Then the monitor voltage you read is (Vout/100).
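    The loading error is easy to put numbers on. A minimal Python sketch, assuming the datasheet's 10 kΩ limiting impedance and a 1 MΩ meter input (both stated earlier in this thread):

    ```python
    # The monitor's 10 kOhm limiting impedance and the meter's 1 MOhm input
    # impedance form a voltage divider:
    #   V_measured = V_monitor * R_in / (R_source + R_in)
    R_source = 10e3   # 10 kOhm limiting impedance (from the datasheet)
    R_in = 1e6        # 1 MOhm meter input impedance

    ratio = R_in / (R_source + R_in)
    error_pct = (1 - ratio) * 100
    print(f"divider ratio: {ratio:.4f}")       # ~0.9901
    print(f"loading error: {error_pct:.2f} %") # ~0.99 %
    ```

    So reading the monitor with a 1 MΩ input costs roughly 1 % of accuracy, which is known and constant, so it can be corrected out.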
     
    Last edited: Oct 31, 2012
  5. ingram010

    Thread Starter New Member

    May 2, 2007
    6
    0
    Thanks for your help

    I was concerned that I might damage the LabJack, but I see now it's only the accuracy that will be affected.

    Kindest regards

    John
     
  6. JMac3108

    Active Member

    Aug 16, 2010
    349
    66
    You could use an op-amp buffer between the monitor output of the supply and your meter; the buffer presents a very high input impedance to the supply while driving your meter from a low impedance.
     
  7. crutschow

    Expert

    Mar 14, 2008
    13,016
    3,235
    To measure 1 kV directly you could make a 10:1 voltage divider consisting of a 9 MΩ resistor in series with the 1 MΩ input resistance of your meter. Make sure the 9 MΩ resistor is rated for 1 kV.
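    A quick sanity check of those divider values (Python sketch; the 9 MΩ series resistor and 1 MΩ meter input are the values from the post above):

    ```python
    R_series = 9e6  # 9 MOhm series resistor (must be rated for 1 kV)
    R_meter = 1e6   # the meter's 1 MOhm input resistance is the bottom leg

    V_supply = 1000.0
    V_meter = V_supply * R_meter / (R_series + R_meter)
    print(f"{V_meter:.1f} V at the meter")  # 100.0 V for 1 kV at the supply

    # The divider draws 0.1 mA, so the series resistor dissipates ~0.09 W
    I = V_supply / (R_series + R_meter)
    print(f"{I * I * R_series:.2f} W in the series resistor")
    ```

    Note that whether 100 V is a usable level depends on the meter's input range; scale the divider accordingly.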
     
  8. t06afre

    AAC Fanatic!

    May 11, 2009
    5,939
    1,222
    I had to Google LabJack. It is a USB A/D unit (among other things). Remember, as I said, you don't have a single unknown factor here, so by doing some calculations you will not lose any accuracy. It should be quite easy, since I guess you are using some programming language to process the data from the LabJack.
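    For example, the loading can be undone in software once the reading is in hand. A hedged Python sketch: the `correct_reading` helper is hypothetical, the raw reading is just an example value, and the LabJack's 1 MΩ input impedance is assumed from earlier in the thread:

    ```python
    R_source = 10e3  # the monitor's 10 kOhm limiting impedance (datasheet)
    R_in = 1e6       # assumed 1 MOhm LabJack analog input impedance

    def correct_reading(v_measured):
        """Undo the divider loading to recover the true monitor voltage."""
        return v_measured * (R_source + R_in) / R_in

    v_raw = 9.901                        # example raw reading, in volts
    v_monitor = correct_reading(v_raw)   # ~10.0 V after correction
    v_supply = v_monitor * 100           # 0-10 V monitor maps to 0-1000 V out
    print(f"supply voltage: {v_supply:.1f} V")
    ```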
     