Designing an amplification circuit (amplifying small sensor signals)

Thread Starter

Henry603

Joined Nov 19, 2018
69
@Keep
The bias return path can be asymmetrical due to thermocouple break detection or sensor break detection.

It gives the input bias current (Ib) somewhere to flow. You need these paths with transistors and FETs too.
1.
Ok, so you would recommend one 10 MOhm resistor to ground for every input?

2.
If so, do you see issues with 10 MOhm for these low-impedance sensors (10-200 Ohm) and the instrumentation amplifier I use (input impedance 100 MOhm)?
Would you go with 10 MOhm or suggest another value?

Thank you very much. :)
 
Sources generally are low Z.

You're effectively paralleling 100 ohms with 10 M. Generally, that doesn't introduce errors. If you were paralleling 100 ohms with 100 ohms, you would get SIGNIFICANT errors. It's not 10 M either, because the inputs are differential; it's probably more like 20 M.
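If it helps to see the arithmetic, here's a quick Python sketch of the loading above. The 100-ohm sensor impedance is an assumed example value from the thread's 10-200 ohm range:

```python
# Rough numbers for the situation above: a low-impedance sensor
# (assume 100 ohms) with a 10 M bias-return resistor on each input.
def parallel(r1, r2):
    """Resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

r_sensor = 100.0        # ohms, assumed sensor impedance
r_bias = 10e6           # ohms, one bias-return resistor per input

# The 10 M barely loads the 100-ohm source:
r_eff = parallel(r_sensor, r_bias)
error_pct = (r_sensor - r_eff) / r_sensor * 100
print(f"effective source R: {r_eff:.4f} ohms ({error_pct:.5f}% loading error)")

# Differentially, the two 10 M resistors appear in series, i.e. ~20 M
# between the inputs:
print(f"differential bias-path R: {2 * r_bias / 1e6:.0f} M")
```

The loading error works out to about 0.001%, which is why 10 M is comfortable here.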

A dangling high input Z would be prone to extraneous pick-up.
 

Thread Starter

Henry603

Joined Nov 19, 2018
69
Great explanation, thank you.
And as my instrumentation amplifier's input impedance is 100 MOhm, that is not an issue either, right?

It would only be a problem if my amplifier's input impedance were equal to or lower than my 10 MOhm, did I get that right?
 
It's not all that easy, but for you it is. When you measure something, your measuring instrument has to have a much higher resistance than what you're measuring. If you measure a 100 ohm thing with a voltage across it using a meter that looks like 100 ohms, the voltage is going to be half of what it's supposed to be.

10 M or 10 M || 100 M is much, much bigger than 100 ohms.
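The divider math behind that, as a small Python sketch:

```python
# Fraction of the true voltage you actually read when a meter of
# resistance r_meter loads a source of resistance r_source.
# It's just a voltage divider: V_measured / V_true = r_meter / (r_source + r_meter).
def measured_fraction(r_source, r_meter):
    return r_meter / (r_source + r_meter)

# 100-ohm source read by a meter that looks like 100 ohms: half the voltage.
print(measured_fraction(100, 100))

# 100-ohm source read by 10 M (or 10 M || 100 M): the error is negligible.
print(measured_fraction(100, 10e6))
```

The second case reads 99.999% of the true voltage, which is why the 10 M bias resistors don't hurt.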

==

It doesn't always work. Why? Some tube tester calibrations were done with voltmeters with 50K ohms/volt. If you used a 10 M meter, your calibration would be off.

Measuring resistances like 1e12 ohms presents challenges too. This is usually done with a current meter and a voltage source, not a voltmeter.
The voltmeter would affect the measurement. I did some measurements of this type using a voltage source, a coulometer, and time. At low currents, all sorts of effects need to be lessened: leakage, triboelectric effects, vibration. You need guards and grounds. Cable shields are coated with graphite. Triax cables (dual shields) are used.

With low-value resistances, you need a 4-wire (Kelvin) connection because we measure the voltage across the device and the current through the device separately. The voltmeter probe spacing sets the measured length. The same current flows through the device, and it has to be injected outside of the voltage pick-up points.
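A quick Python sketch of why the 4-wire connection matters (the DUT and lead resistances are assumed example values):

```python
# Why 4-wire (Kelvin) sensing matters for low resistances: with a 2-wire
# measurement, the test-lead resistance adds directly to the reading.
r_dut = 0.010      # ohms, assumed device under test (10 mOhm)
r_lead = 0.050     # ohms per lead, assumed test-lead resistance

r_2wire = r_dut + 2 * r_lead   # both leads sit in the measurement path
r_4wire = r_dut                # voltage sensed right at the device; an
                               # ideal voltmeter draws no current through
                               # its sense leads, so they contribute nothing

print(f"2-wire reading: {r_2wire * 1000:.1f} mOhm")
print(f"4-wire reading: {r_4wire * 1000:.1f} mOhm")
```

With these numbers the 2-wire reading is eleven times the true value, entirely from the leads.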

I'll leave out professional audio and RF.

I will cover audio amplifiers. They are specified for a particular resistive load. A spec called "damping factor", which for a good amp is 100 or so, basically states that the output Z is 100x less than the rated 8 ohms.
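In numbers, that relationship is just a ratio:

```python
# Damping factor = rated load impedance / amplifier output impedance,
# so the output impedance falls out by division.
def output_impedance(z_load, damping_factor):
    return z_load / damping_factor

# A damping factor of 100 into the rated 8 ohms implies an output
# impedance of 0.08 ohms.
print(output_impedance(8.0, 100))
```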

We may start off with a multimeter, which has a 10 M input impedance for voltage. I started out with one of those 20K ohms/V things and then a FET TVM. The standard was a VTVM. The current range in a multimeter is measured as the drop across a resistor. Oops! Later, you might get introduced to the feedback ammeter, sometimes known as a ZRA or Zero Resistance Ammeter. They drop < 1 mV of voltage, nearly independent of range.
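The ohms/volt figure and the shunt drop are both one-line calculations. A sketch in Python, with the ranges and shunt value picked as illustrative assumptions:

```python
# An analog meter's input resistance depends on the selected range:
# R_in = (ohms per volt) * full-scale voltage of the range.
def analog_meter_r_in(ohms_per_volt, range_v):
    return ohms_per_volt * range_v

print(analog_meter_r_in(20e3, 10))   # a 20K ohms/V meter on its 10 V range
print(analog_meter_r_in(50e3, 10))   # the 50K ohms/V tube-tester case

# A multimeter's current range drops voltage across a shunt resistor
# (the "burden voltage"); a feedback ammeter avoids most of this drop.
def burden_voltage(i_measured, r_shunt):
    return i_measured * r_shunt

print(burden_voltage(1e-3, 100))     # 1 mA through an assumed 100-ohm shunt
```

So a 20K ohms/V meter only looks like 200 kOhm on the 10 V range, far below a 10 M DMM, which is exactly why swapping meter types throws off an old calibration.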

Non-contact sensing, like Hall effect, is another way of measuring current.
 

Thread Starter

Henry603

Joined Nov 19, 2018
69
Ok, so now I know that I'm safe with the 10 MOhm resistors to ground in front of my instrumentation amplifier.

That was a very nice explanation, thank you for your time! :)
 