I'm trying to read the resistance of a sensor that varies from 0 to 500 kΩ. I supply the sensor with 1 V and read the current that flows through it using a sense resistor and an op-amp to amplify the signal (the circuit is in the attachment). I've simulated it and it works perfectly. For example, with a 430 kΩ sensor and a 10 kΩ sense resistor, the op-amp amplifies the voltage across the sense resistor by 20: the voltage across the sense resistor is around 30 mV and the op-amp output is 600 mV.
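For reference, this is the back-of-envelope math I'm working from. It's only a sketch: it assumes the sensor and sense resistor form a plain series divider across the 1 V supply (my reading of my own schematic) and that the amplifier draws no input current at all, so treat the numbers as indicative:

```python
V_SUPPLY = 1.0   # V, sensor excitation (assumed ideal)
R_SENSE  = 10e3  # ohms, sense resistor
GAIN     = 20.0  # fixed gain of the amplifier

def sensor_resistance(v_out: float) -> float:
    """Back out the sensor resistance from the amplifier output,
    assuming an ideal series divider and zero amplifier loading."""
    v_sense = v_out / GAIN           # voltage across R_SENSE
    i = v_sense / R_SENSE            # series current through the divider
    return (V_SUPPLY - v_sense) / i  # the sensor drops the rest of the supply

v_out_measured = 0.600  # V, example reading from the amplifier output
print(f"R_sensor ≈ {sensor_resistance(v_out_measured) / 1e3:.1f} kΩ")
```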
The amplifier is an AD8418 (a fixed-gain current-sense amplifier): it amplifies the signal by 20, has an offset voltage of 100 µV, and its supply voltage range is 2.7 V to 5.5 V.
But after testing it on a breadboard, the voltage across the sense resistor behaves strangely once I connect the amplifier. Without the amplifier, the sense-resistor voltage is a clean 30 mV; as soon as I connect it to the amplifier input, it becomes 45 mV instead of 30 mV, and the output voltage is 900 mV.
The question/problem: why does the voltage across the sense resistor change when the amplifier is connected? And when the sensor resistance decreases, the error decreases somewhat as well.
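To put a number on the shift, here's a rough KCL estimate of how much extra current the amplifier input would have to source into the sense node to lift it from 30 mV to 45 mV. It again assumes the simple series-divider topology; it doesn't explain *why* the input does this, it just sizes the effect:

```python
V_SHIFT  = 0.045 - 0.030  # V, rise in sense-node voltage with the amp connected
R_SENSOR = 430e3          # ohms, sensor value in this example
R_SENSE  = 10e3           # ohms, sense resistor

# Seen from the sense node, the Thevenin impedance is R_SENSE in parallel
# with R_SENSOR, so lifting the node by V_SHIFT needs an injected current of:
r_node = 1.0 / (1.0 / R_SENSE + 1.0 / R_SENSOR)
i_extra = V_SHIFT / r_node
print(f"node impedance ≈ {r_node / 1e3:.2f} kΩ, "
      f"injected current ≈ {i_extra * 1e6:.2f} µA")
```

This prints roughly 9.8 kΩ and 1.5 µA. If the amplifier input sources a roughly constant current into the node, that would also fit the second observation: a smaller sensor resistance lowers the node impedance, so the same injected current produces a smaller voltage error.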
Attachment: circuit schematic.