I'd like to design a meter with the following requirements:
Microcontroller: Atmel ATmega / similar
Object to test: expected to be between 0.03 Ω and 4 Ω
Interface to object: Proprietary connector with v+ and ground
Output: OLED display
Budget: $350 in components and boards / unlimited labor (mine)
I have been asked to build a very accurate and precise ohmmeter capable of properly measuring resistances down to 0.03 Ω. I understand how challenging this is, and if it's not possible within my budget, I'll go back and decline the request!
Based on my understanding of all of this, I'm going to need:
A high quality constant current driver capable of driving a very specific current
A high quality reference voltage for the ADC
A high-quality ADC dedicated to measuring the voltage drop across the subject for the voltmeter portion of the measurement (I was thinking of the LTC2400: http://www.linear.com/product/LTC2400)
As good a reference resistor as I can afford for the current measurement. This will let me calibrate the device against the actual output of the constant-current driver.
What I need is a basic block diagram of how I'd set this up so that I'm not making a dumb mistake. It SEEMS to me that I'd have a microcontroller connected to the LTC2400 over SPI. It would first check for an open circuit / high resistance in case the subject isn't plugged in. If a subject is plugged in, it would switch on the CC driver, then measure and save the current for later arithmetic. Then it would measure the voltage across the subject, do the math (R = V/I), and display the result.
This is very likely far harder (at least in terms of getting a precise and accurate result) than I'm imagining, and I'm more than happy to get answers explaining why. I suspect I'm missing a second ADC for measuring the voltage across the subject (in addition to the one measuring the current).
I sincerely appreciate any help anyone can give, and thank you in advance for making it this far!