Interface Pressure Sensor to dsPIC33 MCU

Discussion in 'Embedded Systems and Microcontrollers' started by dchapman, Apr 29, 2011.

  1. dchapman

    Thread Starter New Member

    Apr 27, 2011
    2
    0
    Hello!

    I am designing a water system monitor that among other things will interface to a pressure sensor. I plan to use a dsPIC33 and a Honeywell MLH series compensated sensor that outputs 1V-6V (regulated) over a 0 to 100psi range.

    As this is my first foray into ADC interfacing I have spent quite some time going through previous posts here and have learned a lot. Thanks to all those who have invested their time in answering questions on this forum! I am posting this to solicit review comments for what I think is a workable solution based on what I have learned. I also have some questions about how to determine required component values.

The dsPIC will be powered from Vcc = 3.3 V and I will use AVdd = 3.3 V as the ADC reference. The dsPIC specs recommend a maximum analog source impedance of 2.5 kΩ for the 12-bit ADC and 1 kΩ for the 10-bit ADC. As I only need pressure measured to ±1 psi resolution, either ADC will work fine.

    The sensor specs (here http://search.digikey.com/scripts/DkSearch/dksus.dll?Detail&name=480-2573-ND ) seem to indicate that the sensor can source 1mA of current with a sensor output impedance of 25ohms.

    My proposed interface: 15 V unregulated DC powers the sensor. The sensor output is scaled by 50% to the range 0.5 V to 3 V and fed to the ADC pin. The scaling is done with a resistive divider: two identical 5 kΩ resistors in series from the sensor output to ground (1% metal film to keep noise low), with the ADC input taken from the junction where the two resistors meet, i.e.

    |------------------------15V DC unregulated
    v
    Sensor
    |
    v
    R1
    |
    | -----> ADC input
    |
    R2
    |
    Ground

    R1=R2=5Kohms

    The sensor will need to supply at most 6 V / (5 kΩ + 5 kΩ) = 0.6 mA, which is less than the 1 mA available. If I have understood properly, the output impedance of the scaled sensor (i.e. the impedance the ADC pin sees) is the Thevenin equivalent resistance of the circuit - just the two resistors in parallel, so 2.5 kΩ. That works for the 12-bit ADC but is too high for the 10-bit ADC.

    I therefore also plan to place a 1 µF capacitor to ground on the ADC input pin to provide noise reduction and to lower the input impedance (i.e. provide a ready source of current when the ADC samples the line). This should allow both the 10- and 12-bit ADCs to work. The capacitor charge time (assuming an ADC sample event significantly discharges the cap) is 5 × RC = 5 × 5000 Ω × 1 µF = 25 ms (or should I use the Thevenin equivalent resistance of 2500 Ω here?). So as long as I don't sample more often than once every 25 ms, the cap should not negatively affect the measurement.

    The pressure changes relatively slowly - I will be sending readings to the data logger at most every 10 s. I therefore plan to skip the complications of a FIR filter and use a very simple software filtering scheme: sample the input every 25 ms for 1 s (i.e. take 40 samples) and then average them.

    I'm wondering if I could use a smaller capacitor of, say, 0.1 µF, and increase the resistor values to 10 kΩ each? This would reduce the current required from the sensor to 0.3 mA, and 5RC is now 5 ms, which would allow me to average more samples per measurement. I am unable to figure out from the dsPIC33 specs how much charge is drawn when it takes a sample and whether 0.1 µF is large enough.


    Thank you for taking the time to look at this and comment. Other suggestions for how to approach this would be welcome - although by using the compensated sensor I was hoping to avoid needing an op-amp for the interface.
     
  2. retched

    AAC Fanatic!

    Dec 5, 2009
    5,201
    312
  3. GetDeviceInfo

    Senior Member

    Jun 7, 2009
    1,571
    230
    Either scenario should perform well. What may impact your design is the physical location of the sensor (environmental noise, stress, temperature, etc.), its leads (AC noise), and your circuit layout. Are you employing input isolation?
     
  4. dchapman

    Thread Starter New Member

    Apr 27, 2011
    2
    0
    No, I wasn't thinking about input isolation. In this initial use of the system, it would be installed in a pump house (which is conditioned space). The MCU - sensor wire length is 3 meters due to physical considerations. There are 120/240V wires around.

    However, a neighbour is interested in installing it as well so I would like a reasonably robust solution that doesn't depend on a known pristine environment.

    What kind of isolation do you suggest? Optical? or is the use of an instrumentation amp as suggested in the previous reply sufficient?
     
  5. retched

    AAC Fanatic!

    Dec 5, 2009
    5,201
    312
    Sensitive readings, temperature changes, and resistive voltage dividers do not play nicely together.

    Temperature change CAN shift the resistor values, invalidating any previous calibration.
     
  6. GetDeviceInfo

    Senior Member

    Jun 7, 2009
    1,571
    230
    It doesn't sound like isolation is required, but you may want to consider suppression. My approach would be to build the circuit and evaluate it in real-world operation. Different installations may require different solutions, but you can cross that bridge when you get there.
     