How is 11/12 bit precision useful on the DS18B20 temperature sensor?

Discussion in 'General Electronics Chat' started by Roger at CCCC, Dec 24, 2015.

  1. Roger at CCCC

    Thread Starter Member

    Jun 8, 2009
    How is 11/12 bit precision useful on the DS18B20 temperature sensor?
    I'm trying to understand the meaning of the specifications for accuracy and precision for the DS18B20 temperature sensor. The DS18B20 datasheet says that the DS18B20:

    "is accurate to ±0.5°C over the range of -10°C to +85°C.....The resolution of the temperature sensor is user-configurable to 9, 10, 11, or 12 bits, corresponding to increments of 0.5°C, 0.25°C, 0.125°C, and 0.0625°C, respectively."

    But if the DS18B20 is only accurate to ±0.5 degrees, why would any application ever use 11- or 12-bit precision? For example, if the DS18B20 returned a reading of 20 degrees, the actual temperature could be anywhere from 19.5 to 20.5. Why would any application care whether 11- or 12-bit precision returns a reading of 20.125 or 20.0625, when this difference is much smaller than the accuracy of the reading?
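
    To put concrete numbers on those increments: per the datasheet, the DS18B20 reports temperature as a signed 16-bit register value with an LSB of 1/16 °C, and at the lower resolutions the lowest bits are undefined. A quick sketch of how a single raw reading truncates at each resolution (the example register value is just illustrative):

    ```python
    def raw_to_celsius(raw, bits=12):
        """Convert a DS18B20 scratchpad temperature register to degrees C.

        The register is a signed 16-bit value with an LSB of 1/16 degC.
        At 9/10/11-bit resolution the lowest 3/2/1 bits are undefined,
        so they are masked off before scaling.
        """
        if raw & 0x8000:               # sign-extend negative temperatures
            raw -= 1 << 16
        undefined = 12 - bits          # number of low bits to discard
        raw &= ~((1 << undefined) - 1)
        return raw * 0.0625

    # The same reading truncated to each resolution
    # (0x0197 corresponds to 25.4375 degC at full 12-bit resolution):
    for bits in (9, 10, 11, 12):
        print(bits, raw_to_celsius(0x0197, bits))
    ```

    So the extra bits only refine the last fraction of a degree of a single reading, which is exactly why they look pointless next to a ±0.5 °C accuracy spec.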

    Thanks for any response or explanation
  2. OBW0549

    Well-Known Member

    Mar 2, 2015
    Because there are plenty of applications in which absolute accuracy is of only secondary concern next to precision and the ability to discern small changes in the measured quantity, whether temperature or anything else.
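
    A small sketch of that point: a constant offset error (within the ±0.5 °C spec) cancels out when you look at *changes* in temperature, so the fine 12-bit steps still carry real information. The offset value and temperatures here are made up for illustration:

    ```python
    # Simulate a sensor that is off by a constant +0.4 degC (within the
    # +/-0.5 degC accuracy spec) but resolves 0.0625 degC steps.

    OFFSET = 0.4          # hypothetical constant calibration error
    LSB = 0.0625          # 12-bit resolution step

    def read(true_temp):
        """Quantize the (offset) true temperature to the sensor's LSB."""
        return round((true_temp + OFFSET) / LSB) * LSB

    t0, t1 = 20.00, 20.19          # true temperatures: a 0.19 degC change
    delta = read(t1) - read(t0)    # measured change
    # Each absolute reading is ~0.4 degC wrong, yet the measured change
    # agrees with the true 0.19 degC change to within one LSB.
    ```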

    Also, many applications do not rely on raw sensor accuracy to achieve their ultimate performance requirements; rather, they start with a "fairly" accurate sensor, one which is stable and repeatable, and then enhance the accuracy of the overall system through calibration against some sort of standard (such as a standard-grade Platinum resistance bulb, for temperature) to obtain correction factors for offset and scale factor.
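
    That calibration step can be sketched as a simple two-point correction: record what the sensor reads at two known reference temperatures, then solve for a scale factor and offset. The reference and measured values below are hypothetical:

    ```python
    # Two-point calibration against a reference thermometer.
    # ref_* come from a trusted standard; meas_* are what the DS18B20
    # reported under the same conditions (values are illustrative).

    ref_lo, meas_lo = 0.00, -0.31      # e.g. ice bath
    ref_hi, meas_hi = 85.00, 84.55     # e.g. stirred hot bath

    scale = (ref_hi - ref_lo) / (meas_hi - meas_lo)
    offset = ref_lo - scale * meas_lo

    def corrected(reading):
        """Apply the stored scale/offset correction to a raw reading."""
        return scale * reading + offset

    # After correction, the two calibration points map back to the reference:
    assert abs(corrected(meas_lo) - ref_lo) < 1e-9
    assert abs(corrected(meas_hi) - ref_hi) < 1e-9
    ```

    With a correction like this stored per-sensor, the fine 12-bit resolution is what limits the corrected reading, not the factory ±0.5 °C spec — provided the sensor is stable and repeatable.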
    Last edited: Dec 24, 2015
  3. Picbuster

    Active Member

    Dec 2, 2013
    You've hit on a common question.

    A part's stated accuracy mostly tells you about production spread. Assume the accuracy is 1% and the nominal value is 100: a part from the factory could have an actual value anywhere from 99 to 101.

    Repeatability is the important part: an individual unit might repeat to within 0.0001% each time, within certain limits. From the factory, however, it could deviate by that 1% — and that is where calibration comes in.

    Temperature and humidity are among the hardest quantities to measure well: you measure at the probe, but what is the value one meter away?