Discussion in 'Homework Help' started by prejval2006, Aug 23, 2007.
What is the difference between resolution and sensitivity in an instrument?
This is a good question, one that I am not familiar with myself. Try this link:
My guess would be:
resolution is the least count of the scale, whereas sensitivity is the deflection that occurs per unit input value, which may not equal the resolution (it may be more or less than it).
I tend to think of sensitivity as more likely to be used in the context of an analog signal, as in the sensitivity of an opamp's output to variation in gain, temperature, or a particular component value. I tend to use the term resolution when referring to the result of converting an analog signal to a digital representation. For example, when referring to the output of an A-to-D converter, we state that it has a resolution of 10 bits. That typically means the converter can divide the analog signal applied to its input into 2^10 = 1024 unique 10-bit digital values.
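The bit-depth arithmetic above can be sketched in a few lines of Python. This is just an illustration of the 10-bit example; the 5 V full-scale range is an assumed figure, not something stated in the thread.

```python
# Sketch: how ADC bit depth translates into resolution.
# The 10-bit figure comes from the post above; the 5 V full-scale
# range is an assumption for illustration.

def adc_levels(bits):
    """Number of unique digital codes an n-bit converter can produce."""
    return 2 ** bits

def adc_step_volts(bits, full_scale_volts):
    """Smallest input change (one LSB) the converter can distinguish."""
    return full_scale_volts / adc_levels(bits)

print(adc_levels(10))           # 1024 unique codes
print(adc_step_volts(10, 5.0))  # one LSB is about 4.88 mV
```

So for this assumed setup, the converter cannot distinguish two input voltages closer together than roughly 4.88 mV, which is one way of stating its resolution in analog terms.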
I would suggest sensitivity is related to the minimum permitted input for a given output value - it is a fixed constraint, and as hgmjr says, it mostly belongs to the analogue world.
Resolution, on the other hand, is the current level of information or detail present in a system - not a fixed constraint, but rather a current evaluative expression of some system property.
This is purely my interpretation and may not be wholly accurate.
Here is something I learned in instrumentation.
Resolution is the smallest increment in input value that can be measured by the instrument. For example, a temperature variation of less than, say, 1 deg centigrade may not be detected by a given thermocouple, but anything above that variation might be; that threshold becomes its resolution.
Sensitivity, on the other hand, gives you the change in the output parameter with respect to a change in the input parameter. For example, if a temperature change of 1 deg is calculated to produce a voltage change of 5 mV in a thermocouple, then 5 mV per degree becomes its sensitivity.