Accelerometer sensitivity unit confusion #2

Thread Starter


Joined Feb 16, 2021
I am building my own dynamic balancing rig and trying to select appropriate accelerometers.
Sensitivity listings seem to be given in mV/g (millivolts per g), LSB/g (least significant bits per g), counts/g, or mg/digit.
The only one I understand is mV/g. I would appreciate help deciphering the others and how they relate to mV/g.
Thanks in advance.

I want to use an acceleration sensor that has this sensitivity:
  • Sensitivity (µg/digit): 3.9, 7.8, 15.6
but my acceleration sensor has a sensitivity of 0.061 mg/LSB.
Do you know what the difference is between digit and LSB?

Thanks in advance.
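For what it's worth, "digit" and "LSB" both mean one count of the digital output word, so mg/digit and mg/LSB are the same unit. A minimal sketch of using such a spec (the 16384-count reading is a made-up example value, not from any datasheet):

```python
# "digit" and "LSB" both refer to one count of the digital output, so
# 0.061 mg/digit and 0.061 mg/LSB mean the same thing.

def counts_to_g(raw_counts, sensitivity_mg_per_lsb):
    """Convert a signed raw reading (in counts/LSBs) to acceleration in g."""
    return raw_counts * sensitivity_mg_per_lsb / 1000.0  # mg -> g

# e.g. a hypothetical raw reading of 16384 counts at 0.061 mg/LSB:
print(counts_to_g(16384, 0.061))  # ~0.999 g, i.e. roughly 1 g
```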

Mod: Link to other thread.
Last edited by a moderator:


Joined Jan 15, 2015
Accelerometer Sensitivity:
The ratio of change in acceleration (input) to change in the output signal. This defines the ideal, straight-line relationship between acceleration and output (Figure 1, gray line). Sensitivity is specified at a particular supply voltage and is typically expressed in units of mV/g for analog-output accelerometers, and LSB/g or mg/LSB for digital-output accelerometers. It is usually specified as a range (min, typ, max) or as a typical figure plus a percentage deviation. For analog-output sensors, sensitivity is ratiometric to the supply voltage; doubling the supply, for example, doubles the sensitivity.

I suggest you read the link. Not mentioned are charge-output (piezoelectric) accelerometers, whose sensitivity is specified in picocoulombs per g (pC/g).

When choosing a sensor, consider the acceleration range you expect, and do not forget the frequency response.