Problem with load cell calculation

Thread Starter

ed-dub

Joined Mar 13, 2016
5
Hi all,

I've spent many hours reading posts across the forums on this site and I have learned a great deal, but I find myself stuck trying to understand a load cell calculation I have inherited. I am in the process of rewriting software that obtains readings from a load cell via a microcontroller in real time and displays the result in a graph. While I could just take what is in the code and move it to the rewrite, I was hoping to understand why it is written the way it is.

What I have so far ...

The load cell specifications are here: http://discountloadcells.com/doc/spec_sheets/s_type/hbm_rsc.pdf. Note that mine is the 500 lb version rather than kg.

The calculation in code is:
LC Factor = ((LC Rating) /( LC Calibration * LC Sensitivity * Excitation V)) * Zero Offset

Scale Reading = (LC Rating * V measured) / LC Factor

where:
LC Rating = 500 lbs (from the load cell markings/spec sheet)
LC Calibration = 2 <-- don't understand where this came from
LC Sensitivity = 2 (mV/V, from the spec sheet)
Excitation V = 5 V (from the microcontroller, spec sheet, and digi-meter)
Zero Offset = 0.004882 V (appears to be the measured voltage for the load cell unloaded; it's roughly what the digi-meter shows, at least)

Given the above, the load cell factor is 0.12207. At a voltage reading of 0.015, the code outputs 61.44 lbs.
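To make the comparison concrete, here is a quick Python sketch of the inherited formula next to the plain Y = MX + B form I keep seeing recommended (the variable and function names are mine, not the actual code's, and the constants are the ones listed above):

```python
# Constants quoted from the inherited code / spec sheet.
LC_RATING = 500.0       # lbs, from the load cell markings
LC_CALIBRATION = 2.0    # origin unknown -- the value in question
LC_SENSITIVITY = 2.0    # mV/V, from the spec sheet
EXCITATION_V = 5.0      # V
ZERO_OFFSET = 0.004882  # V, unloaded output per the digi-meter

def inherited_reading(v_measured):
    """The formula as found in the inherited code."""
    lc_factor = (LC_RATING / (LC_CALIBRATION * LC_SENSITIVITY * EXCITATION_V)) * ZERO_OFFSET
    return (LC_RATING * v_measured) / lc_factor

def textbook_reading(v_measured):
    """The Y = MX + B form: subtract the offset, scale by full-scale output."""
    full_scale_v = (LC_SENSITIVITY / 1000.0) * EXCITATION_V  # 2 mV/V * 5 V = 10 mV
    return LC_RATING * (v_measured - ZERO_OFFSET) / full_scale_v

print(inherited_reading(0.015))  # ~61.45 lbs, matching the code's 61.44
print(textbook_reading(0.015))   # ~505.9 lbs
```

The two formulas disagree by nearly a factor of ten at the same input voltage, which is part of what I can't reconcile (unless there is gain somewhere in the signal chain that I'm not accounting for).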

Given everything I have read, the calculation does not seem correct. I especially do not understand where the Load Cell Calibration value originates or why there is a load cell factor involved at all. Everything I have read suggests that the calculation to derive lbs should be as simple as (LC Rating * V measured) / (LC Sensitivity * Excitation V), with the offset subtracted (i.e. Y = MX + B). Could someone help me:

1) Understand why this calculation is written the way it is (i.e. the Load Cell Factor)
2) Understand where the LC Calibration value comes from and what it represents

Again, I could take it at face value and move on, but it's driving me nuts trying to understand it and I just can't let it go.

Thanks in advance.

-Ed
 

Thread Starter

ed-dub

Joined Mar 13, 2016
5
Max,

Thanks for taking the time to reply and doing it so quickly. After skimming through the link you provided, I have to ask, did you post the right one? I'm not finding a reference to the MCP3421 and/or code.

-Ed
 

Thread Starter

ed-dub

Joined Mar 13, 2016
5
@MaxHeadRoom thanks again for the reply and for sharing the code. I am really only looking to understand the calculation I shared. Working my way through the sample application in the attachment, I am still unable to find a clear explanation of the formula, or even confirmation that it is correct. I guess I will have to take it at face value unless you or someone else has the time to clarify.

-Ed
 

WBahn

Joined Mar 31, 2012
30,058
One way to see that the formula for LC Factor (where did IT come from? I didn't see it in the data sheet) is highly suspect is to ask what the LC Factor would have been had the cell's output been 0 V when unloaded.
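Concretely (a quick sketch; the function name is mine, and the constants are the ones quoted in the first post):

```python
# Why the LC Factor formula is suspect: with a perfectly zeroed cell
# (0 V output when unloaded), the factor collapses to zero, and the
# scale reading -- (rating * v_measured) / factor -- becomes a
# division by zero for any measured voltage.

def lc_factor(rating, calibration, sensitivity, excitation, zero_offset):
    return (rating / (calibration * sensitivity * excitation)) * zero_offset

print(lc_factor(500, 2, 2, 5, 0.004882))  # ~0.12205 with the quoted offset
print(lc_factor(500, 2, 2, 5, 0.0))       # 0.0 -- the scale reading would blow up
```

A sensible calibration formula should degrade gracefully as the zero offset approaches zero, not diverge.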
 

Thread Starter

ed-dub

Joined Mar 13, 2016
5
I would tend to agree, based on my reading. However, the zero offset in the factor is a hard-coded constant in the code. I have no idea where the calibration value is coming from, but at least it is an input setting with a default value of 2. I'm hesitant to just rewrite the formula without a good understanding: a different formula will likely produce different scale readings, which I will need to explain.

-Ed
 