I have an Omega LVDT (model LD620-50) that I want to calibrate for use in lab tests. I'm using it with a process controller (model CNiS16D22-C4EI), and I have four gauge blocks (1.3 mm, 1.9 mm, 7.5 mm, and 18 mm) that I want to use for the calibration. Although the LVDT's range is +/- 5 cm, I only need a 2 cm range for my application.
I tried calibrating the LVDT through the Input/Reading section of the controller's menu, with Load enabled so that the calibration is done live. My understanding is that I displace the LVDT probe by each specified length, which induces a voltage the controller reads (the Input), and then I set the Reading to the corresponding measurement in millimeters.
I used 5 linearization points (my four gauge-block lengths plus 0) and set them up on the controller as described above. However, once I start measuring in run mode, the readings are wildly off, by as much as 9%, even when I simply place the same gauge blocks used for the calibration under the probe.
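For reference, here is a minimal sketch of what that 5-point linearization should be computing: piecewise-linear interpolation between the calibration pairs. The voltage values below are purely illustrative assumptions, not readings from my controller:

```python
import numpy as np

# Hypothetical calibration pairs: controller Input (V) vs. Reading (mm).
# The voltages are made up for illustration; substitute the values your
# controller actually captured at each gauge-block position.
inputs_v    = [0.00, 0.26, 0.38, 1.50, 3.60]   # LVDT output, volts (ascending)
readings_mm = [0.0,  1.3,  1.9,  7.5,  18.0]   # gauge-block heights, mm

def to_mm(voltage):
    """Piecewise-linear interpolation between calibration points,
    which is what multi-point linearization on a controller does."""
    return float(np.interp(voltage, inputs_v, readings_mm))

# A voltage halfway between two calibration inputs should land
# proportionally halfway between the corresponding readings:
# to_mm(0.32) -> 1.6 (midpoint of 1.3 mm and 1.9 mm)
print(to_mm(0.32))
```

If the controller is doing this correctly, re-measuring a calibration block should reproduce its Reading almost exactly, so a 9% error at the calibration points themselves suggests something else is off (e.g. the captured Input voltages, excitation settling, or the scaling mode).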
Am I doing this the right way? It seems very strange that right after setting up the input/reading points, I get numbers that are so wrong.