Need help configuring LM3914 for temperature gauge

Thread Starter

zap250

Joined May 4, 2014
6
I'd like to use an LM3914 with a 10-segment tri-color LED bargraph to display the temperature of a motor that has a built-in temperature sensor. The sensor is a KTY84/130 (http://evmc2.files.wordpress.com/2013/02/kty84_ser_6.pdf). The motor must not exceed 140°C, so I want the first LED to come on at 40°C, the next at 50°C, and so on until the last LED comes on at 130°C. I'll be using it in bar mode, so all LEDs should be lit at 130°C. The chart below shows the resistance values in this range:


So I've picked the following sample circuit from the LM3914 datasheet, which should work well and has the nice feature of flashing the final LED - but of course I need it to work with the thermal sensor input rather than a 0-5V input.


Also I will be using it with a 12V battery. Since the sample circuit is for 5V, I'll probably just add a 7805 regulator to power the circuit.

Finally, I want to add a relay to run a cooling fan when LED #6 turns on. Ron H posted a simple solution on this forum that I plan to use to make that work:


So my main roadblock right now is this: how do I configure the LM3914 to read the resistance of the temperature sensor I'm using and map it properly onto the 10-LED bar?
 

MrChips

Joined Oct 2, 2009
30,795
The LM3914 can operate fine at 12V.

If you connect a 2kΩ resistor from +12V to the temperature sensor, with the other end of the sensor going to ground, you will get a roughly linear voltage output from 3.0V to 4.5V over your temperature range of 40°C to 130°C.

Adjust the RLO and RHI inputs to cover the 3-4.5V range.
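
If you want to sanity-check the numbers, here is a quick Python sketch of the divider math (the 672Ω/1194Ω endpoints are assumed KTY84/130 values for 40°C and 130°C - check them against your chart):

```python
# Quick check of the divider output.  The 672 ohm (40 C) and 1194 ohm (130 C)
# endpoints are assumed KTY84/130 values - verify them against the chart.
V_SUPPLY = 12.0   # volts
R_TOP = 2000.0    # series resistor from +12 V down to the sensor

def sensor_voltage(r_sensor):
    """Voltage across the sensor in the simple divider."""
    return V_SUPPLY * r_sensor / (R_TOP + r_sensor)

for temp_c, r in [(40, 672.0), (130, 1194.0)]:
    print(f"{temp_c:>3} C ({r:.0f} ohm) -> {sensor_voltage(r):.2f} V")
# Prints roughly 3.02 V at 40 C and 4.49 V at 130 C.
```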
 

Thread Starter

zap250

Joined May 4, 2014
6
Thanks for the helpful responses.

@eetech00: Thanks, much appreciated! The circuit you provided looks like it would be precise and stable, but I'm trying to keep this as simple as possible with few parts, as I'm not very confident soldering a bunch of components or understanding what they all do...

@MrChips: This sounds like a simple solution, which I like. With the resistor divider, the current through the temperature sensor would be about 4 mA, which will cause some self-heating and skew the results slightly, but that's OK - it should still be close enough for my purpose.

I figure I could set the 3.0V lower limit by connecting pin 4 to a voltage divider (i.e. V+ - 3kΩ - Rlo - 1kΩ - GND). But how can I set the upper limit to 4.5V?
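
To show my arithmetic, here's a quick Python sketch of the divider math (the resistor pair I picked for 4.5V is just a hypothetical guess on my part, and I'm ignoring any loading from the Rlo/Rhi pins):

```python
# Divider arithmetic for the Rlo (pin 4) and Rhi (pin 6) reference inputs.
# The 4.5 V resistor pair is only a hypothetical example, and any current
# drawn by the Rlo/Rhi pins themselves is ignored here.
V_SUPPLY = 12.0

def divider(v_in, r_top, r_bottom):
    """Unloaded output of a two-resistor divider."""
    return v_in * r_bottom / (r_top + r_bottom)

print("Rlo:", divider(V_SUPPLY, 3000, 1000), "V")   # 3.0 V lower limit
print("Rhi:", divider(V_SUPPLY, 5000, 3000), "V")   # 4.5 V upper limit (example values)
```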
 

Thread Starter

zap250

Joined May 4, 2014
6
I thought I had it figured out, but the actual circuit didn't work as planned.

@eetech00:

Can you explain your circuit? I understand the X2 and X1, and the various resistors and capacitors. But what is U3? It looks like a relay, but if so then what's its purpose in the circuit? I can understand V3 is the 5V supply used for the LEDs, and V4 is the main 12V supply used for everything else. So, what is V1? And what about the two variable resistors - how should I set them? Thanks! :)
 
Last edited:

Thread Starter

zap250

Joined May 4, 2014
6
(Sorry, I meant to post this as a new topic, but for some reason it's just not working right - mods, could you kindly move it to a new topic?)

I got an analog meter that shows temperature from 40°C to 120°C. It's designed for use in a car and runs on 12V, and it comes with a temperature sensor that has a negative temperature coefficient. However, I have a different temperature sensor already installed that I really need it to work with, and that one unfortunately has a positive temperature coefficient.

But there is a simpler way of looking at this problem: when the temperature signal input is connected to a voltage source, the meter stays at its lowest reading of 40°C at zero volts and shows its highest reading of 120°C at +12V. An in-between voltage of course gives a corresponding in-between reading. The meter itself can be powered from the same 12V source.

So, I figure that if I can connect my own temperature sensor to an op-amp that will convert the sensor's resistance to a voltage ranging from approximately 0 volts to approximately 12 volts, and feed the op-amp's output to the meter, that ought to work well enough. It should be scaled such that the op-amp would output close to +0 volts at 40°C (672Ω) and gradually increase to approach +12V at 120°C (1127Ω). Could someone help design a simple op-amp circuit that would do the trick? It doesn't have to be very accurate, and it should be simple. The temperature sensor I'm using is a KTY84/130 with the following specifications:

 

eetech00

Joined Jun 8, 2013
3,942
I thought I had it figured out, but the actual circuit didn't work as planned.

@eetech00:

Can you explain your circuit? I understand the X2 and X1, and the various resistors and capacitors. But what is U3? It looks like a relay, but if so then what's its purpose in the circuit? I can understand V3 is the 5V supply used for the LEDs, and V4 is the main 12V supply used for everything else. So, what is V1? And what about the two variable resistors - how should I set them? Thanks! :)
Hi

U3 is an SPDT toggle switch that toggles the LM3914 between Bar and Dot mode. I need to fix that symbol, it's kinda crummy...:D

V1 is for simulation purposes only and is used to set the SPDT to BAR mode or DOT mode. Use an actual SPDT toggle switch in your real circuit.

Adjustments-
Set this first:
P1 sets the current through the sensor. Adjust this pot while measuring the current flow through the sensor; set it as close as you can to 2 milliamps.

Set this second:
The LM3914 input signal range is from 3 volts (first LED, 40°C) to 4.5 volts (last LED, 130°C). P2 adjusts this input signal level.
I don't know what you intend to use as a temperature source for calibration, but you would adjust P2 at either the highest or lowest temperature being measured, while checking that the corresponding LED illuminates. You may have to readjust over time as the components stabilize.

Another way to adjust P1 and P2 is by using a 10-turn pot in place of the sensor for initial setup. Set the test pot to the middle value (882Ω), then adjust P1 to 2mA. Reset the test pot to the highest value (1194Ω), then adjust P2 until LED10 comes on. After that, check that each LED lights within the appropriate sensor value range. It doesn't have to be exact, but it should be close to the middle of each sensor value range.

You should only have to set P1 once. You may need to "Tune" P2 until the LEDs are tracking the test pot (sensor) values correctly.
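
For reference, here's a rough Python sketch of the raw voltage the sensor (or test pot) presents at those three calibration resistances (taking roughly 672Ω as the 40°C low end, per the sensor chart), assuming the current stays close to the 2mA set with P1; the P2 stage then has to place this onto the LM3914's 3-4.5V window, and the exact mapping depends on the circuit:

```python
# Raw voltage across the sensor (or the test pot) at the three calibration
# resistances, assuming the current stays close to the 2 mA set with P1.
# The P2 stage then has to shift/scale this onto the LM3914's 3.0-4.5 V window.
I_SENSE = 0.002  # amps, set with P1

for label, r_ohm in [("lowest (40 C)", 672), ("middle", 882), ("highest (130 C)", 1194)]:
    print(f"{label:<16} {r_ohm:>4} ohm -> {I_SENSE * r_ohm:.2f} V across the sensor")
```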

Hope that helps...

eT :cool:
 

crutschow

Joined Mar 14, 2008
34,412
Below is a simulation of a simple bar-graph circuit which should do what you want (except for the relay, which can be added as you show).

R1 can be changed to a pot (connected as a rheostat) to allow calibration of the trigger span if needed (use a 500Ω pot in series with a 1.5kΩ resistor).

The circuit is ratiometric (bridge circuit) so its accuracy is basically independent of the power supply voltage. The thermistor circuit and LM3914 should be powered from the same voltage, but the LEDs could be powered from a different supply. Note that a higher voltage for the LEDs can lead to significant dissipation in the LM3914 when all LEDs are lit, depending upon the voltage and the LED current value as determined by R3. If necessary, a resistor can be added in series with the LEDs to reduce the LM3914 dissipation. The Dot mode will dissipate less power, of course, since only one LED is lit at a time.

Edit: R2 is the thermistor, of course.
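
To illustrate the ratiometric idea with rough numbers, here's a quick Python sketch (the 2kΩ upper leg and the 672Ω/1194Ω sensor endpoints are assumptions for illustration, not the exact schematic values):

```python
# Why the circuit is ratiometric: both the thermistor divider and the Rlo/Rhi
# divider run from the same supply, so the comparison depends only on
# resistance ratios.  The 2 kohm upper leg and the 672/1194 ohm sensor
# endpoints are assumptions for illustration, not the exact schematic values.
R_TOP = 2000.0

def signal_fraction(r_therm):
    """Thermistor-leg output as a fraction of the supply voltage."""
    return r_therm / (R_TOP + r_therm)

for v_supply in (10.0, 12.0, 14.0):   # e.g. a battery drifting around
    v40, v130 = v_supply * signal_fraction(672), v_supply * signal_fraction(1194)
    print(f"Vsupply = {v_supply:4.1f} V: signal {v40:.2f}..{v130:.2f} V "
          f"(fractions {signal_fraction(672):.3f}..{signal_fraction(1194):.3f} stay fixed)")
# The absolute voltages move with the supply, but the Rlo/Rhi taps move with
# it too, so the LED trip points in terms of sensor resistance don't change.
```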

Bar Graph.gif
 
Last edited:

crutschow

Joined Mar 14, 2008
34,412
...................
I got an analog meter that shows temperature from 40°C to 120°C. It's designed for use in a car and runs on 12V, and it comes with a temperature sensor that has a negative temperature coefficient. However, I have a different temperature sensor already installed that I really need it to work with, and that one unfortunately has a positive temperature coefficient.

But there is a simpler way of looking at this problem: when the temperature signal input is connected to a voltage source, the meter stays at its lowest reading of 40°C at zero volts and shows its highest reading of 120°C at +12V. An in-between voltage of course gives a corresponding in-between reading. The meter itself can be powered from the same 12V source.
.........................
Here's a simulation of a circuit that should do what you want. It uses a bridge circuit to convert the thermistor voltage to a 0V to 12V signal. The op amp can be just about any rail-to-rail op amp with an 18V rating or better.

Edit: The op amp can only drive a load of about 2kΩ or greater. If the meter resistance is less than that, then you will need to add an emitter follower buffer amp at the output (inside the feedback loop).
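
As a rough sanity check of the scaling (assumed values, not the schematic's), here's a quick Python sketch - the bridge output only swings a bit over a volt across the 672-1127Ω range, so a gain of roughly 9 is needed to approach the full 0-12V swing:

```python
# Rough scaling check (assumed values, not the schematic's): sensor in a
# divider with a 2 kohm upper leg from 12 V, reference leg fixed at the
# 40 C (672 ohm) point so the difference is 0 V at 40 C, then amplified
# toward 12 V at 120 C (1127 ohm).
V_SUPPLY = 12.0
R_TOP = 2000.0

def leg(r):
    return V_SUPPLY * r / (R_TOP + r)

v_ref = leg(672)            # reference leg sits at the 40 C voltage
delta = leg(1127) - v_ref   # differential bridge output at 120 C
gain = V_SUPPLY / delta     # gain needed to approach 12 V at 120 C
print(f"bridge swing ~{delta:.2f} V, so a gain of about {gain:.0f} is needed")
# A rail-to-rail op amp only gets close to the 12 V rail, which is fine since
# the meter just needs to approach full scale at 120 C.
```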

Thermistor Bridge.gif
 
Last edited:

Thread Starter

zap250

Joined May 4, 2014
6
@crutschow:

I see you're using a basic voltage divider for the Rlo and Rhi inputs to the LM3914 - I didn't know you could do that, as the example schematics always show Rhi tied to the Ref output, where changing the LED current changes the voltage setpoints as well. That's what made it so confusing to me - using a separate voltage divider definitely makes it simpler and more logical in my mind - thanks!

I was wondering one thing - the voltage divider formed by R1 and R2 would result in a signal voltage ranging from 1.257V to 1.869V at pin 5, but the voltage divider connected to Rlo and Rhi appears to set Rlo at 1.246V and Rhi at 1.867V. Is this to compensate for the influence of the LED current on the input accuracy? Or some other reason? I'm curious about that.
 

eetech00

Joined Jun 8, 2013
3,942
The circuit is ratiometric (bridge circuit) so its accuracy is basically independent of the power supply voltage. ...
I'm wondering about the sensor linearity using a voltage source instead of a current source(?). The data sheet recommends 2mA for best linearity.

Your thoughts please? Thx...

If necessary, a resistor can be added in series with the LEDs to reduce the LM3914 dissipation. ...
Why not just change the value of R3?:rolleyes:

Also,

Wondering about some of the resistor values. Probably need 1% for most of these, although I don't think there is a 626 ohm resistor.

Also tried the circuit in a simulation. The first LED wouldn't light until Sensor = 722 ohms.

eT:)
 
Last edited:

crutschow

Joined Mar 14, 2008
34,412
I see you're using a basic voltage divider for the Rlo and Rhi inputs to the LM3914 - I didn't know you could do that, as the example schematics always show Rhi tied to the Ref output, where changing the LED current changes the voltage setpoints as well. That's what made it so confusing to me - using a separate voltage divider definitely makes it simpler and more logical in my mind - thanks!
The examples are only examples. The voltage setpoints and the LED current can be set independently. If you look at the internal schematic for the LM3914, the connections of the circuit I used will make more sense.
I was wondering one thing - the voltage divider formed by R1 and R2 would result in a signal voltage ranging from 1.257V to 1.869V at pin 5, but the voltage divider connected to Rlo and Rhi appears to set Rlo at 1.246V and Rhi at 1.867V. Is this to compensate for the influence of the LED current on the input accuracy? Or some other reason? I'm curious about that.
The signal voltage goes from 1.193V to 1.869V for the simulated temperature change of 30°C to 130°C. Rlo and Rhi are set to 1.248V and 1.863V, to light the first LED at 40°C and the last at 130°C.
 
Last edited:

crutschow

Joined Mar 14, 2008
34,412
I'm wondering about the sensor linearity using a voltage source instead of a current source(?). The data sheet recommends 2mA for best linearity.
Yes, you need a constant current for best linearity. But for the OP's limited temperature range and accuracy requirements, a voltage source is sufficient. If you look at the output voltage V(out) plot over the OP's thermistor resistance range, you can see that the voltage change appears quite linear with resistance. The rule is: don't make the circuit more complex than the requirements dictate, aka KISS.
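
As a rough numeric check of that claim, here's a quick Python sketch comparing a 2kΩ divider (as suggested earlier in the thread) against a true constant-current drive at the 672/882/1194Ω sensor points quoted in this thread - assumed values, not the exact simulated circuit - which puts the mid-scale error at well under one LED:

```python
# Rough check of how much the voltage-divider drive distorts the scale
# compared with a true 2 mA constant-current drive, using a 2 kohm / 12 V
# divider and the 672 / 882 / 1194 ohm sensor points quoted in this thread
# (assumed values, not the exact simulated circuit).
R_TOP, V_SUPPLY = 2000.0, 12.0
R_LO, R_MID, R_HI = 672.0, 882.0, 1194.0

def vdiv(r):
    return V_SUPPLY * r / (R_TOP + r)

# Fractional position of the mid point within the full span:
frac_divider = (vdiv(R_MID) - vdiv(R_LO)) / (vdiv(R_HI) - vdiv(R_LO))
frac_constI  = (R_MID - R_LO) / (R_HI - R_LO)   # constant current => V proportional to R

error_leds = abs(frac_divider - frac_constI) * 10   # 10 LEDs across the span
print(f"divider: {frac_divider:.3f}, constant current: {frac_constI:.3f}, "
      f"difference ~{error_leds:.1f} LED")
# Comes out to a bit under half an LED at mid-scale, i.e. a few degrees C.
```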

crutschow: If necessary, a resistor can be added in series with the LEDs to reduce the LM3914 dissipation. ...
Why not just change the value of R3?
Because that changes the LED current. For a given desired LED current the resistor in series will reduce the dissipation.
Wondering about some of the resistor values. Probably need 1% for most of these, although I don't think there is a 626 ohm resistor.
Yes, they should be 1%. The 626Ω resistor should be 619Ω.
Also tried the circuit in a simulation. The first LED wouldn't light until Sensor = 722 ohms.
In my simulation the first LED reaches full current at 672Ω, as you can see, so I don't understand the discrepancy. :confused:
 
Last edited:

eetech00

Joined Jun 8, 2013
3,942
Hi crutschow

Thanks for your reply.

Yes, you need a constant current for best linearity. But for the OP's limited temperature range and accuracy requirements, a voltage source is sufficient. If you look at the output voltage V(out) plot over the OP's thermistor resistance range, you can see that the voltage change appears quite linear with resistance.
I didn't see anything from the OP regarding voltage regulation. I guess I'm skeptical that the sim really reflects the sensor linearity under actual operating conditions, but OK.

The rule is: don't make the circuit more complex than the requirements dictate, aka KISS.
Good rule. I try to follow it too...:)

Because that changes the LED current. For a given desired LED current the resistor in series will reduce the dissipation.
I don't understand, isn't reducing the LED current the idea here?:rolleyes:

eT
 

crutschow

Joined Mar 14, 2008
34,412
I didn't see anything from the OP regarding voltage regulation.
Since it uses a bridge circuit, the design is basically insensitive to the actual supply voltage, as I previously noted.
I guess I'm skeptical that the sim really reflects the sensor linearity under actual operating conditions, but OK.
Skepticism of a simulation is always good. But the sim shows the V(out) for a simple two-resistor voltage divider, which is unlikely to be in error. I feel confident it is a valid reflection of the actual sensor operation (assuming that the resistance values taken from the thermistor specification sheet are correct).
I don't understand, isn't reducing the LED current the idea here?
No. The LM3914 outputs are constant current (that's why no resistors are required in series with the LEDs). The added resistor would be just to reduce the voltage drop across the LM3914 LED constant-current drivers and transfer some of the dissipation from the LM3914 to the resistor.
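
To put rough numbers on it, here's a quick Python sketch (the LED current, forward drop and supply below are just assumptions for illustration, and I'm assuming one resistor per LED; a single resistor in the common LED supply line works similarly with a proportionally smaller value):

```python
# Rough dissipation estimate for the LM3914 in bar mode (illustrative
# assumptions: 12 V LED supply, ~2 V LED forward drop, 10 mA per output,
# one series resistor per LED).  The outputs are constant-current sinks, so
# whatever voltage is not dropped across the LED (or an external series
# resistor) is dropped inside the chip.
V_LED_SUPPLY = 12.0
V_FORWARD = 2.0       # assumed LED forward drop
I_LED = 0.010         # amps per output, as set by R3
N_LEDS = 10           # all lit in bar mode

def chip_dissipation(r_series):
    v_across_driver = V_LED_SUPPLY - V_FORWARD - I_LED * r_series
    return N_LEDS * I_LED * v_across_driver   # ignores quiescent current

for r in (0, 330, 560):
    print(f"series R = {r:>3} ohm -> ~{chip_dissipation(r):.2f} W in the LM3914")
# With no resistor about 1 W ends up in the chip; a few hundred ohms per LED
# moves a good chunk of that into the resistors without changing the current.
```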
 
Last edited:

eetech00

Joined Jun 8, 2013
3,942
Hi again..:)

No. The LM3914 outputs are constant current (that's why no resistors are required in series with the LEDs). The added resistor would be just to reduce the voltage drop across the LM3914 LED constant-current drivers and transfer some of the dissipation from the LM3914 to the resistor.
Yes, but the same thing can be accomplished by changing the value of R3.
It is true that the outputs are constant current, but the max current delivered to each and ALL outputs is still regulated by the value of R3 connected to the RefOut pin. :confused:

eT
 

crutschow

Joined Mar 14, 2008
34,412
.....................
Yes, but the same thing can be accomplished by changing the value of R3.
It is true that the outputs are constant current, but the max current delivered to each and ALL outputs is still regulated by the value of R3 connected to the RefOut pin. :confused:
No, the same thing cannot be accomplished. R3 controls the nominal constant current (what do you mean by "max current"?) of each LED driver, and I don't want to change that current. I just want to reduce the voltage drop, and thus the power dissipated, across the constant-current drivers (which does not appreciably change the current). Do you understand how a constant-current driver works? :confused:
 

eetech00

Joined Jun 8, 2013
3,942
No, the same thing cannot be accomplished. R3 controls the nominal constant current (what do you mean by "max current"?) of each LED driver, and I don't want to change that current. I just want to reduce the voltage drop, and thus the power dissipated, across the constant-current drivers (which does not appreciably change the current). Do you understand how a constant-current driver works? :confused:
Yes. I understand how they work....just didn't understand your intent.:rolleyes:

eT
 