Measuring temperature rise and making assumptions

Thread Starter

MikeA

Joined Jan 20, 2013
364
This is more of a materials question than an electronics one, but say you have an enclosed device that, fully on and stabilized, warms up inside from a 70F ambient to 100F. Does it mean that at 0F ambient, it will warm up to 30F inside? And if it's 300F ambient it will warm up to 330F?

My common sense says yes, that is theoretically correct, but are there external factors I'm not accounting for that would upset that linearity?
 

MrChips

Joined Oct 2, 2009
30,488
No. That is not even theoretically correct.

If you have a totally isolated system and you are inputting x watts of power, theoretically the temperature will keep on increasing.
The fact that a final temperature is attained means that energy is being lost. Hence it is not a fully closed system.

The rate at which energy is lost from a system depends on many factors, a primary factor being the ambient temperature.
Therefore, no, the rate at which energy is lost is not linear.
 

Papabravo

Joined Feb 24, 2006
21,003
As we know from our class(es) in thermodynamics, it depends on the nature of the enclosure. If we have an adiabatic enclosure, no heat transfer is possible from the system inside the enclosure to the surroundings, what we are calling the "ambient environment". In that case the temperature rise inside is due exclusively to the heat dissipated by the internal components, and since none of that heat can escape, the inside temperature keeps climbing for as long as power is applied.

If the enclosure is non-adiabatic, there will be some amount of heat flow from the warmer interior to the cooler surroundings. If, however, the surroundings can dissipate a large quantity of heat, then the temperature inside will settle at an equilibrium value for any finite level of power input.
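
For the non-adiabatic case, the usual back-of-envelope model is a single lumped thermal resistance between the inside and ambient. A minimal sketch of that idea, where the power and the resistance are illustrative assumptions chosen only so the rise matches the roughly 30F (about 17 K) of the original post:

Code:
# Minimal lumped sketch of the non-adiabatic case: one thermal "resistance"
# between the inside of the box and ambient, analogous to Ohm's law.
# Both numbers are illustrative assumptions, not measurements.

P_IN = 10.0      # assumed internal dissipation, watts
R_TH = 1.67      # assumed lumped thermal resistance, kelvin per watt
                 # (chosen so 10 W gives the ~17 K / ~30 F rise of the example)

def steady_state_rise(power_w, r_th=R_TH):
    """Delta-T over ambient once the box has stabilized, purely linear model."""
    return power_w * r_th

# In this linear model the rise is the same at every ambient temperature,
# which is exactly the assumption the later posts refine.
print(steady_state_rise(P_IN))   # ~16.7 K, i.e. about 30 F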
 

Thread Starter

MikeA

Joined Jan 20, 2013
364
The rate at which energy is lost from a system depends on many factors, a primary factor being the ambient temperature.
Therefore, no, the rate at which energy is lost is not linear.
That's what I was asking. Is the rate at which energy is lost the same at 0F as at 70F given everything else stays the same? In theory.

And what external factors might make theory deviate from practice?
 

MrChips

Joined Oct 2, 2009
30,488
For the temperature of the system to stabilize, energy must be transferred out of the system as heat.
The three principal methods of heat transfer are thermal conduction, convection, and radiation.
Heat transfer by thermal radiation goes as the 4th power of the absolute temperature (black body radiation), so the same temperature difference moves more heat by radiation when everything is hotter. Hence the temperature rise will be lower at higher temperatures.
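
To put a rough number on that, the net radiative flux between a grey surface and its surroundings is eps * sigma * (T_hot^4 - T_amb^4). A small sketch, where the emissivity is an assumed value, shows the same ~17 K rise radiating noticeably more heat at a higher ambient:

Code:
# Net radiated flux for a grey surface: eps * sigma * (T_hot^4 - T_amb^4).
# The emissivity below is an assumed value, not a measured one.

SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W / (m^2 K^4)
EPS = 0.9           # assumed emissivity of a painted metal box

def net_radiative_flux(t_hot_k, t_amb_k, eps=EPS):
    """Net radiated power per unit area from a surface to its surroundings."""
    return eps * SIGMA * (t_hot_k**4 - t_amb_k**4)

# The same ~17 K rise at two different ambients (roughly 70 F and 100 F):
print(net_radiative_flux(311, 294))   # ~96 W/m^2
print(net_radiative_flux(328, 311))   # ~113 W/m^2 -- more heat for the same delta-T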
 

Thread Starter

MikeA

Joined Jan 20, 2013
364
Hence the temperature rise will be lower at higher temperatures.
So say I measured 30F rise with 70F ambient. That means at 100F ambient the rise will be less than 30F?

Can I calculate what the rise will be at 100F with the measurements made at 70F ambient?
 

MrChips

Joined Oct 2, 2009
30,488
So say I measured 30F rise with 70F ambient. That means at 100F ambient the rise will be less than 30F?

Can I calculate what the rise will be at 100F with the measurements made at 70F ambient?
I don't think so because we don't know all the heat loss mechanisms.
 

Thread Starter

MikeA

Joined Jan 20, 2013
364
I don't think so because we don't know all the heat loss mechanisms.
All of them. Conduction, convection, and radiation. :cool:

Say this is a metal box, 10"x10"x10", sitting on a table in an infinitely large room. The box has electrical things inside that create 10W of heat. When it's 70F in the room, the inside of the box warms up to 100F.

If the room is at say 200F ambient, is there a way to estimate (perhaps not with extreme precision) what the temperature rise inside the box will be if everything is exactly the same except the ambient room temperature?
 

wayneh

Joined Sep 9, 2010
17,493
The three modes of heat transfer all depend on ∆T. Before computers, the approach to finding the solution was to guess at a steady-state ∆T, and then estimate the heat transfer by each mode using that assumed steady-state temperature. You'd get three wattage results, one for each mode, and then add them up. If you got lucky and the total heat transfer came out to 10W, you were done. Otherwise you'd adjust your temperature estimate and repeat until it all converged.

I guess the procedure is basically the same now, just a lot faster!
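
A minimal version of that iteration in Python, bisecting on ∆T until the losses match the 10W being generated. The surface area, convection coefficient and emissivity are assumed round numbers, not properties of the actual box:

Code:
# Rough re-creation of the hand iteration described above: guess a delta-T,
# add up convective and radiative losses from the box surface, and adjust
# until they equal the 10 W being generated.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W / (m^2 K^4)
AREA = 6 * 0.254**2    # six faces of a 10-inch cube, ~0.39 m^2
H_CONV = 5.0           # assumed free-convection coefficient, W / (m^2 K)
EPS = 0.9              # assumed emissivity
P_IN = 10.0            # internal dissipation, W

def heat_out(delta_t, t_amb_k):
    """Total heat leaving the outer surface at a given rise over ambient."""
    t_box = t_amb_k + delta_t
    convection = H_CONV * AREA * delta_t
    radiation = EPS * SIGMA * AREA * (t_box**4 - t_amb_k**4)
    return convection + radiation

def solve_rise(t_amb_k, lo=0.0, hi=100.0):
    """Bisect on delta-T until heat out matches heat in."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if heat_out(mid, t_amb_k) < P_IN:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for t_amb in (294.0, 311.0):   # roughly 70 F and 100 F ambient
    print(f"ambient {t_amb:.0f} K -> surface rise {solve_rise(t_amb):.1f} K")

# Note: with these assumed surface values the outer-surface rise comes out to
# only a few kelvin; the larger rise measured inside a real box would also
# include internal thermal resistance, which this sketch ignores.

Running it at the two ambients shows the rise shrinking slightly at the higher one, for the radiation reason discussed above.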

The difficulty in practice comes from not knowing the accurate heat transfer coefficients, and the emissivity for radiative transfer. You mentioned the box is sitting on a table. Well that matters, and a table made of wood is a lot different than one made of steel, or marble. So in practice these problems get solved with a bit of hand waving and SWAGing. If you need accuracy, there are many reference works that address heat loss due to conduction and convection from, for instance, a cube-shaped object. You rely on the hard work of those that have gone before.

If you really need an accurate result, you collect data and model it. Nothing beats data.
 

Thread Starter

MikeA

Joined Jan 20, 2013
364
The difficulty in practice comes from not knowing the accurate heat transfer coefficients, and the emissivity for radiative transfer.
I understand now that there are non-linearities at all temperatures, but what are we talking about for my example? A fraction of one percent? 10%?

Based on experience, how close to reality is this statement: if the temperature rise measured at 70F ambient was 30F, it will also be a 30F rise at 100F ambient. Approximately how far off can that statement be given the "unknowns"?
 

wayneh

Joined Sep 9, 2010
17,493
I understand now that there are non-linearities at all temperatures, but what are we talking about for my example? A fraction of one percent? 10%?

Based on experience, how close to reality is this statement: if the temperature rise measured at 70F ambient was 30F, it will also be a 30F rise at 100F ambient. Approximately how far off can that statement be given the "unknowns"?
Were it not for radiation, it would be fairly accurate to say that you need a particular ∆T in order to transfer a particular wattage. In other words, conduction and convection are pretty much linear with respect to ∆T.

Radiation from a warm object to its surroundings is a function of the difference of the fourth powers of the absolute temperatures. Your measurement was at 21°C ambient and 38°C, or 294 K and 311 K. So your radiation effect was (311^4 - 294^4) • k • e = 1.884x10^9 • k • e, where k is some constant and e is the emissivity, also constant within reason. At the higher ambient, 311 K and 328 K, that factor is (328^4 - 311^4) • k • e = 2.219x10^9 • k • e. That means the transfer of heat by radiation will have increased by about 18% by going to the higher temperature.

Heat transfer due to radiation is probably less than half the total; that's just a hunch. That means the total heat transfer for a given ∆T will increase by less than 10% as the ambient temperature goes up from 70F to 100F, meaning the actual ∆T may be a little less than 30F at the higher ambient.
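
A two-line check of those fourth-power factors, using the same rounded kelvin temperatures:

Code:
# 294, 311, 328 K are roughly 70, 100 and 130 F.
low = 311**4 - 294**4     # box ~30 F above a 70 F room
high = 328**4 - 311**4    # box ~30 F above a 100 F room
print(low, high)          # ~1.884e9 and ~2.219e9
print(high / low)         # ~1.18, i.e. ~18% more radiative transfer for the same delta-T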
 

Thread Starter

MikeA

Joined Jan 20, 2013
364
Thank you! That's exactly what I was looking for. It also sounds like with two empirical measurements at different ambient temperatures, if conduction and convection are almost linear, it would be easy to calculate radiation as a percentage of total energy loss.

But even without doing that exercise, in the context of seeing how hot a piece of equipment will get at various temperatures, doing a single measurement at a lower ambient temperature is good enough, since the temperature rise will always be lower at higher ambient temperatures.
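
For what it's worth, a sketch of that two-measurement fit: model the loss as a linear term plus a radiative term and solve for the two coefficients. The measurement values below are made-up placeholders, not real data:

Code:
# Two-measurement fit: P = C * dT + K * (T_box^4 - T_amb^4).
# Two steady-state measurements at different ambients give two equations
# in the two unknowns C (conduction + convection) and K (radiation).
# The numbers below are hypothetical example data.

measurements = [
    # (ambient K, box K, power W)
    (294.0, 311.0, 10.0),
    (311.0, 326.5, 10.0),
]

rows = [(tb - ta, tb**4 - ta**4, p) for ta, tb, p in measurements]
(a1, b1, p1), (a2, b2, p2) = rows

det = a1 * b2 - a2 * b1
C = (p1 * b2 - p2 * b1) / det    # W per kelvin
K = (a1 * p2 - a2 * p1) / det    # W per K^4

rad_share = K * b1 / p1          # fraction of the loss that is radiative at the lower ambient
print(f"C = {C:.3f} W/K, K = {K:.3e} W/K^4, radiative share ~ {rad_share:.0%}")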
 

wayneh

Joined Sep 9, 2010
17,493
Thank you! That's exactly what I was looking for. It also sounds like with two empirical measurements at different ambient temperatures, if conduction and convection are almost linear, it would be easy to calculate radiation as a percentage of total energy loss.

But even without doing that exercise, in the context of seeing how hot a piece of equipment will get at various temperatures, doing a single measurement at a lower ambient temperature is good enough, since the temperature rise will always be lower at higher ambient temperatures.
I think the only caveat to that is that the heat production of the device may not stay constant as the temperature rises.
 

hrs

Joined Jun 13, 2014
392
Does it mean that at 0F ambient, it will warm up to 30F inside? And if it's 300F ambient it will warm up to 330F?
Yes, pretty much. The influence of radiative heat transfer is grossly overstated. The radiated power per unit area for a theoretical black body is s * dT^4 where s is the Stefan-Boltzmann constant. 100F - 70F = 30F => dT ≈ 17 K. P = 5.67E-8 * 17^4 ≈ 5 mW/m^2. A practical non-black body will radiate less.

Typical free convection heat transfer coefficient values may range from 5 to 20 W/(m^2*K). Let's assume 5. h * dT = 85 W/m^2. I calculate that at a dT of 100K radiative heat transfer will be ~1% of total. At a dT of 200K it will be ~8%.

I don't expect convective heat transfer for a fixed dT to change much with moderate changes in absolute temperature.
 