Semiconductor heating

Thread Starter

wayneh

Joined Sep 9, 2010
17,498
Simple question: As a CPU gets hotter during use, does the efficiency go up or down? By efficiency I mean the computations done per heat produced.

If efficiency goes down with increasing temperature, it seems like it would overheat from the positive feedback: a higher temperature causes more heat to be produced, raising the temperature still higher.

If efficiency goes up with temperature, then the ideal operating point would be as hot as practical. You'd have less heat to remove overall, and the larger temperature difference would make it much easier to move heat away from the smoking CPU.
 

crutschow

Joined Mar 14, 2008
34,459
I believe CMOS circuits (which I believe all modern CPUs are) generally get less efficient as the temperature goes up (MOSFET ON-resistance has a positive temperature coefficient).
From that, a CPU could overheat if not properly cooled, but they typically have a fan-cooled heat sink to keep the temperature below the critical point.

On my HP desktop PC I notice that the fan speed increases when the computer is running computationally intensive applications; apparently the cores are clocked up for such calculations, which dissipates more power.
 

DickCappels

Joined Aug 21, 2008
10,186
And it would follow that the maximum clock rate would be reduced at high temperatures. Or, put another way, you could clock it faster when it's cooler. The amount of power dissipated is a function of capacitance, voltage and frequency, which are only an indirect function of temperature, as mentioned above.
 

Thread Starter

wayneh

Joined Sep 9, 2010
17,498
So at a given clock rate, running a constant task, do we agree that the amount of heat being generated actually increases with temperature? That means the cooling system has to be very aggressive to counteract the positive feedback of the "heater" (the CPU).
 

cmartinez

Joined Jan 17, 2007
8,257
So at a given clock rate, running a constant task, do we agree that the amount of heat being generated actually increases with temperature? That means the cooling system has to be very aggressive to counteract the positive feedback of the "heater" (the CPU).
I'm under the impression that such is the case. It's called runaway heating, and it grows exponentially if it's not aggressively dealt with. I've seen that phenomenon in hydraulic systems, but it wouldn't surprise me if it applied to electronic circuits too.
 

DickCappels

Joined Aug 21, 2008
10,186
If the energy per clock cycle is C·V², and power dissipation (and therefore heat generated) is equal to clock frequency times the energy per clock cycle, I don't see how changing the temperature can have an effect on the heat being generated.

What mechanism are you considering as a possible cause for heat generation to increase?
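
To put rough numbers on that relation, here's a quick sketch in Python (the capacitance, voltage, frequency and activity factor below are invented example values, not figures for any real CPU):

def dynamic_power(c_switched, v_dd, f_clk, alpha=0.2):
    # P_dyn ~= alpha * C * V^2 * f  (dynamic switching power)
    # alpha is the activity factor: the fraction of the capacitance switched per cycle
    return alpha * c_switched * v_dd**2 * f_clk

# Example: 50 nF effective switched capacitance, 1.1 V core, 3 GHz clock
print(dynamic_power(50e-9, 1.1, 3e9))  # roughly 36 W; temperature appears nowhere in the formula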
 

Thread Starter

wayneh

Joined Sep 9, 2010
17,498
I was wondering about the temperature coefficient in semiconductors. A CPU is a boatload of transistors. If they all have their resistance go up or down in response to a temperature change, this should affect heat formation. Hold all else equal (clock, CPU load, etc.).
 

cmartinez

Joined Jan 17, 2007
8,257
If the energy per clock cycle is C·V², and power dissipation (and therefore heat generated) is equal to clock frequency times the energy per clock cycle, I don't see how changing the temperature can have an effect on the heat being generated.

What mechanism are you considering as a possible cause for heat generation to increase?
Well, among other things, I guess it has something to do with the fact that, in a chip, the electronic pathways are distributed throughout its volume, while its outer surface is its only means of dissipation.
 

crutschow

Joined Mar 14, 2008
34,459
As nsaspook noted, leakage currents are a large factor in power dissipation, and those go up with temperature.
The other large contributor is the charging and discharging of the various parasitic capacitances at the high clock rate, and that likely doesn't change much with temperature.
The change in ON-resistance with temperature actually has little effect on the total power dissipation, only on the maximum switching speed.
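
As a toy illustration of those two contributions (every number is invented, and "leakage doubles every 20 C" is only an assumption for the sketch, not a datasheet figure):

P_DYNAMIC = 30.0        # W, switching power at a fixed clock and voltage (temperature-independent here)
P_LEAK_25C = 5.0        # W, leakage power at 25 C
LEAK_DOUBLING_C = 20.0  # assumed degrees C per doubling of leakage

def total_power(t_junction_c):
    leakage = P_LEAK_25C * 2 ** ((t_junction_c - 25.0) / LEAK_DOUBLING_C)
    return P_DYNAMIC + leakage

for t in (25, 45, 65, 85):
    print(t, "C:", round(total_power(t), 1), "W")   # 35.0, 40.0, 50.0, 70.0 W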
 

Thread Starter

wayneh

Joined Sep 9, 2010
17,498
Yes, we can agree that dissipation increases with temperature.
I'm not sure if you're joking about the level of agreement here. I mean, dissipation obviously increases, and we can apparently all agree on that since it is self-evident. :p

If you were not making a joke, are you weighing in on the side of heat production (joules per second) increasing with temperature?
 

DickCappels

Joined Aug 21, 2008
10,186
In post #6 I wondered what would make power dissipation increase with temperature. In post #9 nsaspook reminded us about leakage current, and since we know that leakage current increases with temperature, power dissipation will increase with rising temperature if other factors are held constant. Crutschow nicely summarizes the factors contributing to power dissipation in post #11. Being in agreement, I withdraw my skepticism.
 

AnalogKid

Joined Aug 1, 2013
11,055
Hold all else equal (clock, CPU load, etc.).
That is the answer to your question in post #1. There is no "positive feedback" mechanism between the CPU die temperature and the rate at which it processes instructions. The clock frequency sets the CPU "speed", nothing else. Some systems measure the die temperature and alter the clock frequency to reduce heat, but that is an externally applied control loop and can have a simple or complex transfer function.

In a general sense, yes, computations done per heat produced is a very real concern in systems designed for low power, or high reliability, or rugged environment applications. This is why all modern microprocessors have extensive thermal models available.
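
For the sake of illustration, a bare-bones version of such an external control loop might look like the Python below. The thresholds, the clock steps and the one-step-at-a-time policy are all invented; real throttling schemes are far more elaborate.

T_THROTTLE_C = 90.0    # step the clock down above this die temperature
T_RESUME_C = 80.0      # step it back up below this
CLOCK_STEPS_GHZ = [3.5, 3.0, 2.5, 2.0]

def next_step(step, die_temp_c):
    """Return the new index into CLOCK_STEPS_GHZ given a temperature reading."""
    if die_temp_c > T_THROTTLE_C and step < len(CLOCK_STEPS_GHZ) - 1:
        return step + 1            # too hot: slow the clock down
    if die_temp_c < T_RESUME_C and step > 0:
        return step - 1            # comfortably cool: speed back up
    return step                    # in the dead band: leave the clock alone

The point is only that the feedback lives in this supervisor, not in the silicon itself.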

ak
 

Thread Starter

wayneh

Joined Sep 9, 2010
17,498
In a general sense, yes, computations done per heat produced is a very real concern in systems designed for low power, or high reliability, or rugged environment applications. This is why all modern microprocessors have extensive thermal models available.
I can't tease out which side you're on. Does more heat get produced when a CPU is operated at a higher temperature: yes, no change, or does it go down? Again, all else equal.
 

MrSoftware

Joined Oct 29, 2013
2,202
To boil it down: if leakage current increases with temperature in a CMOS transistor, and if leakage current is the major cause of heat in a CMOS-based CPU, then with everything else kept equal the rate of heat production in a CMOS-based CPU will increase as its temperature increases.
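
A quick numerical sketch of that feedback loop: heat production depends on temperature (through leakage), and temperature depends on heat production (through the thermal resistance of the cooling path). Every number below is made up for illustration.

P_DYNAMIC = 30.0        # W, temperature-independent switching power
P_LEAK_25C = 5.0        # W, leakage at 25 C
LEAK_DOUBLING_C = 20.0  # assumed degrees C per doubling of leakage
T_AMBIENT_C = 25.0
R_THERMAL = 0.4         # C/W, die-to-ambient with a decent heat sink (assumed)

def power(t_c):
    return P_DYNAMIC + P_LEAK_25C * 2 ** ((t_c - 25.0) / LEAK_DOUBLING_C)

t = T_AMBIENT_C
for _ in range(50):                       # let temperature and power settle
    t = T_AMBIENT_C + R_THERMAL * power(t)
print(round(t, 1))   # settles near 40 C; try R_THERMAL = 5.0 and it never
                     # converges, which is the runaway case

Whether the loop settles or runs away depends entirely on how good the cooling path is.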
 

nsaspook

Joined Aug 27, 2009
13,308
Here's a very interesting article, relevant to what's being discussed in this thread:

https://www.newscientist.com/articl...achine-learning-boom-means-we-need-new-chips/
I keep hearing that silicon is a dead end. Eventually something will beat silicon-based computing, but almost every one of these 'new' discoveries has been rolled back into silicon-based manufacturing and extended its lifetime for another 20 years.

https://cosmosmagazine.com/technology/why-silicon-computers-rule
 

cmartinez

Joined Jan 17, 2007
8,257
I keep hearing that silicon is a dead end. Eventually something will beat silicon-based computing, but almost every one of these 'new' discoveries has been rolled back into silicon-based manufacturing and extended its lifetime for another 20 years.

https://cosmosmagazine.com/technology/why-silicon-computers-rule
I'm guessing that the industry is trying to avoid switching to other materials, since the cost of developing new equipment and processes for the new materials, and of replacing the old technology, would be substantial.
 