Semiconductor heating

Discussion in 'General Electronics Chat' started by wayneh, Aug 23, 2016.

  1. wayneh

    Thread Starter Expert

    Sep 9, 2010
    12,154
    3,061
    Simple question: As a CPU gets hotter during use, does the efficiency go up or down? By efficiency I mean the computations done per heat produced.

    If efficiency goes down with increasing temperature, it seems like it would overheat from the positive feedback: high temperature causes more heat to be produced, raising the temperature still higher.

    If efficiency goes up with temperature, then ideal conditions would be as hot as practical. You'd need less heat dissipation overall and it would be much easier to move heat from the smoking CPU.
     
  2. crutschow

    Expert

    Mar 14, 2008
    13,056
    3,245
    I believe CMOS circuits (which I believe all modern CPUs are) generally get less efficient as the temperature goes up (MOSFET ON-resistance has a positive temperature coefficient).
    From that, they could overheat if not properly cooled, but they typically have a fan-cooled heat sink to keep their temperature below the critical point.

    On my HP desktop PC I notice that the fan speed increases when the computer is running computationally intensive applications; apparently the core is sped up for such calculations, which dissipates more power.
     
  3. DickCappels

    Moderator

    Aug 21, 2008
    2,664
    634
    And it would follow that the maximum clocking rate would be reduced at high temperatures. Or put another way, you could clock it faster when cooler. The amount of power dissipated is a function of capacitance, voltage and frequency, which are only an indirect function of temperature as mentioned above.
     
  4. wayneh

    Thread Starter Expert

    Sep 9, 2010
    12,154
    3,061
    So at a given clock rate, running a constant task, do we agree that the amount of heat being generated actually increases with temperature? That means the cooling system has to be very aggressive to counteract the positive feedback of the "heater" (the CPU).
     
  5. cmartinez

    AAC Fanatic!

    Jan 17, 2007
    3,574
    2,549
    I'm under the impression that such is the case. It's called runaway heating, and it increases exponentially if it's not aggressively taken care of. I've seen that phenomenon manifest in hydraulic systems, but it wouldn't surprise me if it applied to electronic circuits too.
     
  6. DickCappels

    Moderator

    Aug 21, 2008
    2,664
    634
    If the energy per clock cycle is C·V^2, and power dissipation (and therefore heat generated) is equal to clock frequency times the energy per clock cycle, I don't see how changing the temperature can have an effect on heat being generated.
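    As a quick sanity check, here is a back-of-the-envelope version of that switching-power estimate (a minimal Python sketch; C, V, and f are made-up illustrative values, and the switching activity factor is ignored):

```python
# Dynamic (switching) power from the relation above:
# energy per cycle = C * V^2, power = f * C * V^2.
# All values are assumed, illustrative numbers, not real CPU data.
C = 1e-9     # effective switched capacitance per cycle, farads (assumed)
V = 1.0      # core supply voltage, volts (assumed)
f = 3e9      # clock frequency, hertz (assumed)

E_cycle = C * V ** 2        # joules per clock cycle
P_dynamic = f * E_cycle     # watts

print(f"E per cycle = {E_cycle:.1e} J, dynamic power = {P_dynamic:.1f} W")
```

    Note that temperature appears nowhere in that estimate, which is exactly the point of the question below.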

    What mechanism are you considering as a possible cause for heat generation to increase?
     
  7. wayneh

    Thread Starter Expert

    Sep 9, 2010
    12,154
    3,061
    I was wondering about the temperature coefficient in semiconductors. A CPU is a boatload of transistors. If they all have their resistance go up or down in response to a temperature change, this should affect heat formation. Hold all else equal (clock, CPU load, etc.).
     
  8. cmartinez

    AAC Fanatic!

    Jan 17, 2007
    3,574
    2,549
    Well, among other things, I guess it has something to do with the fact that, in a chip, the electronic pathways are distributed throughout its volume, while its outer surface (or area) is its only means of dissipation.
     
  9. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,913
    2,181
    DickCappels and cmartinez like this.
  10. crutschow

    Expert

    Mar 14, 2008
    13,056
    3,245
    As NS noted, leakage currents are a large factor in power dissipation and that would go up with temperature.
    The other large power dissipator is charging and discharging of the various parasitic capacitances at the high clock rate, and that likely doesn't change much with temperature.
    The change in ON resistance with temperature actually would have little effect on the total power dissipation, only on the maximum switching speed.
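    As a rough illustration of that split (a minimal sketch with made-up constants; the "leakage roughly doubles every ~20 degC" figure is a common rule of thumb, not a datasheet value):

```python
# Total CPU power as dynamic (roughly temperature-independent) plus
# leakage (grows quickly with temperature). All constants are assumed,
# illustrative values.
def total_power(t_junction_c,
                p_dynamic=30.0,    # switching power, W (assumed constant with T)
                p_leak_25c=5.0,    # leakage power at 25 degC, W (assumed)
                t_double_c=20.0):  # leakage ~doubles every 20 degC (rule of thumb)
    p_leak = p_leak_25c * 2 ** ((t_junction_c - 25.0) / t_double_c)
    return p_dynamic + p_leak

for t in (25, 50, 75, 100):
    print(f"{t:3d} degC -> {total_power(t):5.1f} W total")
```

    With those (made-up) numbers, the dynamic term dominates near room temperature and leakage takes over as the die gets hot, which is the mechanism behind the positive-feedback concern.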
     
    cmartinez likes this.
  11. DickCappels

    Moderator

    Aug 21, 2008
    2,664
    634
    Yes, we can agree that dissipation increases with temperature.
     
  12. wayneh

    Thread Starter Expert

    Sep 9, 2010
    12,154
    3,061
    I'm not sure if you're joking about the level of agreement here? I mean, dissipation obviously increases and we can apparently all agree on that since it is self-evident. :p

    If you were not making a joke, are you weighing in on the side of heat production – joules per second – increasing with temperature?
     
  13. DickCappels

    Moderator

    Aug 21, 2008
    2,664
    634
    In post #6 I wondered what would make power dissipation increase with temperature. In post #9 nsaspook reminded us about leakage current, and since leakage current increases with temperature, power dissipation will increase with rising temperature if other factors are held constant. Crutschow nicely summarizes the factors contributing to power dissipation in post #10. Being in agreement, I withdraw my skepticism.
     
    wayneh likes this.
  14. AnalogKid

    Distinguished Member

    Aug 1, 2013
    4,546
    1,252
    That is the answer to your question in post #1. There is no "positive feedback" mechanism between the CPU die temperature and the rate at which it processes instructions. The clock frequency sets the CPU "speed", nothing else. Some systems measure the die temperature and alter the clock frequency to reduce heat, but that is an externally applied control loop and can have a simple or complex transfer function.
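    A toy sketch of such an external control loop (hypothetical numbers and a deliberately crude power/thermal model, just to show the idea of throttling the clock on temperature):

```python
# Toy thermal-throttling loop: measure die temperature, lower the clock when
# it exceeds a limit, raise it again when there is headroom. All constants
# are assumed, illustrative values, not any real CPU's behavior.
T_AMBIENT = 40.0    # degC (assumed)
R_THERMAL = 2.0     # degC per watt, die-to-ambient (assumed)
T_LIMIT = 90.0      # throttle threshold, degC (assumed)

f_ghz = 4.0         # starting clock frequency, GHz
for step in range(10):
    power_w = 8.0 * f_ghz                     # crude model: ~8 W per GHz (assumed)
    t_die = T_AMBIENT + R_THERMAL * power_w   # steady-state die temperature
    print(f"step {step}: f = {f_ghz:.2f} GHz, T_die = {t_die:.0f} degC")
    if t_die > T_LIMIT and f_ghz > 1.0:
        f_ghz -= 0.25                         # too hot: throttle down
    elif t_die < T_LIMIT - 5.0 and f_ghz < 4.0:
        f_ghz += 0.25                         # headroom: ramp back up
```

    With these numbers the loop settles at a lower clock rather than letting the die keep heating, but the transfer function could be made as simple or as complex as the designer wants.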

    In a general sense, yes, computations done per unit of heat produced is a very real concern in systems designed for low-power, high-reliability, or rugged-environment applications. This is why all modern microprocessors have extensive thermal models available.

    ak
     
  15. cmartinez

    AAC Fanatic!

    Jan 17, 2007
    3,574
    2,549
  16. wayneh

    Thread Starter Expert

    Sep 9, 2010
    12,154
    3,061
    I can't tease out which side you're on. Does more heat get produced when a CPU is operated at a higher temperature: yes, no change, or does it go down? Again, all else equal.
     
  17. MrSoftware

    Active Member

    Oct 29, 2013
    504
    124
    To boil it down: if leakage current increases with temperature in a CMOS transistor, and if leakage current is the major cause of heat in a CMOS-based CPU, then with everything else kept equal the rate of heat production in a CMOS-based CPU will increase as its temperature increases.
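    A minimal sketch of that feedback loop (made-up constants and the same rule-of-thumb exponential leakage model as above), showing how the same chip either settles or runs away depending on how good the cooling is:

```python
# Iterate the loop: die temperature sets leakage, leakage adds heat, heat
# raises die temperature. All constants are assumed, illustrative values.
T_AMB = 25.0        # ambient temperature, degC
P_DYN = 40.0        # dynamic power, W, taken as independent of temperature
P_LEAK_25 = 8.0     # leakage power at 25 degC, W (assumed)
T_DOUBLE = 20.0     # leakage ~doubles every 20 degC (rule of thumb)

def chip_power(t_die):
    return P_DYN + P_LEAK_25 * 2 ** ((t_die - 25.0) / T_DOUBLE)

def settle(r_thermal, steps=60):
    """Return the settled die temperature, or None if it never settles."""
    t_die = T_AMB
    for _ in range(steps):
        t_die = T_AMB + r_thermal * chip_power(t_die)
        if t_die > 150.0:
            return None     # thermal runaway
    return round(t_die, 1)

print(settle(0.3))   # good heat sink: converges to a safe temperature
print(settle(1.2))   # poor heat sink: leakage feedback runs away -> None
```

    The only difference between the two runs is the thermal resistance, i.e. how aggressively the cooling system removes the heat.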
     
  18. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,913
    2,181
    I keep hearing that silicon is a dead end. Eventually something will beat silicon-based computing, but almost every one of these 'new' discoveries has been rolled back into silicon-based manufacturing and extended its lifetime for another 20 years.

    https://cosmosmagazine.com/technology/why-silicon-computers-rule
     
  19. wayneh

    Thread Starter Expert

    Sep 9, 2010
    12,154
    3,061
    That seems to be the consensus.
     
  20. cmartinez

    AAC Fanatic!

    Jan 17, 2007
    3,574
    2,549
    I'm guessing that the industry is trying to avoid the switch to other materials, since the cost of developing new equipment and processes for the new materials, and replacing the old technology, would be substantial.
     