Simple question: as a CPU gets hotter during use, does its efficiency go up or down? By efficiency I mean computations performed per unit of heat produced.

If efficiency goes down with increasing temperature, it seems like the CPU would overheat from the positive feedback: higher temperature causes more heat to be produced per computation, raising the temperature still higher. If efficiency goes up with temperature, then the ideal operating condition would be as hot as practical. You'd need less heat dissipation overall, and it would be much easier to move heat away from the smoking-hot CPU.
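To make the feedback loop I'm describing concrete, here's a toy numerical sketch. All constants are invented for illustration; the one physical assumption is that leakage power grows roughly exponentially with temperature (the usual reason heat-per-computation rises as a chip warms), while the cooler removes heat in proportion to the temperature rise above ambient:

```python
# Toy model of the thermal feedback loop: does the die settle at an
# equilibrium, or run away? All numbers are made up for illustration.
import math

def simulate(k_cool, steps=20000, dt=0.01):
    """Integrate die temperature over time for a given cooling coefficient
    k_cool (watts removed per degree C above ambient). Returns final temp."""
    T_amb = 25.0     # ambient temperature, C
    T = T_amb        # die temperature, C
    C = 10.0         # heat capacity of die + heatsink, J/C (invented)
    P_dyn = 50.0     # switching power, W, roughly temperature-independent
    P_leak0 = 10.0   # leakage power at ambient, W (invented)
    alpha = 0.04     # leakage ~ exp(alpha * dT): efficiency falls as T rises
    for _ in range(steps):
        P_heat = P_dyn + P_leak0 * math.exp(alpha * (T - T_amb))  # generated
        P_cool = k_cool * (T - T_amb)                             # removed
        T += (P_heat - P_cool) * dt / C
        if T > 300.0:        # call anything past this thermal runaway
            return T
    return T

good_cooler = simulate(k_cool=5.0)  # strong heatsink: settles near 38 C
weak_cooler = simulate(k_cool=1.2)  # weak heatsink: the feedback wins
```

With the strong cooler the extra leakage from each degree of warming is smaller than the extra cooling, so the temperature converges; with the weak cooler the leakage growth outruns the cooling at every temperature and the model runs away, which is the scenario my question is about.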