Why does increasing the supply voltage speed circuits up?

Thread Starter

Jatepola

Joined Apr 7, 2010
1
So, why does increasing the supply voltage speed circuits up? I see that it increases power consumption due to P = C*V^2*f, and I see that increasing the frequency obviously makes everything run faster, but I can't see why increasing the voltage reduces the propagation delay in the circuit.

Is it just that, by increasing the voltage while keeping the resistance constant, we're actually increasing the current, so the electrons move faster and the internal capacitances charge and discharge faster? Or is it something more complicated?

Any help will be much appreciated, thanks.

(I'm talking about digital CMOS circuits)
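
(For concreteness, here's that power relation evaluated with some assumed numbers; just a sketch, not real chip values:)

```python
# Rough sketch of the dynamic-power relation P = C * V^2 * f.
# All values below are assumed for illustration, not from a real chip.
C = 1e-9   # total switched capacitance in farads (assumed)
f = 100e6  # clock frequency in hertz (assumed)

for v in (1.8, 3.3, 5.0):
    p = C * v**2 * f
    print(f"V = {v} V -> P = {p:.2f} W")
```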
 

Ghar

Joined Mar 8, 2010
655
The time required to charge a capacitor to a given fraction of the supply is independent of the supply voltage.
You do get a higher charging current, but the node also has to swing up to a proportionally higher voltage, so there is no net effect.
Remember that the time constant is RC: for a fixed resistance, the charging time doesn't change.
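
A quick numerical sketch of that point (with assumed R and C values and a 50% logic threshold) shows the supply voltage cancelling out of the charging time:

```python
import math

# RC node charging from 0 toward Vdd: v(t) = Vdd * (1 - exp(-t/(R*C))).
# Time to reach a fixed fraction of Vdd: t = R*C*ln(1/(1 - fraction)).
# Note that Vdd cancels out entirely.
R = 1e3         # resistance in ohms (assumed)
C = 1e-12       # capacitance in farads (assumed)
fraction = 0.5  # logic threshold as a fraction of Vdd (assumed)

for vdd in (1.8, 3.3, 5.0):
    t = R * C * math.log(1 / (1 - fraction))  # independent of vdd
    print(f"Vdd = {vdd} V -> time to {fraction:.0%} of Vdd = {t*1e12:.1f} ps")
```

Every supply voltage gives the same 0.69*RC, which is why a fixed R can't explain the speed-up.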

The speed-up comes from reducing the on-resistance of your transistors. The higher gate-source voltage makes them "turn on" harder, and that reduction in resistance causes the speed-up.

You could also say that you're increasing the current output of the transistors, if you're thinking of it as a current source charging a capacitor.

Which picture is appropriate depends on the region of operation. In CMOS switching the transistors pass through all three regions (cutoff, saturation, and triode), so both ideas apply at some point during the transition.
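
As a first-order sketch of the current-source picture (a simple square-law saturation model with assumed parameters, not a real process):

```python
# Treat the transistor as a current source I = k*(Vdd - Vt)^2 charging the
# load C, so the gate delay is roughly t = C * (Vdd/2) / I.
# All parameter values below are assumed for illustration.
C = 10e-15   # load capacitance in farads (assumed)
k = 100e-6   # transconductance parameter in A/V^2 (assumed)
Vt = 0.7     # threshold voltage in volts (assumed)

for vdd in (1.2, 1.8, 3.3, 5.0):
    i_on = k * (vdd - Vt) ** 2      # saturation drive current
    t_delay = C * (vdd / 2) / i_on  # time to swing half the supply
    print(f"Vdd = {vdd} V -> I_on = {i_on*1e6:.0f} uA, delay = {t_delay*1e12:.1f} ps")
```

The drive current grows roughly as (Vdd - Vt)^2, faster than the Vdd/2 swing it has to supply, so the delay falls as the supply rises.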
 

Audioguru

Joined Dec 20, 2007
11,248
Ordinary CD4xxx CMOS logic is fairly slow unless the supply is up near its 18V maximum.
74HCxxx CMOS is also low power, but it is high speed and works from a supply of 2V to 6V.
 