So, why does increasing the supply voltage speed a circuit up? I understand that it increases dynamic power consumption via P = C·V²·f, and that raising the frequency obviously makes everything run faster, but I can't see why a higher voltage reduces the propagation delay through the circuit.
Is it simply that, by increasing the voltage while keeping the resistance constant, we increase the current, so the internal capacitances charge and discharge faster? Or is it something more complicated?
Any help will be much appreciated, thanks.
(I'm talking about digital CMOS circuits)
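To make my mental model concrete, here is a rough sketch of how I picture it, using the alpha-power law delay approximation t_d ≈ C_load·V_dd / I_drive with I_drive ∝ (V_dd − V_th)^α. All the parameter values (`c_load`, `v_th`, `alpha`, `k`) are made up for illustration, not taken from any real process:

```python
# Rough sketch of CMOS gate delay vs. supply voltage using the
# alpha-power law model: t_d ~ C_load * V_dd / I_drive,
# with I_drive proportional to (V_dd - V_th)**alpha.
# All device parameters below are illustrative assumptions.

def gate_delay(v_dd, c_load=10e-15, v_th=0.4, alpha=1.3, k=1e-4):
    """Approximate propagation delay (seconds) of a CMOS gate."""
    i_drive = k * (v_dd - v_th) ** alpha  # saturation drive current (A)
    return c_load * v_dd / i_drive        # time to (dis)charge the load

for v in (0.8, 1.0, 1.2):
    print(f"Vdd = {v:.1f} V  ->  delay ~ {gate_delay(v) * 1e12:.0f} ps")
```

Under this model the delay falls as V_dd rises, because the drive current grows faster than the charge C·V_dd that has to be moved. Is that the right way to think about it?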