# current vs. voltage out

Discussion in 'General Electronics Chat' started by new2circuits, Aug 10, 2009.

1. ### new2circuits Thread Starter Member

Hi,

Can someone explain why the output of a circuit is sometimes specified as a current and at other times as a voltage?

Thanks

2. ### mik3 Senior Member

Can you give two examples so we can compare them?

3. ### t06afre AAC Fanatic!

I guess it is a figure of speech. If you have a circuit that draws very little current, you might just say that the circuit needs, say, 12 volts, because the current is not an issue and you may use any 12 volt source. But sometimes the output voltage is more or less a given, like the oven you use for cooking dinner: everybody knows the mains voltage is fixed, so you only specify how much current the oven needs. But the correct thing is always to specify both voltage and current. For example: this circuit is a 12 volt, 4 ampere max power supply.

4. ### kkazem Active Member

Hi,

I think I have a better explanation of why some circuit outputs are specified in terms of output voltage and others are specified in terms of output current. After more than 30 years as an electrical engineer and circuit designer, I'd better know.

Perhaps this is the best way I can explain it. Some circuits are intended to be constant-current sources, and obviously, the current will be specified. Other times, a circuit might put out a constant voltage, and therefore a voltage will be specified. Good examples of these are power supplies, like laboratory power supplies. However, having said that, a constant-current output will only work over a certain voltage "compliance range", and a constant-voltage output will only work up to its maximum output current. Therefore, in some sense, both current and voltage, or current and power, or voltage and power need to be specified to completely characterize the output we are discussing. By Ohm's law and the power relation, we can specify either voltage and maximum current, or voltage and maximum power, and either way is valid. The same is true for a constant-current output: specify current and voltage, or current and power.
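To make the bookkeeping concrete, here is a quick sketch of how any two of (voltage, max current, max power) fix the third. The 12 V / 4 A numbers are just illustrative:

```python
# Equivalent ways to specify a supply's output limits.
# Since P = V * I, any two of (V, I_max, P_max) determine the third.

def max_power(volts, max_amps):
    """Maximum output power from the voltage and the current limit."""
    return volts * max_amps

def max_current(volts, max_watts):
    """Current limit recovered from the voltage and the power limit."""
    return max_watts / volts

p = max_power(12.0, 4.0)   # a "12 volt, 4 ampere" supply -> 48 W
i = max_current(12.0, p)   # and back again -> 4 A
print(p, i)
```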

Not that I'm trying to confuse you, but most lab power supplies can be either constant current or constant voltage or even both at the same time. How can it be both at the same time? Let's say you have a lab supply set to an output voltage of 10 VDC and you have some varying load resistance connected to the lab supply's output. You can also set the lab supply's constant current output to 10 Amps DC for example. Now, if the load resistance is more than 1 Ohm, the lab supply will be in constant voltage mode as the current drawn will be less than 10 Amps. But if the load resistance drops down to less than 1 Ohm, then the lab supply will be in constant current mode. As the load starts from just above 1 ohm and drops to just below 1 ohm, there will be a transition region where the voltage-mode control and the current-mode control will be fighting each other for control. And the performance depends on how much loop gain each of the controls has (voltage-mode versus current-mode).

In circuit analysis, obviously, a current source will be specified with a current output and is normally assumed to be an ideal current source for the purposes of the analysis, which means there is no limit to the output voltage compliance range it can handle. Similarly, with a voltage source that is assumed ideal in circuit analysis, there will be no limit on the amount of current that can be drawn from the voltage source. When performing a PSPICE circuit analysis or equivalent, we can choose to make the voltage and current sources ideal or non-ideal.
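As a small sketch of the ideal vs. non-ideal distinction: a non-ideal voltage source is commonly modelled as an ideal source in series with an internal resistance (the 1 ohm value here is just an assumption for illustration):

```python
# Ideal vs. non-ideal voltage source driving a resistive load.
# A non-ideal source is modelled as an ideal source of v_src in
# series with an internal resistance r_s; r_s = 0 gives the ideal case.

def load_current(v_src, r_load, r_s=0.0):
    """Current delivered to r_load."""
    return v_src / (r_s + r_load)

def terminal_voltage(v_src, r_load, r_s=0.0):
    """Voltage actually seen at the load terminals."""
    return load_current(v_src, r_load, r_s) * r_load

# Ideal 10 V source: terminal voltage stays 10 V regardless of load
print(terminal_voltage(10.0, 5.0))           # 10.0
# Non-ideal source with 1 ohm internal resistance: voltage sags
print(terminal_voltage(10.0, 5.0, r_s=1.0))  # ~8.33
```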

That's about all I can say on that topic. If you have any specific questions about it, please reply.

Regards,
Kamran Kazem
kkazem