Capacitor as power supply circuit

Thread Starter

Trakyan

Joined Mar 11, 2016
10
My goal is to create a circuit that takes the decaying voltage and current you'd usually get from a capacitor and turns it into a constant voltage and current source. I realise that this is what batteries are for; however, I'd like to do this as a pet project, and it might be better suited than a battery for my specific case.

So far I've looked around at voltage regulators and have toyed with the idea of overcharging the capacitor (i.e. if I need 10 volts, charge it to 20) and connecting it to a voltage regulator that outputs 10 volts. I figure this will give me a 10 volt power supply until the capacitor discharges to below 10 volts (I know that in practice voltage regulators require their input voltage to be a few volts higher than the intended output, but you get the point). However, I'm not sure what would happen to the current in this case. If it stays stable, that's great; if it doesn't, I'm a bit lost as to how to handle it.

On a side note, would a high-current load or a high-voltage load drain a capacitor faster, assuming both use the same amount of power (i.e. the high-current load being 10 amps at 2 volts and the high-voltage load being 2 amps at 10 volts)? On one hand I want to say they'd drain it at the same rate; however, the explanation given in school, of capacitors being charged by electrons gathering on the plates, makes me think that a higher current would drain it quicker, since it would take less time for the coulombs of charge on either plate to leave.

Thank you.
 

tcmtech

Joined Nov 4, 2013
2,867
Play with some numbers here and see what you find. Unless you have some huge capacitors and a tiny load, the overall concept is very disappointing in terms of run time.

http://mustcalculate.com/

Oh yeah, and if you try to charge a capacitor to double its rated voltage, they tend to explode; they have a voltage rating for a reason. :rolleyes:
 

Thread Starter

Trakyan

Joined Mar 11, 2016
10
I know that the run times are fairly small; I did some calculations of my own a while ago. Also, I didn't mean charge it to double its rated voltage, I meant charge it to double the voltage required by the load.
 

AnalogKid

Joined Aug 1, 2013
10,987
The total energy in a capacitor is 1/2 x C x V^2. For example, a 1000 uF (0.001 F) cap charged to 20 V holds 0.5 x 0.001 x 20 x 20 = 0.2 Ws (watt-seconds). From here it is straightforward: if the load is 0.1 W, it will run for 2 seconds. While this equation is technically correct, it gives misleading results, because no conversion circuit that takes a wide range of input voltages and produces a constant output voltage can run on an input all the way down to 0 V.

When determining what a cap can do for real, figure out the high and low operating voltage range for the power converter, calculate the total capacitor energy at those two voltages, and subtract the two. That is the real energy you have to work with, and from there you can adjust the capacitor size to get the runtime you need.
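Here's a minimal Python sketch of that bookkeeping; the 1000 uF cap, 20 V / 10 V window, and 0.1 W load are just the example numbers from above:

```python
# Usable energy between two converter operating voltages, and the
# runtime it buys at a given load power. E = 1/2 * C * V^2 at each end.
C = 0.001       # farads (1000 uF)
V_HIGH = 20.0   # volts, fully charged
V_LOW = 10.0    # volts, minimum converter input
P_LOAD = 0.1    # watts drawn by the load

usable_ws = 0.5 * C * (V_HIGH**2 - V_LOW**2)  # watt-seconds you can actually use
runtime_s = usable_ws / P_LOAD

print(f"{usable_ws:.3f} Ws usable -> {runtime_s:.2f} s of runtime")
# 0.150 Ws usable -> 1.50 s of runtime
```

Note how the 0.15 Ws you can actually use falls well short of the 0.2 Ws total; that's the misleading part.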

Watts is watts. Whether high or low voltage, or high or low current, the capacitor holds watt-seconds and the load consumes watts. Accurately characterizing a complex load can be tricky, but once you know the power it requires, sizing the capacitor is easy. A capacitor is not a volumetrically efficient energy storage device, so sometimes the answer is depressing, but it's still easy.

ak
 

BobTPH

Joined Jun 5, 2013
8,813
I know that the run times are fairly small; I did some calculations of my own a while ago. Also, I didn't mean charge it to double its rated voltage, I meant charge it to double the voltage required by the load.
If you are talking about 20W, the run times will indeed be small. Probably less than a second.

Bob
 

Thread Starter

Trakyan

Joined Mar 11, 2016
10
That depends on how big the capacitor is and how much I charge it, I believe. However, my question wasn't about how long the run times would be; it was whether anyone knew how to achieve a constant-voltage, constant-current power supply from a capacitor.
 

Lestraveled

Joined May 19, 2014
1,946
Here is a simple equation that shows the relationship between volts, amps and time in a capacitor: change in voltage / change in time = current / capacitance (dV/dt = I/C).

Example: if you draw 1 amp from a 1 farad cap, the voltage will fall at 1 volt per second.

Another example: draw 10 milliamps from a 100 uF cap: 0.01 A / 0.0001 F = 100 volts per second. In other words, you would have to charge a 100 uF cap up to 100 volts in order to draw 10 mA for one second.
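If it helps, the same relationship as a quick Python sketch, with both of the examples above plugged in:

```python
# dV/dt = I / C: how fast the cap voltage falls at a constant current draw
def sag_rate_v_per_s(current_a: float, cap_f: float) -> float:
    return current_a / cap_f

print(sag_rate_v_per_s(1.0, 1.0))       # 1 A from 1 F      -> 1.0 V/s
print(sag_rate_v_per_s(0.010, 100e-6))  # 10 mA from 100 uF -> 100.0 V/s
```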
 

mcgyvr

Joined Oct 15, 2009
5,394
That depends on how big the capacitor is and how much I charge it, I believe. However, my question wasn't about how long the run times would be; it was whether anyone knew how to achieve a constant-voltage, constant-current power supply from a capacitor.
No different from any other power source, except that the voltage is decreasing with time.
So you need a circuit that can reduce the output voltage (buck) when needed and increase it when needed (boost):
https://en.wikipedia.org/wiki/Buck–boost_converter

The responses so far are basically to let you know that it's probably a bad idea in the first place, depending on the load.

I can't remember what I was looking at the other day, but it was a device that used a capacitor as a battery to keep the device's memory alive when it wasn't powered. Users were upset because it would quickly lose its memory and needed to be reprogrammed each time they lost power for more than a few minutes. A comparable product from another manufacturer used a battery and didn't have that problem. Guess who sold more units.
 

OBW0549

Joined Mar 2, 2015
3,566
That depends on how big the capacitor is and how much I charge it, I believe.
The biggest, beefiest, most absurdly bad-assed capacitor I could find on Digi-Key was a 6000 Farad (that is, six billion microfarads) capacitor rated at 2.5 volts. It is 3" in diameter and 6.6" long, and costs $326 US. If you take eight of them (total cost: $2608) and wire them in series, that will give you a 750 Farad capacitor rated at 20 volts. If you charge it up to 20 volts, then let it drain down to just over 10 volts through a 10 volt regulator to power your load, you will end up delivering a total of 7500 coulombs (ampere-seconds) of charge, or a little over 2 amp-hours.
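A quick Python sanity check on that arithmetic, using the same figures:

```python
# Eight 6000 F, 2.5 V supercaps in series: capacitance divides, ratings add.
n, c_each_f, v_each = 8, 6000.0, 2.5

c_bank_f = c_each_f / n   # 750 F
v_bank = v_each * n       # 20 V

dv = 20.0 - 10.0          # usable swing into the 10 V regulator
charge_c = c_bank_f * dv  # Q = C * dV = 7500 coulombs (ampere-seconds)
print(c_bank_f, v_bank, charge_c, charge_c / 3600)
# 750.0  20.0  7500.0  ~2.08 amp-hours
```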

Or, you could just put three 18650-type Li-ion batteries in series (total cost: ≈ $12) feeding your 10 volt regulator, and achieve the same thing.

Bottom line is, you're wasting your time with this idea.

However, my question wasn't about how long the run times would be; it was whether anyone knew how to achieve a constant-voltage, constant-current power supply from a capacitor.
You can't.
 

AnalogKid

Joined Aug 1, 2013
10,987
If all of the capacitor voltage range is either greater than or less than the regulator's minimum input voltage, then either a buck or a boost circuit will be the most efficient. If you want to get more "stretch" out of the capacitor by letting its voltage swing across the regulator's input threshold, such as the capacitor discharging from 20 V to 5 V while powering a regulator with a 10 V minimum input, then you need a buck-boost circuit. This is more complex, but there are excellent controller chips from LT. It's hard to give more detail without more specifics about your system.
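As a rough Python sketch of what that stretch buys (a 1 F cap is an arbitrary example, and converter losses are ignored):

```python
# Usable watt-seconds for a buck-only window (20 V -> 10 V) versus a
# buck-boost window (20 V -> 5 V), for the same capacitor.
def usable_ws(cap_f: float, v_high: float, v_low: float) -> float:
    return 0.5 * cap_f * (v_high**2 - v_low**2)

C = 1.0  # farads, arbitrary
print(usable_ws(C, 20.0, 10.0))  # buck only:  150.0 Ws
print(usable_ws(C, 20.0, 5.0))   # buck-boost: 187.5 Ws, 25% more
```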

ak
 

Thread Starter

Trakyan

Joined Mar 11, 2016
10
From what I understand, a buck-boost circuit would give a changing output current. I was wondering if there was a way to discharge a capacitor from, say, 20 V to 10 V (the numbers are arbitrary) through a voltage regulator that outputs 10 volts at a constant current.
For the moment my only real goal is to make the power supply; I have no specific application for it as of yet. I'll work that out depending on how much it can handle.

The reason I'm doing this is that I was looking for something capable of rapid and frequent charge/discharge cycles; such cycling usually damages batteries. As it stands, I'll probably use it together with a battery.
 

AnalogKid

Joined Aug 1, 2013
10,987
You set the output voltage, and the load determines how much current it draws at that voltage. Or, you set the output current, and the load determines the voltage developed across it. You can't hold both voltage and current constant into a variable load.

ak
 

Thread Starter

Trakyan

Joined Mar 11, 2016
10
I get that V=IR, but when you use a transformer or a buck/boost converter to change the voltage from, say, 10 to 20 volts with a constant 10 ohm load, you don't go from 1 amp to 2 amps, as that would be an increase in power. As far as I'm aware, if you boost the voltage you decrease the current and vice versa.

What I'm trying to do is drop the voltage from a cap charged to, say, 20 volts down to 10 volts and draw a fixed amount of current until that capacitor discharges to 10 volts. When charged to 20 volts a capacitor may be able to provide 20 volts and, let's say, 5 amps, but I'd like a way to limit it to just 10 volts and, let's say, 1 amp until it discharges to a point where it can no longer provide 10 volts at 1 amp.
 

Lestraveled

Joined May 19, 2014
1,946
I just got it, and I am so sorry for you. To not be able to use a calculator to divide or multiply two numbers must be awful. And then to have all these people on this forum, who are trying to help you understand the folly of your effort, and you not having the ability to comprehend them, must be incredibly frustrating. What you have presented in this thread tells me that operating a gas pump must be quite a challenge for you. Also, I am sure that you have a very exciting time arguing with recorded telephone messages.

So, please, we are not here to indulge your blatant misconception of what you don't understand, nor do we find your delusions entertaining. If you have something you really wish to accomplish, great, spit it out, and we will help. Otherwise, seek counseling.
 

AnalogKid

Joined Aug 1, 2013
10,987
Wrong, in multiple ways.
I get that V=IR, but when you use a transformer or a buck/boost converter to change the voltage from, say, 10 to 20 volts with a constant 10 ohm load, you don't go from 1 amp to 2 amps
Yes, you do, unless the total system power is limited at the input. If you double the output voltage into a constant load, the output current will double - ***and so will the input current***. You don't seem to grasp that it is the load that sets the output current, not the input. The case where the input current is so limited that it starves the output is bad design, not the norm. The case where the output current is electronically limited to prevent magnetics saturation or switching device failure is a design feature, not a fundamental condition of the physics.
as that would be an increase in power.
Yes, it is. P=I^2xR (Joule's Law). P=VI (Watt's Law). Either way, if the power source has the extra power available, then when you double the output voltage, you double the output current (in a constant load resistance).
As far as I'm aware, if you boost the voltage you decrease the current and vice versa.
Only in a constant input power system. If you start with a 20 W power supply, then for a constant 20 W, if the output voltage goes up the output current goes down. But you are talking about a constant load system. If the load is a fixed 10 ohms, then when the output voltage goes up, the output current will go up (Ohm's Law) and the output/load power will go up (Joule's and Watt's Laws).

With your clarified description of what you want to do, you can follow a buck/boost constant-output-voltage converter with a constant-current limiter, and you will have an output that has both a maximum voltage and a maximum current. But there is only one load resistance that will hit both maxima at once. If the load resistance is greater than that, the voltage stays the same and the current decreases. If the load is less than that, the current stays the same and the voltage decreases.
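A toy Python model of that CV/CC behavior, with a 10 V / 1 A limit (so the crossover load is 10 ohms):

```python
# Constant-voltage / constant-current limiting into a resistive load.
V_MAX, I_MAX = 10.0, 1.0  # volts, amps

def output(r_load_ohm: float) -> tuple[float, float]:
    i = V_MAX / r_load_ohm            # current the load would draw at V_MAX
    if i <= I_MAX:
        return V_MAX, i               # CV mode: voltage held, current below limit
    return I_MAX * r_load_ohm, I_MAX  # CC mode: current held, voltage sags

print(output(20.0))  # (10.0, 0.5)  -> CV mode
print(output(10.0))  # (10.0, 1.0)  -> both maxima at once
print(output(5.0))   # (5.0, 1.0)   -> CC mode
```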

ak
 

NPN-1

Joined Mar 11, 2016
16
On a side note, would a high-current load or a high-voltage load drain a capacitor faster?
There is no such thing as a high-voltage load. A load is something that draws current from a source, and loads can draw low or high currents. As far as I know, a "high-voltage load" makes no sense.
 

tcmtech

Joined Nov 4, 2013
2,867
Very little of what he is saying and wanting makes any sense.

Either he has very little understanding of electrical terminology, or he has a very poor grasp of how electricity works in a circuit. :rolleyes:

Taking 20 volts down to 10 volts at any current is just a matter of sizing a buck converter circuit to handle the amperages involved. Same with using a boost converter or a combo converter.

For a 10 volt 1 amp load,

With the input at 20 volts (buck mode) there would be just over a 500 mA draw, rising to just over 1 amp as the input drops to 10 volts.
After that (going into boost mode), the input amps would continue to rise in proportion as the input voltage keeps dropping, resulting in an input of just over 2 amps @ 5 volts, or just over 10 amps @ 1 volt input, and so on.
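Those figures as a Python sketch, assuming roughly 90% converter efficiency (an assumption, and where the "just over" comes from):

```python
# Input current needed to hold a 10 V, 1 A (10 W) output as the cap drains.
P_OUT_W = 10.0
EFFICIENCY = 0.90  # assumed; real converters vary with load and topology

def input_current_a(v_in: float) -> float:
    return P_OUT_W / (EFFICIENCY * v_in)

for v_in in (20.0, 10.0, 5.0, 1.0):
    print(f"{v_in:>4.0f} V in -> {input_current_a(v_in):5.2f} A draw")
# 20 V -> 0.56 A, 10 V -> 1.11 A, 5 V -> 2.22 A, 1 V -> 11.11 A
```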
 