I have been looking at supercapacitors for a long-life power supply for a flashing stop sign that is charged by solar panels. It currently runs on a 4.8 V NiMH battery pack with 14 Ah of capacity, i.e. about 67 Wh.
While looking at capacitors I found a formula for available energy: (C·V1²/2) − (C·V2²/2). So if I buy one of those 2.7 V 500 F capacitors and run it from 2.7 V down to 1 V, I should get (500·2.7²/2) − (500·1²/2) joules, then divide by 3600 to get 0.43 Wh. If I add one in series to go from 5.4 V down to 1 V, I now get roughly 4× the watt-hours at 1.95 Wh, whereas in parallel I only get 0.87 Wh. By my calculations, 9 of them would charge to 24 V and give me 40 Wh. (I am aware I'll need to over-provision substantially due to parasitic drain.)
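To show my work, here is the arithmetic as a quick Python sketch (the `usable_wh` helper is just something I wrote for illustration):

```python
# Usable energy when discharging a capacitor between two voltages:
#   E = C * (V1^2 - V2^2) / 2   (joules)

def usable_wh(c_farads, v_full, v_empty):
    """Usable energy in watt-hours discharging C from v_full down to v_empty."""
    joules = c_farads * (v_full**2 - v_empty**2) / 2
    return joules / 3600  # 3600 J per Wh

print(usable_wh(500, 2.7, 1.0))   # single 500 F cap, 2.7 V -> 1 V: ~0.44 Wh
print(usable_wh(1000, 2.7, 1.0))  # two in parallel (1000 F), 2.7 V -> 1 V: ~0.87 Wh
```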
My brain then says: if energy increases with the square of the voltage but only linearly with capacitance, why would you ever put capacitors in parallel when you could put them in series? Is there a downside? (My guess is it must be power related, i.e. higher power density and charge speed.)
I guess my question is: if I can charge to a higher voltage, wouldn't it make sense to put several large capacitors in series to get a high voltage and then use a switching regulator to drop the voltage down to something I can use, rather than putting them in parallel and using a boost converter to push the voltage up to something I can use? Do I really get C·n² times more usable energy in series versus C·n times more energy in parallel?
Any other thoughts on the project would be appreciated.
Note: the sign runs its LEDs at a 50% duty cycle with a 5-second period.
