Capacitive voltage divider problem

Thread Starter


Joined Jun 29, 2017

I am working on a project that contains a capacitive voltage divider. To test it, I made a simple schematic in LTSpice: two capacitors in series across a voltage source, and I probed the voltage at the node between them. Say the supply is VDD and the capacitors have the same value, so the node voltage should be VDD/2. The problem is that the voltage drops to nearly 0 V after some time. Theoretically there is no way it should discharge like that; the capacitors and the source are ideal. Does anyone have an explanation for this, and how could I fix it? I attached the schematic, but as I said, it is really simple.
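For reference, the expected node voltage follows from charge conservation: series capacitors carry the same charge Q, so VDD = Q/C1 + Q/C2 and the node voltage is VDD·C1/(C1+C2). A minimal sketch of that relation (the function name and values are illustrative, not from the original post):

```python
def cap_divider_vout(vdd, c1, c2):
    """Ideal node voltage between C1 (top, to VDD) and C2 (bottom, to GND).

    Series capacitors carry the same charge Q, so
    VDD = Q/C1 + Q/C2  and  Vout = Q/C2 = VDD * C1 / (C1 + C2).
    """
    return vdd * c1 / (c1 + c2)
```

For equal capacitors this reduces to VDD/2, which is the value the original poster expected to see.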



Joined Sep 17, 2013
Hmm. Clearly the time scale is important. I wonder if the problem is something to do with rounding errors, tolerances, or time steps in the solver.

Here is an example of where some practical sense has to be used, as opposed to just diving into the simulation software. The DC voltage across two capacitors in series connected to a DC source will ultimately appear almost entirely across the one with the lower leakage. The leakage resistance may be on the order of gigaohms, but it is there. Also, if you try to measure the real circuit with a meter, the meter's finite input resistance loads the node as well.

What are you trying to accomplish with this circuit? One solution is to place resistors across the capacitors to ensure predictable voltage division.
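The leakage argument above can be checked numerically. Modeling each capacitor with a parallel leakage resistor (R1 across C1, R2 across C2), KCL at the middle node gives (C1+C2)·dv/dt = (VDD−v)/R1 − v/R2, so the node starts at the capacitive division value and decays with time constant (C1+C2)·(R1∥R2) toward the resistive division value VDD·R2/(R1+R2). A sketch with assumed, illustrative component values (not the original poster's circuit):

```python
def node_voltage_after(vdd, c1, c2, r1, r2, t_end, steps=100_000):
    """Middle-node voltage of a cap divider with leakage, after t_end seconds.

    C1 (leakage R1) sits between VDD and the node; C2 (leakage R2) between
    the node and ground. Forward-Euler integration of the KCL equation:
        (C1 + C2) * dv/dt = (VDD - v)/R1 - v/R2
    """
    v = vdd * c1 / (c1 + c2)        # ideal capacitive division at t = 0
    dt = t_end / steps
    for _ in range(steps):
        v += dt * ((vdd - v) / r1 - v / r2) / (c1 + c2)
    return v
```

With equal 1 uF capacitors but unequal leakage (say R1 = 1 GOhm, R2 = 1 TOhm), the node drifts from VDD/2 up toward nearly the full VDD across the lower-leakage capacitor, exactly as described above; with equal, deliberately added bleed resistors the node stays put at the resistive division point.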
Here is an example of where some practical sense has to be used as opposed to just diving into the simulation software. The voltage across two capacitors in series connected to a DC source will ultimately mostly all appear across the one with lower leakage. The leakage resistance may be on the order of Giga-Ohms, but it is there. Also, you likely are trying to measure it on the actual circuit with a meter that has some finite resistance also. What are you trying to accomplish with this circuit? One solution is to place resistors across the capacitors to ensure voltage division.