I am building a variable power supply, and in theory I have everything I need. I will take 120 V AC from an outlet and feed two transformers: a 115 V to 6 V unit and a 115 V to 60 V center-tapped unit. Their outputs will be rectified and sent through a low-pass filter to obtain three lines at a nominal +6 V, +30 V, and -30 V; in reality the outputs will be around +7.7 V, +42.3 V, and -42.3 V. After that I plan to regulate the voltages with zener diodes to obtain three stable rails feeding the part that modulates the output voltage.

I will use three 555 timers, probably powered from the 6 V line, to generate three PWM signals, which will then be fed through a low-pass filter to get a variable analog voltage. (I originally planned to use an LTC6992 to generate the PWM signal, but it only comes in a surface-mount package, and my hands are too shaky to solder it onto a breakout board.) Each analog voltage then goes into an op-amp with an appropriate amount of gain to produce the desired output voltage, which passes through a voltmeter and an ammeter on its way to the supply's output terminals.

The issue I am currently facing is that in my simulation, the voltage after the regulator does not stay at the zener voltage: whenever a load is applied, the regulator's output drops in proportion to the load. Here is the circuit simulation; the left half is the rectification stage and the right half is the regulators. Note that whenever a resistor is placed between an output line and ground, the output voltage drops. Is there any way to prevent it from dropping? If that is not possible, is there a way to minimize the drop until the current draw reaches one amp?
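To show the sag I'm describing numerically, here is a rough ideal-zener model of one shunt regulator stage. The component values (42.3 V input, 30 V zener, 470 Ω series resistor) are assumed for illustration and are not necessarily the ones in my schematic:

```python
# Shunt zener regulator: Vin -> Rs -> output node, with the zener and the
# load resistor both from the output node to ground.
Vin = 42.3   # rectified/filtered input, volts (assumed)
Vz = 30.0    # zener voltage, volts (assumed)
Rs = 470.0   # series resistor, ohms (assumed)

def vout(R_load):
    # Voltage the plain Rs/R_load divider would produce with the zener absent.
    v_div = Vin * R_load / (Rs + R_load)
    # If the divider voltage exceeds Vz, the (ideal) zener conducts and clamps
    # the node at Vz; otherwise the zener is off and the output just follows
    # the divider - which is exactly the load-proportional drop I'm seeing.
    return min(v_div, Vz)

for R in (10_000, 1_000, 100):
    print(f"R_load = {R:>6} ohm -> Vout = {vout(R):.2f} V")
```

With these numbers the zener can only supply (42.3 − 30)/470 ≈ 26 mA before it drops out of regulation, so anything close to a 1 A load collapses the rail.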
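For reference, this is the back-of-envelope math I'm using for the 555 PWM-to-analog stage. The filter values (10 kHz PWM, 10 kΩ / 1 µF RC, 6 V supply, 50 % duty) are example assumptions, not final choices:

```python
import math

# First-order RC low-pass turning a 555 PWM output into a DC control voltage.
f_pwm = 10_000.0        # PWM frequency, Hz (assumed)
R, C = 10_000.0, 1e-6   # filter components (assumed)
Vcc, duty = 6.0, 0.5    # 555 supply rail and duty cycle (assumed)

f_c = 1 / (2 * math.pi * R * C)   # filter cutoff frequency
v_avg = Vcc * duty                # DC level the filter settles to
# Rough peak-to-peak ripple when f_pwm >> f_c: the PWM fundamental is
# attenuated by roughly f_c / f_pwm.
ripple_pp = Vcc * (f_c / f_pwm)

print(f"cutoff ~ {f_c:.1f} Hz, DC out = {v_avg:.1f} V, "
      f"ripple ~ {ripple_pp * 1000:.1f} mVpp")
```

So with the cutoff a few decades below the PWM frequency the ripple stays in the millivolt range, at the cost of slower response when the duty cycle changes.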
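And this is how I'm sizing the op-amp gain stage that scales the filtered PWM voltage up to the output range, again with assumed example numbers (0-6 V control range, 0-30 V output, 10 kΩ ground-leg resistor):

```python
# Non-inverting op-amp stage: gain = 1 + Rf/Rg.
v_ctrl_max = 6.0    # max filtered-PWM control voltage, volts (assumed)
v_out_max = 30.0    # desired max rail voltage, volts (assumed)

gain = v_out_max / v_ctrl_max        # required closed-loop gain
Rg = 10_000.0                        # chosen ground-leg resistor (assumed)
Rf = (gain - 1) * Rg                 # feedback resistor that gives that gain

print(f"gain = {gain:.1f}, Rf = {Rf / 1000:.0f} k ohm with Rg = 10 k ohm")
```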