I understand that when a transformer is rated for a certain voltage at a certain amperage, it's usually the AC (RMS) rating that's listed, and that if the output is full-wave rectified, the available DC current is reduced. I think the rule of thumb I've heard is about 80% of the AC current rating. Could anyone tell me the correct rule of thumb, and ideally point me to a reference that derives it? I'm on a kick lately where I can't take things at face value without seeing a proof.
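To show the kind of derivation I'm after, here's a rough numerical sketch I put together: an idealized full-wave rectifier feeding a capacitor-input filter (all component values here are made up for illustration, and the diodes and transformer are treated as ideal apart from a small source resistance). It compares the transformer secondary's RMS current against the DC current delivered to the load, which is where a derating factor like "80%" would have to come from:

```python
import math

# Made-up example values -- not from any datasheet
VRMS = 12.0        # transformer secondary, volts RMS
F = 60.0           # mains frequency, Hz
C = 4700e-6        # filter capacitor, farads
RLOAD = 10.0       # load resistance, ohms
RSOURCE = 0.5      # lumped transformer + diode resistance, ohms
DT = 5e-6          # simulation time step, seconds
VPEAK = VRMS * math.sqrt(2)

def simulate(cycles=40):
    """Time-step an idealized full-wave rectifier with capacitor filter."""
    vcap, t = 0.0, 0.0
    n = int(cycles / F / DT)
    settle = n // 2                      # discard the start-up transient
    sum_i2 = sum_idc = 0.0
    count = 0
    for k in range(n):
        # Full-wave rectified input: |sine| from the secondary
        vin = abs(VPEAK * math.sin(2 * math.pi * F * t))
        # Diodes conduct only while the rectified input exceeds the cap voltage,
        # so the secondary current flows in short, tall pulses near each peak
        i_sec = (vin - vcap) / RSOURCE if vin > vcap else 0.0
        i_load = vcap / RLOAD
        vcap += (i_sec - i_load) * DT / C   # explicit-Euler capacitor update
        if k >= settle:
            sum_i2 += i_sec * i_sec
            sum_idc += i_load
            count += 1
        t += DT
    i_rms = math.sqrt(sum_i2 / count)    # what heats the transformer winding
    i_dc = sum_idc / count               # what the load actually sees
    return i_dc, i_rms

i_dc, i_rms = simulate()
print(f"I_dc = {i_dc:.2f} A, secondary I_rms = {i_rms:.2f} A, "
      f"ratio = {i_dc / i_rms:.2f}")
```

Because the charging current is concentrated in narrow pulses, its RMS value (which sets winding heating) exceeds the average DC output current, so the ratio printed comes out well below 1, which is presumably what any rule of thumb is approximating. I'd still like the analytical derivation rather than just a simulation.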