I've been trying to figure out a defective car battery charger and am confused by the design strategy used (I don't have a schematic). As far as I can see, the incoming 120 V mains is first rectified to about 170 V DC and then, using a power FET oscillator, converted back to AC and put through a transformer, whose output is then rectified to give the approximately 13 V DC needed to charge the battery.
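(I assume the 170 V comes from rectifying the mains straight to its peak value rather than its RMS value, i.e.

$$V_{peak} = 120\,\mathrm{V}_{RMS} \times \sqrt{2} \approx 170\,\mathrm{V}$$

which would be consistent with a bridge rectifier and smoothing capacitor sitting directly on the mains input.)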
Can somebody please explain why such an expensive route has been taken, rather than, as in the old days, just transforming and rectifying the incoming AC to 13 V DC?