I don't understand *why* clipping is supposed to happen when an audio amplifier's output power is mismatched to a speaker's power rating. Nothing I've read explains why - just that it does. Well, I need a better answer than "cuz the internet said so", ha ha.
The output of the power amp "sees" the load impedance, and the signal voltage across that impedance determines the current. If I turn up the volume, I get more voltage and more current. I don't understand why this would be any different whether the speaker's power rating is perfectly matched to the amplifier or ten times higher - what is the amplifier "seeing" that would make it clip with one but not the other?
How does the amplifier "know" that the load's power handling is rated ten times higher than anything the amplifier can deliver?
In my case, I have an 18 W RMS stereo driving a pair of 100 W RMS speakers (a gift from a family member who meant well). Why would clipping behave any differently in this setup than with an 18 W stereo driving a pair of perfectly matched 18 W speakers?
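To make my mental model concrete, here's a toy sketch in Python. The numbers are pure assumptions (an 8 Ω nominal load and ±12 V supply rails, which I made up for illustration): the amp can't swing its output past its rails, so if the volume knob asks for more voltage than the rails allow, the waveform peaks get flattened. Note that the speaker's power *rating* never appears anywhere in this model - only the impedance does.

```python
# Toy model of amplifier clipping. Assumed, made-up numbers:
RAIL_V = 12.0      # assumed supply rail (V); the real value depends on the amp
LOAD_OHMS = 8.0    # assumed nominal speaker impedance

def output_sample(requested_v: float) -> float:
    """The amp tries to reproduce requested_v, but it cannot swing
    past its supply rails, so the waveform tops get flattened (clipping)."""
    return max(-RAIL_V, min(RAIL_V, requested_v))

# Turning up the volume just scales the requested voltage.
peak_request = 15.0                    # volume high enough to demand 15 V peaks
peak_out = output_sample(peak_request) # capped at the 12 V rail -> clipped
current = peak_out / LOAD_OHMS         # current follows from impedance alone
print(peak_out, current)
```

If this sketch is right, then the rating on the speaker box changes nothing here, which is exactly what confuses me about the advice I keep reading.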