Does an amplifier's signal to noise ratio need to be better than that of its source?

Thread Starter

skip.ele

Joined Nov 27, 2011
53
I am building an amplifier for an electret microphone. The microphone has a signal-to-noise ratio of 60 dB. What is the minimum signal-to-noise ratio I should target for my amp based on the microphone I'm using? Does using an amplifier with a far better S/N than that of the source buy me anything? Should I shoot for a matching S/N and call it good enough?

thanks.

skip
 

donpetru

Joined Nov 14, 2008
185
For a 21st-century audio amplifier to be truly high-performance, it should have an SNR of at least 100 dB. To reach that signal-to-noise ratio you will have to pay special attention to the power supply; you can hardly get such an SNR from a switching supply. So, first, use a classic linear supply (transformer, bridge rectifier, and a bank of large filter capacitors). Then use a high-performance amplifier circuit.

Another point: pay attention to the other stages of the audio chain as well (including the microphone itself, whose performance is not especially good).

In what audio application do you want to use that microphone?
 

crutschow

Joined Mar 14, 2008
34,281
For lowest noise you want the amp's S/N to be better than that of your source, so that the amp adds minimal additional noise to the signal. If the amp's random noise equals the source's random noise, the uncorrelated noise powers add, the total noise voltage rises by √2, and the S/N is reduced by a factor of √2 (about 3 dB).
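A rough sketch of that arithmetic, assuming the two noise contributions are uncorrelated RMS voltages referred to the amp input (the 10 mV signal and microvolt-level noise figures below are made-up illustration values, not from any datasheet):

Code:
import math

def combined_snr_db(signal_rms, source_noise_rms, amp_noise_rms):
    # Uncorrelated noise: the powers (squared voltages) add.
    total_noise_rms = math.sqrt(source_noise_rms ** 2 + amp_noise_rms ** 2)
    return 20 * math.log10(signal_rms / total_noise_rms)

# 10 mV signal with 10 uV of source noise is a 60 dB source S/N.
print(combined_snr_db(10e-3, 10e-6, 0))      # ~60.0 dB: noiseless amp
print(combined_snr_db(10e-3, 10e-6, 10e-6))  # ~57.0 dB: amp noise equal to source noise costs ~3 dB
print(combined_snr_db(10e-3, 10e-6, 3e-6))   # ~59.6 dB: amp roughly 10 dB quieter, almost no penalty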
 

WBahn

Joined Mar 31, 2012
29,976
The total SNR will be dominated by whichever block in the processing chain has the lowest SNR. You want the blocks other than your worst one to be better (not the same) so that they have minimal adverse effect, but there is no real benefit to making them significantly better since the worst one is your limiting factor.
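To put rough numbers on that, here is a minimal sketch assuming each block's SNR is input-referred and its noise is uncorrelated with the other blocks', so the relative noise powers simply add (the 60/70/100 dB figures are only illustrative):

Code:
import math

def chain_snr_db(block_snrs_db):
    # Each block's SNR is treated as input-referred, so the relative
    # noise powers (1 / linear SNR) of all blocks simply add.
    total_relative_noise_power = sum(10 ** (-s / 10) for s in block_snrs_db)
    return -10 * math.log10(total_relative_noise_power)

print(chain_snr_db([60, 60]))   # ~57.0 dB: amp no better than the mic costs ~3 dB
print(chain_snr_db([60, 70]))   # ~59.6 dB: amp 10 dB better, negligible penalty
print(chain_snr_db([60, 100]))  # ~60.0 dB: a far better amp buys essentially nothing more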
 