I was making the point that while we may have 130 dB of full-scale range, we may only be able to hear 16 bits of that at any one time (or whatever you want, maybe 24 bits), but we certainly cannot hear 43 bits. I think you misunderstood the full-scale range. Just as with the human eye (which can see about 16 million distinct colors, roughly 8 bits per channel, even though its gamut is much wider than, say, sRGB, and we also have rod cells that pick up luminance), the human ear cannot resolve all 43 bits of full-scale range. Additionally, the ear's response is logarithmic, so a really quiet sound is simply overwhelmed by a louder one.

Yes Tom, you just demonstrated exactly the point I was trying to make: 14-bit full-scale audio viewed through a 1-bit window is only... 1 bit. The audio you're listening to doesn't automagically take advantage of all the bits; only the human ear does that in the real world with real dynamic audio, and that would take an equivalent bit depth of 43 bits.
We can hear across the whole band at the same time, but the strongest signal is the only one that's going to be noticed. In a delicate ensemble, only the effective dynamic range matters. Most people are deaf to such nuance because typical modern music is so heavily compressed that its dynamic range can't take advantage of the nuance of human hearing. You have to have eclectic tastes in music to notice.
Fidelity is all about the precision of the reproduction at EVERY level that is acoustically noticeable at a given moment. For the audio range of human perception, 24 bits is perfectly reasonable.
My el cheapo multimeter has approximately 12 bits of resolution (4,000 counts) and measures from 100 µV (1 count on the 400.0 mV range) to 1000 V (1,000 counts on the 1000 V range). So would you say it has 140 dB of range and therefore ~23 bits of resolution? Of course not.
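To see why adding the span across ranges to the per-reading resolution is bogus, here is a quick sketch of the arithmetic (the meter's figures are the ones from my example above; the helper names are just mine):

```python
import math

def bits_from_counts(counts):
    # Resolution in bits of a display with `counts` distinct steps.
    return math.log2(counts)

def db_from_ratio(ratio):
    # Amplitude ratio expressed in decibels.
    return 20 * math.log10(ratio)

def bits_from_db(db):
    # Equivalent bit depth: each bit is worth about 6.02 dB of amplitude.
    return db / (20 * math.log10(2))

# The meter from the example: 4,000 counts per range,
# smallest step 100 uV, largest reading 1000 V.
print(f"per-reading resolution: {bits_from_counts(4000):.1f} bits")  # ~12 bits

span_db = db_from_ratio(1000 / 100e-6)  # 1000 V vs 100 uV across all ranges
print(f"total span: {span_db:.0f} dB, or {bits_from_db(span_db):.1f} 'bits'")
```

The span across all ranges works out to 140 dB (~23 "bits"), yet at any instant the display only ever resolves ~12 bits. Same deal with the ear: a huge total range, but a much smaller window at any given moment.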
Now, I'm sure there are additional advantages to working in 24 bits over 16 bits. The audio survives editing better if changes are made to it, and it's less sensitive to accumulated error (e.g. noise, or error in the last few bits).
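That robustness to processing is easy to demonstrate. Below is a minimal sketch (my own toy example, not anyone's production code): quantize a sine to a given bit depth, attenuate it by 40 dB, re-quantize as an export would, then restore the gain. The quantization error gets amplified along with the signal, and 24 bits leaves far more headroom for it than 16:

```python
import math

def quantize(x, bits):
    # Round to the nearest step of a signed fixed-point grid with `bits` bits.
    scale = 2 ** (bits - 1)
    return round(x * scale) / scale

def rms(xs):
    return math.sqrt(sum(v * v for v in xs) / len(xs))

# A full-scale 1 kHz sine sampled at 48 kHz (illustrative values only).
signal = [math.sin(2 * math.pi * 1000 * n / 48000) for n in range(4800)]

def attenuate_then_restore(xs, bits, gain_db=-40):
    # Drop the level by 40 dB, quantize (as an intermediate export would),
    # restore the gain, and quantize again for final storage.
    g = 10 ** (gain_db / 20)
    return [quantize(quantize(x * g, bits) / g, bits) for x in xs]

for bits in (16, 24):
    err = rms([a - b for a, b in zip(signal, attenuate_then_restore(signal, bits))])
    print(f"{bits}-bit round-trip error: {20 * math.log10(err / rms(signal)):.1f} dB")
```

At 16 bits the round-trip error lands somewhere around -58 dB relative to the signal; at 24 bits it's roughly 48 dB lower still, which is the extra 8 bits of headroom doing its job.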