I understand that in principle, a half-wave rectifier-based meter responds to the average of the rectified waveform, which for a sine wave is about 45% of the RMS value. But that assumes ideal diodes. With a circuit using real diodes (1N34A), I get very poor linearity, which I assume is due to the diode forward voltage drop. Is that correct? Since the scales of commercial analog meters seem pretty linear, how do they avoid the results I'm getting?
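To make the question concrete, here is a quick numeric sketch (my own, assuming a crude constant-Vf diode model with Vf ≈ 0.3 V for germanium) showing both the ideal 45% average-to-RMS ratio and how a fixed diode drop makes the reading nonlinear at low signal levels:

```python
import math

def half_wave_avg(v_peak, v_f=0.0, n=100_000):
    """Average of a half-wave-rectified sine over one full period,
    with a constant diode forward drop v_f (0 for an ideal diode)."""
    total = 0.0
    for i in range(n):
        v = v_peak * math.sin(2 * math.pi * i / n)
        total += max(v - v_f, 0.0)  # diode conducts only while v > v_f
    return total / n

# Ideal diode: average / RMS = (Vp/pi) / (Vp/sqrt(2)) ~= 0.45
vp = 1.0
ratio_ideal = half_wave_avg(vp) / (vp / math.sqrt(2))
print(f"ideal ratio: {ratio_ideal:.3f}")  # ~0.450

# Real diode (Vf ~ 0.3 V, germanium): the ratio collapses at small signals,
# so the scale is no longer linear in the input RMS.
for vp in (0.5, 1.0, 2.0, 5.0):
    ratio = half_wave_avg(vp, v_f=0.3) / (vp / math.sqrt(2))
    print(f"Vp = {vp:.1f} V  ->  avg/RMS = {ratio:.3f}")
```

With the drop included, the avg/RMS ratio is far below 0.45 for small peaks and only approaches it as the signal grows, which matches the poor linearity I'm seeing.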