A question came up regarding transmitter power. Suppose one were to compare (by measuring with an AC voltmeter) the signal at the antenna outputs of two transmitters on the same frequency (say a simple sine wave at 150 MHz), one putting out 1 watt, the other 10 watts:
Since the frequency is the same, the only remaining variable is the RMS voltage. Is there a way to calculate what that would be for any given power output?
Will the voltage be 10 times greater for the higher-power transmitter, or are there other determining factors?
Will the voltage be the same for transmitters of equal power, but different frequencies?
Finally, does the 50-ohm transmission line impedance play any role in determining the measured voltage, if it were measured at the antenna input versus the transmitter output?
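For what it's worth, here is the back-of-envelope arithmetic I had in mind, assuming a purely resistive 50-ohm load and using P = Vrms^2 / R (the `vrms` helper name is just mine):

```python
import math

def vrms(power_w, impedance_ohm=50.0):
    """RMS voltage across a resistive load, from P = Vrms^2 / R."""
    return math.sqrt(power_w * impedance_ohm)

# Compare the two transmitters into a matched 50-ohm load
for p in (1, 10):
    print(f"{p:>2} W into 50 ohms -> {vrms(p):.2f} V RMS")
```

If that formula is the right one to apply here, the 10 W signal would be only sqrt(10) (about 3.16) times the voltage of the 1 W signal, not 10 times -- which is part of what prompted the question.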