Since the frequency is the same, the only remaining variable is the RMS voltage. Is there a way to calculate what that would be for any given power output?

Will the voltage be 10 times greater for the higher-power transmitter, or are there other determining factors?

Will the voltage be the same for transmitters of equal power but different frequencies?

Finally, does the 50 Ω transmission-line impedance play any role in determining the measured voltage, if it were measured at the antenna input versus the transmitter output?
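To make the question concrete, here is a sketch of the calculation I have in mind, assuming the simple relation P = V_rms² / Z for a matched, purely resistive load (I'm not sure this assumption is valid for a real antenna system, which is part of what I'm asking):

```python
import math

def vrms_for_power(power_w: float, impedance_ohm: float = 50.0) -> float:
    """RMS voltage across a resistive load dissipating power_w watts,
    from P = V_rms^2 / Z rearranged to V_rms = sqrt(P * Z)."""
    return math.sqrt(power_w * impedance_ohm)

# 100 W into 50 ohms -> about 70.7 V RMS
print(vrms_for_power(100))

# 1000 W into 50 ohms -> about 223.6 V RMS
# i.e. 10x the power gives sqrt(10) ~ 3.16x the voltage, if this model holds
print(vrms_for_power(1000))
```

Note that under this model the frequency does not appear at all, which is why I'm asking whether frequency enters through some other mechanism.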