How to understand HF transmitter power consumption

Thread Starter

PickyBiker

Joined Aug 18, 2015
98
My son bought a new mobile HF ham radio that has a maximum transmit power rating of 100 watts. We are trying to understand why the TX input power can be as high as 23 amps at 13.8VDC.

100w / 13.8v is only 7.25 amps. If you add an amp that it draws in RCV mode, you are still looking at about 8.25 amps.
My theory is that the radio's power rating is based on something like 100w audio input power to the final RF amplifier, not the total power output. That would mean the power consumed by the RF final stage would be added to the audio power.

Can someone tell us how exactly the TX power is calculated and why the max current draw is about 3x what would seem to be required?
 

sagor

Joined Mar 10, 2019
797
RF amplifiers are not 100% efficient. In fact, some are only around 70-75%, others as low as 50%. Worst case, for 100W out, you may be consuming 150W to close to 200W just for the final RF stages. That gives a current consumption of roughly 11A to 15A alone, let alone powering other circuits in the radio.
The recommended 23A power supply is to ensure there is enough capacity for "peak" power usage to prevent voltage sag. Also, some radios have an accessory jack on the back that can supply up to 1A of power for accessories. Thus, the recommended power supply current for the radio is on the higher side.
If you set the transmitter power down to a minimum level (say 5W), and measure the current draw, you will likely see 5A to 8A of current just to power the transmitter circuits without any major power out...
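To put some rough numbers on that, here is a minimal sketch in Python of the idea in this post: DC input current as a fixed overhead current plus the PA draw at a given efficiency. The efficiency and overhead figures are illustrative assumptions on my part, not any radio's actual spec.

```python
# Rough model of DC input current for a 100 W HF rig: fixed "idle"
# overhead current plus final-stage draw at a given efficiency.
# All numbers are illustrative assumptions, not a real radio's spec.

def tx_current(p_out_w, efficiency, v_supply=13.8, i_overhead_a=2.0):
    """Estimated DC input current at the given RF output power."""
    p_dc_pa = p_out_w / efficiency          # DC power into the final stage
    return p_dc_pa / v_supply + i_overhead_a

for eff in (0.75, 0.50, 0.35):
    print(f"eff {eff:.0%}: about {tx_current(100, eff):.1f} A")
```

At 50% PA efficiency plus a couple of amps of overhead you are already around 16A; push the effective efficiency lower (sagging supply voltage, cable losses, other circuits) and a 23A draw stops looking mysterious.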
 
Last edited:

Thread Starter

PickyBiker

Joined Aug 18, 2015
98
The 23 amps is not the recommended power supply, it is the actual current draw at 100 watts TX. Even if it is 200w, it is still only 15 amps. There are still some amps to account for.
 

sagor

Joined Mar 10, 2019
797
PickyBiker said:
The 23 amps is not the recommended power supply, it is the actual current draw at 100 watts TX. Even if it is 200w, it is still only 15 amps. There are still some amps to account for.
Re-read my last sentence in post #2. What is the idle current when transmitting with no power out? Add that to your numbers.

Also, at 23A, is the supply still delivering 13.8V at the radio? I doubt it: at that current you get voltage drops in the power cable, so the actual RF transistors see less voltage, perhaps even just 12V. Thus they have to draw more current to produce 100W output. The key is the voltage at the power plug at the radio, not what the power supply meter says. You can easily drop a volt or two across the power cable between the power supply and the radio. Add to that, does the power supply itself remain at 13.8V under load? If not, you compound the voltage drops, and the radio needs even more current.

Example:
Assuming a 14 AWG stranded power cable, 10 ft long between power supply and radio: resistance counts both ways, through the red (positive) wire and the black (ground/return) wire, so you get a total resistance of about 0.054 ohms. At 23A, that is a voltage drop of almost 1.25V by the time power reaches the radio.
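That back-of-envelope figure can be checked in a few lines of Python. The 2.7 ohms per 1000 ft for 14 AWG stranded copper is a typical handbook value and an assumption on my part:

```python
# Cable-drop check for the example above: 10 ft of 14 AWG stranded
# wire between supply and radio, counting both conductors.

OHMS_PER_1000FT_14AWG = 2.7    # stranded copper, approximate handbook value
length_ft = 10                 # one-way cable length
round_trip_ft = 2 * length_ft  # current flows out on red, back on black

r_total = OHMS_PER_1000FT_14AWG * round_trip_ft / 1000  # total loop resistance
v_drop = 23 * r_total                                   # V = I * R at 23 A
print(f"R = {r_total:.3f} ohm, drop at 23 A = {v_drop:.2f} V")
```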
 
Last edited:

Thread Starter

PickyBiker

Joined Aug 18, 2015
98
I have asked my son to make the suggested measurements under minimum TX power. He will do that when he completes the new installation in his pickup truck. At home he had a 40A PS. In the truck he is planning on 10 gauge wire for power. I'll let you know what he finds.
 

drjohsmith

Joined Dec 13, 2021
577
Good bit of loose specs there...

13.8V ±15 percent is saying it's meant to run from a lead-acid battery.
TYPICAL 28 Amps TX, 2 Amps RX.
Assuming that's at the -15%, that's about 12V.

"Typical" is saying it needs a big cable to your supply.
It could be 5 amps, it could be 40 amps. What does typical mean?

Look at it the other way:
at 28 amps and 13.8V + 15%, that's over 400 watts in.
If it's putting 100 watts into the antenna, then it's dissipating 300-plus watts of heat.
A nice room warmer.
Does it get hot?
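Running those numbers through a quick sanity check (assuming the quoted 28A typical TX draw and the supply at the +15% end, as above):

```python
# DC input power at the quoted "typical" TX current, and the heat
# left over after 100 W goes out to the antenna. Numbers from the
# spec figures quoted in this thread, taken at the +15% supply end.

i_tx = 28.0                 # typical TX current, amps
v_max = 13.8 * 1.15         # supply at +15 percent, about 15.9 V
p_in = i_tx * v_max         # DC input power, watts
p_heat = p_in - 100         # heat dissipated if 100 W reaches the antenna
print(f"P_in = {p_in:.0f} W, heat = {p_heat:.0f} W")
```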
 