Just tested my twin secondary transformer.
The secondaries are rated at 18v but test at 20.3v each.
I don't know why. I'm sure it's a simple answer, but not to me!
Why?
They are rated for 18v *at a certain current* - the full rated load. With no load (or a light load, like a meter), the output rises a couple of volts because of the transformer's regulation, so 20.3v unloaded is perfectly normal. The voltage will drop to the rated 18v when the secondary is delivering its rated current. To check it, divide the rated voltage by the rated current; that gives you the resistance in Ohms you need to place the rated load on the transformer.
For example: 18vac @ 100mA:
R = 18vac / 100mA = 18 / 0.1 = 180 Ohms.
Then you calculate the wattage requirement:
Power in Watts = EI (voltage x current)
P = 18*0.1
P = 1.8 Watts.
However, we use a resistor with a higher power rating for reliability; 160% of the dissipated power is about the minimum rating you would want to use.
1.8 * 160% = 2.88 Watts - so you would use the next higher standard rating, or 3 Watts.
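If it helps, here's the same load-resistor arithmetic as a small Python sketch. The function name and the 160% derating default are just my own choices for illustration; plug in your transformer's rated voltage and current.

```python
def rated_load(v_rated, i_rated, derating=1.6):
    """Return (resistance in Ohms, minimum resistor wattage) for
    loading a secondary at its rated current, with a 160% derating."""
    r = v_rated / i_rated   # Ohm's law: R = V / I
    p = v_rated * i_rated   # dissipated power: P = E * I
    return r, p * derating  # then pick the next standard wattage up

# The 18vac @ 100mA example from above:
r, p_min = rated_load(18.0, 0.100)
print(round(r, 1))      # 180.0 Ohms
print(round(p_min, 2))  # 2.88 Watts -> use a 3 W (or bigger) resistor
```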