Great question. I always thought that since W (watts) = E (volts) × I (amps), it was only some peculiarity of electricians that made them want to use volt-amps instead of watts. Unless someone can shed some light, I think there's no difference between the two measurements.
I believe the purpose of rating a transformer in watts is to tell the user the maximum load it can supply. If it is rated 500 W, then you can only connect circuits that draw up to 500 W; if your device needs 1000 W, the transformer won't be able to supply it and will be overloaded.
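A minimal sketch of that overload check in Python, assuming a hypothetical `is_overloaded` helper and made-up load figures:

```python
# Hypothetical illustration: does the combined load exceed the
# transformer's rated capacity? All names and numbers are made up.
def is_overloaded(rating_w: float, loads_w: list[float]) -> bool:
    """Return True if the summed loads exceed the rating."""
    return sum(loads_w) > rating_w

# A 500 W transformer carrying 300 W + 250 W of load is overloaded.
print(is_overloaded(500, [300, 250]))  # True
```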
The VA rating also has to cover the transformer's own losses and the load's power factor, since reactive current still flows even though it does no real work:

VA = power output / (efficiency × power factor)
VA = 500 W / (0.9 × 0.9)
VA ≈ 617
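The same arithmetic as a short Python sketch (the 0.9 efficiency and 0.9 power factor are the figures assumed above, not universal values):

```python
# Required apparent power (VA) for a given real output power,
# accounting for transformer efficiency and load power factor.
def required_va(output_w: float, efficiency: float, power_factor: float) -> float:
    return output_w / (efficiency * power_factor)

print(round(required_va(500, 0.9, 0.9)))  # 617
```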