Can anyone please tell me what the difference is between the following two sources:
a) 10vdc @ 1A
b) 10,000vdc @ 0.001A
If we compare the two sources, in the first example the voltage is 1000 times less and the current is 1000 times greater.
In the second example the voltage is 1000 times greater and the current is 1000 times less.
So if we use the formula W = V × A for the two example sources above:
10 V × 1 A = 10 watts
10,000 V × 0.001 A = 10 watts
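Just to double-check that arithmetic, here is a minimal Python sketch; the function name `power_watts` and the labels are only illustrative, they simply multiply volts by amps for the two sources in the question:

```python
def power_watts(volts: float, amps: float) -> float:
    """Electrical power P = V * I, in watts."""
    return volts * amps

# The two example sources from the question
sources = {
    "a) 10 Vdc @ 1 A": (10.0, 1.0),
    "b) 10,000 Vdc @ 0.001 A": (10_000.0, 0.001),
}

for label, (v, i) in sources.items():
    print(f"{label}: {power_watts(v, i)} W")

# Prints:
# a) 10 Vdc @ 1 A: 10.0 W
# b) 10,000 Vdc @ 0.001 A: 10.0 W
```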
Is the total power output the same for these two sources?
And if so, why is it that source b) would give you a nasty, very noticeable shock, whereas with source a) you'd probably not notice anything at all?