I think I have been reading too much lately, as I have lost my understanding of basic voltage vs current.
The question I need answered is:
If you were to use a step-up transformer, obviously the voltage goes up as the current goes down. But say the starting values were 6V at 1A: how low could you take the current before it stops working?
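The constraint I'm half-remembering is power conservation, I think: in an ideal (lossless) transformer the output power can't exceed the input power, so stepping the voltage up just divides the available current down. A minimal sketch of that arithmetic (the function name is just illustrative, and real transformers have losses on top of this):

```python
def secondary_current(v_primary, i_primary, v_secondary):
    """Ideal (lossless) transformer: available secondary current
    for a given secondary voltage, from power conservation P = V * I."""
    power = v_primary * i_primary      # 6 V * 1 A = 6 W in my example
    return power / v_secondary

# Stepping 6 V at 1 A up to 600 V leaves only 10 mA available:
print(secondary_current(6.0, 1.0, 600.0))
```

So the current never "stops working" at some cutoff; it just shrinks in proportion as the voltage climbs, and losses eat into even that.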
I know this is as basic as anything, but I have lost it completely and can't seem to remember what is going on. I know there is a limit to how low the current can go, because I know you can't get, say, 500 kV from the above values lol.
Just to give you a bit about me: I understand (or used to!) how electricity works, and I have built many simple and intermediate circuits.
So please help me before I decide that drinking Guinness has killed a few too many brain cells.