I have a problem grasping how current (amperes) works... is current drawn "as needed", or is there some fixed value being pushed through?
Suppose I have an array of 192 LEDs connected in parallel, each drawing 20 mA.
The number of "on" LEDs is unknown: it could be that only 1 is on, so the whole array draws just 20 mA, or all of them are on, needing 192 × 20 mA = 3.84 A.
I want to control the array with an Arduino, either turning everything off or allowing power through. Since an Arduino pin can only source about 40 mA, I would need a transistor capable of switching 4 A.
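Here is roughly what I have in mind on the Arduino side (the pin number and active-high logic are just my assumptions; the transistor would sit between the pin and the array's supply path, driven through a base resistor):

```cpp
// Sketch of what I mean: one Arduino pin drives the transistor's base
// through a resistor, switching the whole 192-LED array on or off at once.
// LED_ARRAY_PIN is an arbitrary choice on my part.
const int LED_ARRAY_PIN = 9;

void setup() {
  pinMode(LED_ARRAY_PIN, OUTPUT);
  digitalWrite(LED_ARRAY_PIN, LOW);   // start with the array switched off
}

void loop() {
  digitalWrite(LED_ARRAY_PIN, HIGH);  // transistor conducts -> array powered
  delay(1000);
  digitalWrite(LED_ARRAY_PIN, LOW);   // transistor off -> array unpowered
  delay(1000);
}
```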
My question is: what happens when only 1 LED is turned on while the transistor is capable of passing 4 A? Will the current be drawn "as needed", i.e. just 20 mA, or will things go boom?
Another question: if the above setup is OK, is an amplifying transistor what I need, one with a gain of ~200 (considering I put 20-40 mA into the base)? Or maybe a different type?
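My back-of-the-envelope reasoning for that gain figure (just my own rough math):

I_C ≈ gain × I_B = 200 × 20 mA = 4 A (or up to 8 A with 40 mA at the base)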
Thanks.