I understand that from Ohm's Law, current I = voltage (V) / resistance (R), i.e. the current (in amps) is given by the ratio of voltage to resistance. So how much current flows depends on the load.
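To make the "current depends on the load" idea concrete, here is a minimal sketch of Ohm's Law in Python. The resistance values are illustrative assumptions, not measurements from a real phone:

```python
# Ohm's Law: for a fixed supply voltage, the load's resistance
# determines how much current is drawn (I = V / R).

def current_amps(voltage_v: float, resistance_ohm: float) -> float:
    """Return the current I = V / R in amps."""
    return voltage_v / resistance_ohm

# A 5 V supply across different (hypothetical) load resistances:
for r_ohm in (10.0, 5.0, 2.5):
    print(f"{r_ohm:>4} ohm load draws {current_amps(5.0, r_ohm):.1f} A")
```

A lower-resistance load draws more current from the same 5 V supply, which is the sense in which the load, not the supply, sets the current.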
Now, this part about the current output depending on the load is what I don't quite understand. Take a common example: a smartphone. If the current drawn depends on how much the phone's battery needs, why do chargers have various amperage ratings? Some are rated 1 A, while others are 1.5 A or 2 A.
Why does the charger need to specify an amperage at all, when the phone will only draw the current it needs? Would anyone mind helping out my shallow knowledge of this?
And why not connect a 5 to 6 volt battery directly to the phone, without any charging circuit? Would the phone draw just the current it needs, neither overcharging nor undercharging?