# Amp & Milliamp Question

Discussion in 'General Electronics Chat' started by CodeWriter, Jan 15, 2014.

1. ### CodeWriter Thread Starter New Member

Jan 15, 2014
New to this forum, not sure if I've posted in the right place!

How do electronics work in the sense of converting/restricting amps? For example, if I have a 4 A power supply, but the device I want to use only requires 750 mA, will the power supply deliver only the correct current to the device?

My understanding is that, as long as your power supply is capable of supplying the Amp power, and the voltage is correct, the device will only draw the amount of A/mA that it needs, and it will not be damaged in the process. Is that correct? So I do not require anything other than a voltage reduction circuit.

2. ### #12 Expert

Nov 30, 2010
Right. Your car battery is not going to force 450 amps of current through the headlights, because the light bulbs have resistance.

3. ### studiot AAC Fanatic!

Nov 9, 2007
Your understanding is nearly correct.

The voltage not only needs to be correct, it needs to be of the correct type (i.e. AC or DC) and connected with the correct polarity if DC.

A small point, but saying "Amp power" is not correct. Amperage is a measure of current, not power.
Power is measured in Watts, kilowatts, etc.

Watts = Volts times Amps
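To make the point concrete, here is a minimal sketch of the original 750 mA question using Ohm's law and the power formula above. The supply rating and resistance values are illustrative, not from the thread:

```python
# A supply rated for 4 A does not push 4 A into a device; the device's
# resistance sets the draw via Ohm's law (I = V / R).

supply_voltage = 12.0      # volts (assumed correct voltage for the device)
device_resistance = 16.0   # ohms (a hypothetical 750 mA device at 12 V)

current = supply_voltage / device_resistance   # Ohm's law: I = V / R
power = supply_voltage * current               # Watts = Volts times Amps

print(f"current drawn: {current:.3f} A ({current * 1000:.0f} mA)")
print(f"power dissipated: {power:.2f} W")
```

The supply's 4 A rating only states what it *can* deliver; the device here draws 0.75 A regardless.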

4. ### russ_hensel Distinguished Member

Jan 11, 2009
820
47
There are two general types of ideal power supplies: constant current and constant voltage.

We usually think in terms of constant-voltage supplies (and add a series resistance to make the model more realistic).

Simple devices relate current and voltage via Ohm's law.

Once a device is connected to a supply, it draws a current/voltage, so the voltage/current of both the supply and the device are simultaneously "solved" (sometimes this can even be an oscillation).
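The "simultaneous solve" for the simplest case, a constant-voltage source with a series resistance feeding a resistive load, is just a voltage divider. A small sketch, with all values invented for illustration:

```python
# An ideal constant-voltage source V with series (internal) resistance r,
# feeding a resistive load R. Supply and load must carry the same current,
# so solving both at once gives I = V / (r + R), and the load sees the
# divider voltage I * R.

V = 12.0   # ideal source voltage, volts
r = 0.5    # series (internal) resistance, ohms
R = 15.5   # load resistance, ohms

I = V / (r + R)    # the one current that satisfies supply and load together
V_load = I * R     # voltage actually seen by the load

print(f"I = {I:.3f} A, V_load = {V_load:.3f} V")
```

Note the load sees slightly less than the full 12 V because some voltage drops across the series resistance; with a stiff supply (small r) this drop is negligible.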
