I have a Commercial Electric DMM (Model HDM350). I think I bought it at Home Depot a long time ago. I don't know a great deal about meters; in fact, the only thing I've ever used it for is to make sure I have 120V coming out of an AC outlet.
In the process of working on my garage door project, I came up with some unexpected readings. I don't have an app that draws circuit diagrams, so I'll just describe the circuit; it's simple:
12V non-regulated wallwart -> positive lead connects to 1K resistor -> other side of resistor connects to anode of 1st LED -> cathode of 1st LED connects to anode of 2nd LED -> cathode of 2nd LED connects to negative lead of the wallwart.
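Here's a rough ASCII sketch of that same wiring (topology only; the anode is on the left side of each diode symbol):

```
+12V o-----[ 1K ]-----|>|-----|>|-----o 0V
 (wallwart +)        LED1    LED2    (wallwart -)
```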
When I took the LEDs out of the equation and measured the voltage across the positive and negative leads of the wallwart (with the 1K resistor still soldered to the positive lead), I got 15.6V.
When I added the LEDs back in, the meter read 3.45V (on the 20V DC scale).
When I broke the circuit and inserted the meter between the 1K resistor and the anode of the 1st LED to measure the current, it read 0.06 (on the 20m scale).
Since the typical forward voltage of my LEDs is 2.25V (2.6V max), I expected to measure roughly this current:
(15.6V - 2.25V - 2.25V) / 1000Ω = 11.1 mA
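For reference, here's that arithmetic as a quick Python sanity check (a minimal sketch, assuming the supply stays at the unloaded 15.6V reading; an unregulated wallwart will sag under load, so the real current would be somewhat lower):

```python
# Expected LED current through the series string, per Ohm's law.
v_supply = 15.6   # V, measured with no load (unregulated, so it sags under load)
v_led = 2.25      # V, typical forward drop per LED (2.6 V max per the datasheet)
r = 1000          # ohms, the series 1K resistor

i = (v_supply - 2 * v_led) / r
print(f"{i * 1000:.1f} mA")  # -> 11.1 mA
```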
Is my meter really off, or am I reading the current wrong?
Also, where does the 3.45V come from?