Hey all,
I'm still just learning the basics of electronics, but I already wanted to build something, so I want to try some simple circuits in the hope I'll eventually be able to build something useful.
I started with a simple voltage divider circuit, but I did not get the results I was expecting, so I narrowed the circuit down to two resistors I had (both 325 Ω), connected in series across a 5 V source. Then I took my multimeter and touched one probe to the positive side of the source and the other probe to the negative side of the source.
What I don't understand is that the multimeter reads about 0.8 V (and 0.4 V when touching one probe to the point between the resistors, which is what you'd expect if the total drop across both resistors really is 0.8 V), but I thought that the voltage drop across the whole circuit (in this case consisting of only the 2 resistors) would be 5 V.
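For reference, here's a quick sketch of the numbers I was *expecting*, just applying Ohm's law and the divider formula to my actual parts (5 V in, two 325 Ω resistors) — this is only my back-of-the-envelope math, not what I actually measure:

```python
# Expected behaviour of the divider, per Ohm's law.
V_in = 5.0   # supply voltage (V)
R1 = 325.0   # top resistor (ohms)
R2 = 325.0   # bottom resistor (ohms)

I = V_in / (R1 + R2)            # series current through both resistors
V_mid = V_in * R2 / (R1 + R2)   # midpoint voltage, relative to ground

print(f"expected current:  {I * 1000:.2f} mA")  # ~7.69 mA
print(f"expected midpoint: {V_mid:.2f} V")      # 2.50 V
```

So by my math I should see the full 5 V across the pair and 2.5 V at the midpoint, which is nowhere near the 0.8 V / 0.4 V I actually measure.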
When I touch my probes to the voltage source while it is not connected to anything at all, the multimeter reads about 4.9 V (which I guess is either because the multimeter is not calibrated accurately enough, or because the supply is simply 100 mV lower than stated). So the multimeter seems to work correctly when measuring directly across the bare source, BUT not when measuring across the resistors in the circuit.
Can anybody point me in the right direction, or explain what's going on and why it is not what I expected to see? Or is it possible that something is wrong with either my multimeter or my power supply? In case it matters: I took a USB-B receptacle and soldered two wires to the +5V and ground pins, and every time I want to test a circuit I connect it to a USB port on my computer. Anyway, I suspect the cause here is my lack of knowledge in electronics rather than faulty hardware.
Thanks for reading this shitload of text. ^^