In an earlier thread I concluded that I had to revise the plan for helping my son with his 5th grade science project. The objective is to demonstrate how Ohm's law plays a role in circuit design by showing how a current-limiting resistance prevents damage to sensitive components.
Our original plan was to calculate resistor values to drive an LED at nominal current, extremely low current, and extremely high current. After a few days of experimenting and posting results on this forum, it became apparent that, due to the non-linear nature of LEDs, we needed to change course.
Instead, we are now going to try to burn out a low-power 1/8 W resistor, which will play the role of the "sensitive" load. The plan is to step through decreasing values of high-power current-limiting resistors connected in series with the low-power load resistor until it burns out.
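To get a feel for where that sweep might cross the load's power rating, the dissipation in the load can be worked out from the voltage divider formed by the two resistors. This is only a sketch with a hypothetical 100 Ω, 1/8 W load at a 9 V supply; the actual load value will be whatever low-power resistors we bought.

```python
# Power dissipated in a low-power load resistor behind a series
# current-limiting resistor, from the voltage divider:
#   I = V / (R_series + R_load),  P_load = I**2 * R_load
V = 9.0          # supply voltage (volts)
R_LOAD = 100.0   # hypothetical 1/8 W load resistor (ohms) -- an assumption
RATING = 0.125   # load power rating (watts)

for r_series in [1000, 470, 220, 100, 47, 22, 10]:
    i = V / (r_series + R_LOAD)
    p_load = i**2 * R_LOAD
    flag = "exceeds rating!" if p_load > RATING else "ok"
    print(f"R_series = {r_series:5d} ohm  I = {i*1000:6.1f} mA  "
          f"P_load = {p_load*1000:6.1f} mW  {flag}")
```

With those assumed numbers the load stays under 1/8 W down to about 155 Ω of series resistance, so a decade-style sweep like the one above should bracket the burn-out point.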
A few hours ago I went out and purchased some high-power resistors to act as current limiters and some low-power load resistors.
To start with the simple matters first, I measured the actual resistance of the 5.1 Ω resistor with DMM1. It read exactly 5.1 Ω.
I then measured the voltages of 5 of my 6 power supply presets by hooking the leads of DMM1 directly to the supply. The values were as follows (rounded to two decimal places):
3V preset - 3.00V
4.5V preset - 4.47V
6V preset - 5.94V
7.5V preset - 7.45V
9V preset - 8.98V
I then connected just the 5.1 Ω resistor to a breadboard and added two jumper wires for the power supply's grabber clips. I also connected the leads of DMM1 across the resistor's leads so I could measure its voltage drop at the different power supply presets.
The following are the measured voltage drops across the 5.1 Ω resistor (rounded to two decimal places) for those 5 presets. These were taken with no ammeter connected to the circuit.
2.87V
4.29V
5.69V
7.16V
8.62V
1) Considering nothing else was connected to this circuit, is this difference between the voltage drop and the supply voltage expected at such a low R value?
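One way I tried to sanity-check this myself: if the "missing" voltage is assumed to be dropped across the jumper wires, clip leads, and the supply's internal resistance, that stray series resistance can be estimated from each preset. A rough sketch using the numbers above:

```python
# Estimate the unwanted series resistance (leads + supply internal
# resistance) from the no-ammeter measurements. The current is the
# resistor's drop over its measured 5.1 ohms; the "missing" voltage
# divided by that current gives the extra resistance in series.
R = 5.1
presets = [3.00, 4.47, 5.94, 7.45, 8.98]   # measured supply voltages (V)
drops   = [2.87, 4.29, 5.69, 7.16, 8.62]   # drops across the 5.1 ohm resistor (V)

for v_src, v_r in zip(presets, drops):
    i = v_r / R                      # current through the resistor (A)
    r_extra = (v_src - v_r) / i      # resistance hiding in series (ohms)
    print(f"{v_src:4.2f} V preset: I = {i:4.2f} A, extra series R = {r_extra:4.2f} ohm")
```

All five presets come out around 0.21-0.23 Ω, which at least looks consistent with a couple hundred milliohms of wiring and supply resistance rather than anything odd about the resistor itself.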
Continuing on, I connected DMM2 as an ammeter between the resistor and the negative terminal of the power supply, repeated the voltage drop measurements with DMM1, and recorded the corresponding current measurements with DMM2. Note: I had to move the DMM2 lead to the 10 A jack (and set the dial to the 10 A range as well).
DMM1 = 2.805V DMM2 = 0.55 A
DMM1 = 4.18V DMM2 = 0.82 A
DMM1 = 5.56V DMM2 = 1.10 A
DMM1 = 7.0V DMM2 = 1.38 A
DMM1 = 8.43V DMM2 = 1.66 A
2) Is this change in voltage drop across the same resistor due to the ammeter?
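Running the same series-resistance estimate on the ammeter runs, assuming the additional drop comes from the ammeter's shunt (burden) resistance stacked on top of the lead resistance:

```python
# With DMM2 in series, repeat the series-resistance estimate.
# Total extra R = (preset voltage - resistor drop) / measured current.
# Subtracting the ~0.22 ohm found without the ammeter would isolate
# the ammeter's burden resistance, if that interpretation is right.
presets = [3.00, 4.47, 5.94, 7.45, 8.98]   # measured supply voltages (V)
drops   = [2.805, 4.18, 5.56, 7.0, 8.43]   # DMM1 readings (V)
amps    = [0.55, 0.82, 1.10, 1.38, 1.66]   # DMM2 readings (A)
R_LEADS = 0.22                             # estimate from the no-ammeter runs

for v_src, v_r, i in zip(presets, drops, amps):
    r_total = (v_src - v_r) / i
    print(f"{v_src:4.2f} V: total extra R = {r_total:4.2f} ohm, "
          f"ammeter burden = {r_total - R_LEADS:4.2f} ohm")
```

The totals land around 0.33-0.35 Ω, which would put the ammeter's share at roughly 0.11-0.14 Ω. If that reading of the data is right, the shunt in the 10 A range would account for the further change.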
I then replaced the 5.1 Ω 10 W resistor with a 100 Ω 10 W resistor, and the voltage drops across the resistor matched the measured power supply preset values perfectly.
3) Does this make sense? Why did I not get the same voltage drops with the 5.1Ω resistor?
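The voltage divider math seems to suggest why the 100 Ω case looks perfect: the same fraction of an ohm in series is negligible against 100 Ω but not against 5.1 Ω. A quick sketch, assuming the ~0.22 Ω of stray series resistance estimated from the earlier no-ammeter runs:

```python
# Fraction of the supply voltage that appears across the load resistor
# when R_EXTRA ohms of lead/supply resistance sit in series with it.
R_EXTRA = 0.22   # assumed stray series resistance (ohms)
V_SUPPLY = 8.98  # measured 9 V preset (volts)

for r_load in (5.1, 100.0):
    v_drop = V_SUPPLY * r_load / (r_load + R_EXTRA)
    print(f"{r_load:5.1f} ohm load: predicted drop = {v_drop:5.2f} V "
          f"of a {V_SUPPLY:.2f} V supply")
```

With those assumptions the 5.1 Ω prediction (about 8.61 V) is within a hundredth of a volt of the 8.62 V I measured, while the 100 Ω prediction loses only a few hundredths of a volt to the stray resistance, small enough to disappear into meter resolution and supply regulation.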