I would like to know if the wattage of an incandescent bulb can be lower than what is printed on the bulb?
> What I meant was at 120 volts with a 75 watt bulb rating. When testing it comes up at 72 watts.

The bulb wattage is usually stated at a specified supply voltage, i.e. for whatever resistance the filament has at that voltage. If you lower the supply voltage the wattage will usually be lower, but incandescent bulb resistance is not linear over a large range of voltages.
https://www.fluke.com/en-us/learn/blog/energy-efficiency/say-goodbye-to-the-incandescent-lightbulb
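To put a rough number on that, here is a quick Python sketch using a constant-resistance model. The constant resistance is the simplifying assumption here: as noted above, a real filament's resistance rises with temperature, so the actual power change with voltage is smaller than this model predicts.

```python
# Rough sketch: power vs. supply voltage for a bulb rated 75 W at 120 V,
# assuming constant filament resistance (an approximation: a real filament's
# resistance rises with temperature, so this overstates the change).
RATED_POWER_W = 75.0
RATED_VOLTAGE_V = 120.0

# Resistance implied by the rating: R = V^2 / P = 192 ohms
resistance_ohms = RATED_VOLTAGE_V ** 2 / RATED_POWER_W

def power_at(voltage_v: float) -> float:
    """Power dissipated at a given supply voltage, constant-R model."""
    return voltage_v ** 2 / resistance_ohms

print(power_at(120.0))  # 75.0 W at the rated voltage
print(power_at(114.0))  # ~67.7 W: a 5% voltage drop costs ~10% in power
```

The square-law relationship is why even a modest sag in line voltage shows up clearly in measured wattage.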
> What I meant was at 120 volts with a 75 watt bulb rating. When testing it comes up at 72 watts.

The 75 watts is approximate. How are you measuring the power? Most bulbs I have checked using a small Kill-A-Watt have generally been within +/- 1 watt of what is printed on the bulb when new; as a bulb's life diminishes, that number can change. Anyway, an incandescent lamp is hardly a calibrated load, so I do not see what you are seeing as unusual.

Ron

> The 75 watts is approximate. How are you measuring the power? ...

I am building a multimeter, and to calibrate it I had to do a current test. All I had was an older 75 watt bulb.
> What I meant was at 120 volts with a 75 watt bulb rating. When testing it comes up at 72 watts.

So you are expecting it to be within 4%.
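The 4% figure is just the relative difference between the printed rating and the reading; a one-liner makes it explicit:

```python
# Relative error between the bulb's printed rating and the measured power.
rated_w = 75.0
measured_w = 72.0
error = (rated_w - measured_w) / rated_w
print(f"{error:.1%}")  # 4.0%
```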
> Why are you using a bulb and house current to check your meter? There are much safer ways to accomplish that.

The instructions wanted me to test AC current; this is why I used house current. I checked the voltage first and it was 120.4 VAC. To calibrate the amperage I have to adjust a jumper to get an accurate ampere reading, and at this point the jumper tips are flush with the PC board, the highest I can get it. There are so many variables that I just want to make sure the meter is correct before I proceed to the next section of the assembly.
> I am building a multimeter, and to calibrate it I had to do a current test. All I had was an older 75 watt bulb.

That's a lousy way to calibrate a meter. There are lots of simple and pretty cheap circuits you can make to calibrate a multimeter to a level that's more than sufficient for typical hobbyist use.
> The instructions wanted me to test AC current. This is why I used house current. I checked the voltage first and it was 120.4 VAC.

And how did you check the voltage? Using this multimeter you built?
> So you are expecting it to be within 4%.

I have to be completely honest: it's been more than 30 years since I've played with electronics, so basically I am starting out fresh, and I've forgotten an awful lot of information. I know that there is peak-to-peak and so on, but I don't have an oscilloscope to check, and I forgot the math to figure that part out. All I know is that when I check with an already-made meter the voltage is 120.4 VAC, and the meter I am putting together reads that voltage as well. When I test the current it comes up as 0.60 amps on the meter I am making; the already-made meter doesn't test AC current, only DC, but it does measure AC voltage. The actual amount for a 75 W bulb should be 0.625 A, give or take. I know it doesn't sound like much of a difference, but I have OCD and I want to try and get it as spot on as I can.
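The 0.625 A figure is just Ohm's-law arithmetic on the rated numbers (I = P / V); a short sketch of the comparison being made here:

```python
# Expected current for a 75 W bulb at nominal 120 V, versus the 0.60 A reading.
rated_power_w = 75.0
nominal_voltage_v = 120.0

expected_current_a = rated_power_w / nominal_voltage_v
print(expected_current_a)  # 0.625 A

measured_current_a = 0.60
shortfall = (expected_current_a - measured_current_a) / expected_current_a
print(f"{shortfall:.1%}")  # 4.0% low -- the same 4% seen in the wattage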
Is your 120 V actually 120 Vrms? How do you know?
What voltage would be needed to come in at 72 W (assuming the filament resistance is the same at both power levels, which it isn't, but it probably isn't too different)? I get 117.6 V. That's only 2% below 120 V.
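The 117.6 V figure can be checked directly: under the same constant-resistance assumption, R = V²/P = 192 Ω from the rating, and the voltage for 72 W is V = √(P·R).

```python
# Check: what supply voltage gives 72 W through the 192-ohm resistance
# implied by the 75 W @ 120 V rating (constant-resistance assumption)?
import math

rated_power_w = 75.0
rated_voltage_v = 120.0
resistance_ohms = rated_voltage_v ** 2 / rated_power_w  # 192 ohms

v_for_72w = math.sqrt(72.0 * resistance_ohms)
print(round(v_for_72w, 1))  # 117.6 V

deviation = (rated_voltage_v - v_for_72w) / rated_voltage_v
print(f"{deviation:.1%}")  # ~2.0% below nominal
```

So a 2% sag in line voltage alone could account for the entire 3 W shortfall, before any meter error enters the picture.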
> I have to be completely honest, it's been more than 30 years since I've played with electronics ... I have OCD and I want to try and get it as spot on as I can.

Then you need to either get over the OCD enough to accept that you're not going to get it spot on (the best you would do is create the temporary illusion that you have), or you are going to need to step your game (and your equipment) up to an entirely different level.
> Then you need to either get over the OCD enough to accept that you're not going to get it spot on ...

Do you think the 0.60 amps is good enough then? And thank you all for the input, it really helps.
The meter you are using probably has a spec that is something like 1% of reading plus so many least-significant digits. Its accuracy is also going to vary from day to day, particularly with things like temperature. The same will be true of the meter you are building.
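A "percent of reading plus counts" spec can be turned into an uncertainty band like this. The specific figures below (1% of reading, 2 counts, 1 mA resolution) are illustrative only, not taken from any particular meter's datasheet:

```python
# Illustrative DMM uncertainty: spec of "1% of reading + 2 counts".
# The numbers here are typical handheld-meter values, NOT from a real datasheet.
def reading_uncertainty(reading, pct_of_reading=1.0, counts=2, resolution=0.001):
    """Uncertainty = (percent of reading) + (counts * least-significant digit)."""
    return reading * pct_of_reading / 100.0 + counts * resolution

u = reading_uncertainty(0.625)  # amps, with 1 mA resolution
print(f"0.625 A +/- {u:.3f} A")  # about +/- 0.008 A
```

At that level of spec, a 0.60 A reading against an expected 0.625 A is only just outside the meter's own error band, even before the bulb's tolerance and line-voltage variation are counted.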