Finding the minimum number of rated watts in a series resistor

Discussion in 'Homework Help' started by trap_lord, Nov 11, 2015.

  1. trap_lord

    Thread Starter New Member

    Nov 11, 2015
    Hello. I have a homework question and I'm not quite sure how to solve it. It asks for the minimum number of rated watts a series resistor must have if it has 48 volts dropped across it and a resistance of 5 ohms. I assumed you're supposed to use Ohm's law, and I got 460.8 watts as the minimum number of rated watts. Is this correct?
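
    For anyone who wants to check the arithmetic, here's a quick sketch in Python (the 48 V and 5 Ω values are from the question):

    ```python
    # Power dissipated by a resistor, from the voltage across it and its
    # resistance: P = V^2 / R, which is the same as P = I * V with
    # I = V / R by Ohm's law.

    def resistor_power(volts, ohms):
        """Return the power in watts dissipated by the resistor."""
        return volts ** 2 / ohms

    current = 48 / 5                  # Ohm's law: I = V / R -> 9.6 A
    power = resistor_power(48, 5)     # 48^2 / 5 -> 460.8 W
    print(current)  # 9.6
    print(power)    # 460.8
    ```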
  2. StayatHomeElectronics

    AAC Fanatic!

    Sep 25, 2008
    You have correctly used P = I * V to find the actual wattage of the part. I always think of rated watts as the rating on a part you could actually buy: some wattage rating above what the part will actually dissipate.

    Now, I am much more familiar with 1, 1/2, 1/4, and 1/8 W parts and do not know what is available at the levels that you have calculated.

    That is at least the first thing that comes to mind when I read your question.
  3. trap_lord

    Thread Starter New Member

    Nov 11, 2015
    The question does not state what parts are available. I guess it's just asking, in theory, what a resistor this size could handle.
    Thanks for your help.
  4. WBahn


    Mar 31, 2012
    I think your guess is reasonable and probably correct. The course you are taking may or may not talk about actual available ratings. Hopefully they do at least emphasize that you can use any resistor with a rating at least this high, and hopefully they also mention that, in practice, you want to build in some overhead. In other words, if you needed a 4.97 W resistor, don't get a 5 W resistor, but rather a 7.5 W or a 10 W. In general, unless there's a damn good reason (and damn good reasons DO exist), don't push the performance specs.
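
    To put that selection rule into code, here's a small sketch. The list of standard ratings and the 2x safety margin are illustrative assumptions, not values from any particular catalog or the course:

    ```python
    # Pick the smallest "standard" rating that leaves some overhead above the
    # calculated dissipation. The ratings list and the default 2x margin are
    # made-up examples for illustration only.

    STANDARD_RATINGS_W = [0.125, 0.25, 0.5, 1, 2, 5, 7.5, 10, 25, 50]

    def pick_rating(actual_watts, margin=2.0):
        """Return the smallest standard rating >= actual_watts * margin."""
        needed = actual_watts * margin
        for rating in STANDARD_RATINGS_W:
            if rating >= needed:
                return rating
        raise ValueError("no standard rating in the list is large enough")

    print(pick_rating(4.97))  # needs >= 9.94 W, so the 10 W part is chosen
    ```

    With a margin of 1.0 the same function would pick the bare-minimum 5 W part, which is exactly what WBahn is warning against.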
  5. trap_lord

    Thread Starter New Member

    Nov 11, 2015
    Well, for this specific question it does not state the resistor parts. The following questions deal with 1/2 W and 1/4 W resistors, but those ask whether the resistor would burn up when x volts are dropped across it and y amps are running through it. It doesn't look like we deal with any resistors bigger than those, though.
    Thanks for your input.
  6. MrAl

    AAC Fanatic!

    Jun 17, 2014

    When a question like this comes up you often have to refer to your lecture notes or what you learned in class or what examples are shown in your book.

    For this example you have to find the 'minimum' rated watts for a resistor of known resistance and known voltage across it.

    The way you have to interpret the word, "minimum", will be based on what you learned from the course up to this point.

    If you *never* learned that the resistor must be over-rated so it doesn't run too hot (they can get VERY hot and still not be destroyed when operated near rated power), then you probably go with the calculated value. But if you missed that lecture, then you have to try to get someone else's lecture notes. If your book shows an example, it might show an over-rating factor of, say, 2, so if you calculated 10 watts you use 20 watts. But if you can't find anything like this, then you probably go with the calculated value as the minimum.

    I've measured some power resistor temperatures while energized, and even before you get up to the manufacturer's rated value the resistor can get as hot as 270 degrees C. That's pretty hot, and it would burn your finger if you touched it. But in the absence of an example somewhere that tells you about this kind of thing, you probably go with the raw calculation. For example, 10 volts across 10 ohms would require a resistor power rating of 10 watts as a minimum.
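
    The two interpretations above (raw calculation versus an over-rating factor from the course) can be written as one function; the factor of 2 is just the example value mentioned earlier, not a universal rule:

    ```python
    # Minimum rated watts for a resistor with a known voltage across it.
    # With overrating=1.0 this is the raw calculated dissipation; a larger
    # factor builds in the kind of headroom a course or textbook might show.

    def min_rated_watts(volts, ohms, overrating=1.0):
        """Return V^2 / R scaled by an optional over-rating factor."""
        return (volts ** 2 / ohms) * overrating

    print(min_rated_watts(10, 10))                # 10.0 (raw calculation)
    print(min_rated_watts(10, 10, overrating=2))  # 20.0 (with a 2x factor)
    ```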