Resistors seem a little counterintuitive to me, even though I understand the basic math of how they work.
For example:
Say you have 10 VDC and you want to supply 100 mA to the circuit: 10 V / 0.100 A = 100 Ω, pretty straightforward. Heat dissipation becomes 10 V * 0.100 A = 1 W, so you'd need a 1 W resistor or you'd risk burning it up.
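Here's the arithmetic I'm doing, as a quick Python sketch (just assuming an ideal resistor, plain Ohm's law, and the full 10 V dropped across it):

    # Ohm's law / power arithmetic for the example above.
    # Assumes an ideal resistor with the full supply voltage across it.

    voltage = 10.0    # supply voltage in volts
    current = 0.100   # desired current in amps

    resistance = voltage / current   # R = V / I  -> 100 ohms
    power = voltage * current        # P = V * I  -> 1.0 watt

    print(f"Resistance needed: {resistance:.0f} ohms")
    print(f"Power dissipated:  {power:.2f} W (so use at least a 1 W resistor)")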
But I don't see how the resistor is "resisting" the voltage by turning the excess into heat. If you put 100 mA through a wire, or anything conductive, isn't it going to create just as much heat?
So let's suppose 100 mA is flowing through the circuit. If you stick a 1 W component rated for 10 V into the equation, should the resistor then produce no heat, because it's no longer having to deal with 1 W of energy? I mean, stick a 10 kΩ resistor in and next to no heat is produced, and there's no way that 10 kΩ resistor is turning 1 W into heat; it would be cold as ice and supplying only a small amount of energy...
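For comparison, here's the same arithmetic redone with the 10 kΩ resistor across the same 10 V supply (again just an ideal-resistor sketch; the point is that the current, and therefore the power, drops way down):

    # Same Ohm's law arithmetic, comparing the 100 ohm and 10 kohm cases
    # across the same 10 V supply (ideal resistors assumed).

    voltage = 10.0

    for resistance in (100.0, 10_000.0):
        current = voltage / resistance   # I = V / R
        power = voltage * current        # P = V * I (equivalently V^2 / R)
        print(f"{resistance:>8.0f} ohms -> {current*1000:.1f} mA, {power*1000:.0f} mW")

    # 100 ohms    -> 100.0 mA, 1000 mW (the 1 W case above)
    # 10000 ohms  ->   1.0 mA,   10 mW (why the 10k resistor barely gets warm)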
(Do resistors only turn the excess into heat when nothing else is there to eat up the energy?)