heated gloves

Thread Starter

diogenes

Joined Feb 17, 2009
10
Hi, I'm looking to make some heated gloves. Basically I plan on running current through a wire of some resistive material that I've attached to the inside of a glove/mitten. What I need advice on first is whether there is a special material I should use for the wire, or whether I can just use regular copper wire of the right diameter. (I think I read somewhere that some commercially available gloves just use copper wire.) The gloves don't have to be portable at all.

The other question is whether there are any relatively simple equations I could use to ballpark this design ahead of time.

If I use Ohm's law, R = V/I,
and R = resistivity of the wire material × (length / cross-sectional area),
then what can I use to relate R to the temperature of the wire?

Also, would either AC or DC power offer any advantages over the other?
 

mik3

Joined Feb 4, 2008
4,843
It would be better to use a material with higher resistivity than copper, so that less current is needed to heat it up.
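For scale, here's a quick Python sketch of the R = resistivity × length / area formula, comparing copper to a high-resistivity alloy like nichrome. The resistivity figures are approximate textbook values, and 26 AWG is taken as roughly 0.404 mm diameter:

```python
import math

# R = rho * L / A for a round wire. Resistivities are approximate
# textbook values: nichrome ~1.1e-6 ohm*m, copper ~1.7e-8 ohm*m.
def wire_resistance(resistivity_ohm_m, length_m, diameter_m):
    area = math.pi * (diameter_m / 2) ** 2
    return resistivity_ohm_m * length_m / area

# 2 m of 26 AWG (~0.404 mm diameter) wire:
print(wire_resistance(1.1e-6, 2.0, 0.000404))  # nichrome: ~17 ohm
print(wire_resistance(1.7e-8, 2.0, 0.000404))  # copper:   ~0.27 ohm
```

At a fraction of an ohm, the copper run would need several amps at well under a volt to dissipate a few watts, which is awkward to supply; the higher-resistance wire matches ordinary battery voltages much better.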

Also, I suggest powering the thing from a rechargeable battery: if you use AC mains through a step-down transformer and a fault occurs, what happens then?

Think about it.
 

thingmaker3

Joined May 16, 2005
5,083
The gloves don't have to be portable at all.
I suppose you could glue them to a baseboard heater...

The place to start is how much power the gloves will need to dissipate as heat. That requires knowing the insulation value of the gloves and the temperature of the environment they will operate in.

Once you know the resistive power to be dissipated, it will be quite simple to derive the rest.
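As a rough sketch of that starting estimate, here is a simple steady-state conduction model in Python. The glove area and SI R-value below are placeholder guesses for illustration, not measured values:

```python
# Rough steady-state estimate: heater power equals conduction loss,
# P = (T_inside - T_ambient) * area / R_insulation,
# where the R-value is SI thermal resistance in K*m^2/W.
# The area and R-value below are made-up placeholders.
def heater_power(t_inside_c, t_ambient_c, area_m2, r_value_si):
    return (t_inside_c - t_ambient_c) * area_m2 / r_value_si

# e.g. hold the glove interior at 30 C in -10 C air, ~0.05 m^2
# of glove surface, SI R-value 0.5 for the insulation:
print(heater_power(30, -10, 0.05, 0.5))  # -> 4.0 (watts)
```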
 

Thread Starter

diogenes

Joined Feb 17, 2009
10
Thanks for the replies. Thingmaker3, gluing the gloves to the heater would simplify things I guess lol. Or I could just hold them in front of a space heater.

Kmoffett, thanks for the link. As you can see, though, there are pretty much two price ranges: the $20 ones, which have horrible reviews and don't even have heating elements in the fingers, and the $150-and-up variety, which is more than I want to spend. Plus, if I bought them I wouldn't have all the fun of making my own and getting electrocuted.

To simplify my question, let's just say I wanted to heat a 6-foot copper wire (surrounded only by room-temperature air) so that it is very warm, but not burning red hot like a toaster element. What equation could I use to determine the power needed to do this for a particular gauge of wire?
 

Thread Starter

diogenes

Joined Feb 17, 2009
10
Great link, thanks! Looks like there's some good info on the rest of that website as well.

I ended up getting a few different sizes of nichrome-60 wire on ebay (22, 24, 26 gauge) in about 50 foot lengths. That will allow me to experiment a little to see what works. I also need to see how brittle and flexible the wire is. Thanks again for your help.
 

Thread Starter

diogenes

Joined Feb 17, 2009
10
I am trying to ballpark the voltage necessary for 110 degrees in 2 meters of 26 AWG nichrome wire. I'm a novice at electronics but want to learn, so I'm wondering if I'm doing this right:

I used the slope of the line from the temp chart to get a figure of around 0.4 amps for 110 degrees.

Then I used Ohm's law with the resistivity equation to get
V = IρL/A, where ρ is resistivity, L is length, and A is cross-sectional area,
so V = (0.4)(1×10^-6)(2)/(π × (0.0004/2)^2)
and I end up with about V = 6.4 volts.

So if I take five 1.5 V batteries and wire them in series with a variable resistor and the 2 meters of nichrome wire, would that be a good starting point for experimenting with this?
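For anyone checking, the arithmetic above works out. A quick Python run with the same numbers (0.4 A, ρ = 1×10^-6 Ω·m, 2 m of wire, 26 AWG taken as 0.4 mm diameter):

```python
import math

# Same figures as in the post: 0.4 A, rho = 1e-6 ohm*m,
# 2 m of wire, 26 AWG taken as 0.4 mm diameter.
I, rho, L, d = 0.4, 1e-6, 2.0, 0.0004
A = math.pi * (d / 2) ** 2          # cross-sectional area, m^2
R = rho * L / A                     # wire resistance, ohms
V = I * R                           # voltage needed to push 0.4 A
print(f"R = {R:.1f} ohm, V = {V:.1f} V")  # -> R = 15.9 ohm, V = 6.4 V
```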
 

thingmaker3

Joined May 16, 2005
5,083
If we're talking about a table-top experiment, then yes. Once you are certain of controlling the heat so as not to burn yourself or start a fire, you can explore how much warmer the wire will get inside thermal insulation.
 