Question About Making Heat with Electricity

Thread Starter

Benjamin0904

Joined Oct 3, 2020
32
Hello everyone!
I'm attempting to create heat using electricity. I have very little input power, which is likely the problem.
I'm starting with no more than 3.7 V DC and a 100 mAh battery, and I need a final temperature of at least 1500 °F.
I agree that this sounds like a mighty task, but I feel like there is some way.
I found something online called an arc lighter. It uses a plasma arc to light things on fire and requires minimal voltage, but I'm not sure about the current.
Here is the link with specs for the Arc lighter at Walmart:
Walmart Arc Lighter

Most Arc lighter brands claim that, "With a 220mAh battery capacity, you can get about 60-100 lights per charge."

So I'm almost positive that the battery they use is a 3.7 V 220 mAh LiPo battery.
The Arc lighter above definitely has the temperature needed with full power input, but I'm wondering about this:
How much current does the plasma load draw from the rechargeable battery? That's what I'm trying to figure out. If I could find out how many milliamps the plasma draws, then I could possibly step up my power supply.

I figured that if an average light lasts 5 seconds, multiplying that by 80 lights gives 400 seconds, or about six and a half minutes of battery life, though that doesn't sound right.

Thanks-Please let me know what you think,
Benjamin
 

Thread Starter

Benjamin0904

Joined Oct 3, 2020
32
I understand that numerous projects on the internet use a 3.7 V 220 mAh battery, but that doesn't mean the plasma load draws the battery's full capability at once. I am trying to figure out what current the plasma will draw from the battery: basically, the minimum power needed to create the plasma.
Thanks.
 

Papabravo

Joined Feb 24, 2006
21,159
There are a couple of problems.
mAh (milliamp-hours) is a measure of battery capacity. It is useful for calculating how long a battery will provide a given amount of current, measured in amperes, to a load. To use a battery to heat something up, you need to know the load's resistance. Nichrome (a nickel-chromium alloy) wire is often used for this application. The temperature rise in a load, like a resistor or a length of Nichrome wire, is determined by how much power is delivered to that load. Power is measured in watts, which can be computed as the square of the current times the resistance.

Check out this table, which gives the amount of current required to achieve a specific temperature. I don't think your battery can supply enough current to get you to your desired temperature. Even if it could supply that much current for a very short period (microseconds), it would discharge too fast to have much observable effect.

https://en.wikipedia.org/wiki/Nichrome
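To put rough numbers on the power relation above, here is a quick sketch. The current and resistance values are made-up illustrations, not figures from the Nichrome table:

```python
# Sketch of the power relation P = I^2 * R for a resistive heater.
# The current and resistance values are illustrative assumptions only.

def heater_power_w(current_a: float, resistance_ohm: float) -> float:
    """Power dissipated in a resistive load, in watts."""
    return current_a ** 2 * resistance_ohm

# A small battery limited to ~0.1 A into a 1-ohm length of Nichrome wire:
print(f"{heater_power_w(0.1, 1.0):.3f} W")   # 0.010 W -- a mere 10 mW

# The same wire at 5 A (what a glowing heater element might carry):
print(f"{heater_power_w(5.0, 1.0):.1f} W")   # 25.0 W
```

Note the squared term: raising the current by a factor of 50 raises the power by a factor of 2500, which is why a current-limited battery is such a hard constraint here.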
 

WBahn

Joined Mar 31, 2012
29,978
Hello everyone!
I'm attempting to create heat using electricity. I have very little input power, which is likely the problem.
I'm starting with no more than 3.7 V DC and a 100 mAh battery, and I need a final temperature of at least 1500 °F.
I agree that this sounds like a mighty task, but I feel like there is some way.
I found something online called an arc lighter. It uses a plasma arc to light things on fire and requires minimal voltage, but I'm not sure about the current.
Here is the link with specs for the Arc lighter at Walmart:
Walmart Arc Lighter

Most Arc lighter brands claim that, "With a 220mAh battery capacity, you can get about 60-100 lights per charge."

So I'm almost positive that the battery they use is a 3.7 V 220 mAh LiPo battery.
The Arc lighter above definitely has the temperature needed with full power input, but I'm wondering about this:
How much current does the plasma load draw from the rechargeable battery? That's what I'm trying to figure out. If I could find out how many milliamps the plasma draws, then I could possibly step up my power supply.

I figured that if an average light lasts 5 seconds, multiplying that by 80 lights gives 400 seconds, or about six and a half minutes of battery life, though that doesn't sound right.

Thanks-Please let me know what you think,
Benjamin
You seem to be confusing current (measured in mA, for instance) and capacity (measured in mAh, for instance). These are two completely different things.

Your needs are also extremely ill-defined. The final temperature of WHAT needs to be at least 1500 °F? How fast? For how long? In what environment? It's one thing to bring a very small region to a temperature of a couple thousand degrees Fahrenheit using very little energy with a plasma lighter in order to ignite something else; it's a very different matter to use that plasma to directly heat much of anything to anywhere close to that temperature.

So please explain what the actual problem is that you are trying to solve -- perhaps there's a much better way to tackle it.
 

nsaspook

Joined Aug 27, 2009
13,086
You can generate pinpoint high temperatures with plasma, but what you need to consider is the substantial power density required for anything more than firing up a cigar.
A several thousand watt industrial plasma source can melt large solid plates of Molybdenum like butter.



Happened inside a 10^-6 Torr vacuum chamber.
 

jpanhalt

Joined Jan 18, 2008
11,087
Hello everyone!
I'm attempting to create heat using electricity. I have very little input power, which is likely the problem.
I'm starting with no more than 3.7 V DC and a 100 mAh battery, and I need a final temperature of at least 1500 °F.
I agree that this sounds like a mighty task, but I feel like there is some way.
I found something online called an arc lighter. It uses a plasma arc to light things on fire and requires minimal voltage, but I'm not sure about the current.
Here is the link with specs for the Arc lighter at Walmart:
Walmart Arc Lighter

Most Arc lighter brands claim that, "With a 220mAh battery capacity, you can get about 60-100 lights per charge."

So I'm almost positive that the battery they use is a 3.7 V 220 mAh LiPo battery.
The Arc lighter above definitely has the temperature needed with full power input, but I'm wondering about this:
How much current does the plasma load draw from the rechargeable battery? That's what I'm trying to figure out. If I could find out how many milliamps the plasma draws, then I could possibly step up my power supply.

I figured that if an average light lasts 5 seconds, multiplying that by 80 lights gives 400 seconds, or about six and a half minutes of battery life, though that doesn't sound right.

Thanks-Please let me know what you think,
Benjamin
Have you considered a miniature, low-voltage light bulb like this: https://sciencekitstore.com/subminiature-light-bulb-e5-5-2-5v-200-ma/

The filament temperature at full brightness can be on the order of 4000 K. Of course, lowering the voltage, adding a series resistor, or using a bulb rated for a higher voltage (e.g., 12 V to 28 V) will give a lower temperature (and much less light, too). "Grain of wheat" bulbs operate at 12 V to 28 V.

Assuming you have 100 mA (not mAh) available at 3.7 V, your power is 0.37 W. To get a high temperature from a small amount of power, you need to reduce the rate at which the heat produced is conducted away. A small mass helps. Putting it in a vacuum will also help.
 

Thread Starter

Benjamin0904

Joined Oct 3, 2020
32
You seem to be confusing current (measured in mA, for instance) and capacity (measured in mAh, for instance). These are two completely different things.

Your needs are also extremely ill-defined. The final temperature of WHAT needs to be at least 1500 °F? How fast? For how long? In what environment? It's one thing to bring a very small region to a temperature of a couple thousand degrees Fahrenheit using very little energy with a plasma lighter in order to ignite something else; it's a very different matter to use that plasma to directly heat much of anything to anywhere close to that temperature.

So please explain what the actual problem is that you are trying to solve -- perhaps there's a much better way to tackle it.
Sorry about the bad explanation. I am trying to heat an 8 cc block of steel to 1500 °F. It can take as long as it takes to warm up. I want the plasma on and the steel heated as long as there is power input, so when I remove the power input, the plasma turns off and the metal cools down.
Thanks.
 

jpanhalt

Joined Jan 18, 2008
11,087
A 2 cm cube is a fairly good-sized chunk. How are you insulating it from heat loss? How much heat (power) are you able to add to it? My guess from your original post was about 0.4 W, which will likely not get it to the temperature you want in free air.

For example, consider the small TO-220 transistor package. In free air, its thermal resistance is about 70 °C/W (Wikipedia). Thus, if it is dissipating 0.4 W, it will only rise to a temperature of about 28 °C above ambient. I suspect there are similar tables available for heat dissipation by steel.

What we need to know is how much heat/power you will use for heating, and how you will protect against heat loss.
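The back-of-the-envelope estimate above can be sketched in a few lines. The 70 °C/W figure is the free-air TO-220 value quoted from Wikipedia; treating the steel block the same way is only a rough analogy:

```python
# Steady-state temperature rise of a body in free air, modeled with a
# lumped thermal resistance: delta_T = P * R_theta.

def temp_rise_c(power_w: float, theta_c_per_w: float) -> float:
    """Temperature rise above ambient, in degrees C."""
    return power_w * theta_c_per_w

# 0.4 W into an object with roughly 70 C/W to free air:
print(f"{temp_rise_c(0.4, 70.0):.0f} C above ambient")  # 28 C
```

A 28 °C rise is nowhere near the ~780 °C rise the thread starter wants, which is why insulation (or a vacuum) matters so much at this power level.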
 

djsfantasi

Joined Apr 11, 2010
9,156
Sorry about the bad explanation. I am trying to heat an 8 cc block of steel to 1500 °F. It can take as long as it takes to warm up. I want the plasma on and the steel heated as long as there is power input, so when I remove the power input, the plasma turns off and the metal cools down.
Thanks.
Heat an 8 cc steel block to 1500 °F with a 3.7 V 100 mAh battery? Do a reality check...
 

Alec_t

Joined Sep 17, 2013
14,280
Let's run some numbers.
A 3.7V 100mAh battery has a theoretical energy capacity of 0.37 Wh = 0.37 x 3600 J = 1332 J.
The mass of 8 cc of steel is about 64 g (0.064 kg).
The specific heat of steel is about 420 J/(kg·°C).
1500°F ≅ 800°C.
To heat that steel block by 800°C thus requires 0.064 kg x 420 J/(kg·°C) x 800°C = 21504 J.

As you can see, the battery holds nowhere near enough energy to heat the steel to the required temperature, even assuming no heat loss to the environment.
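The comparison above can be checked with a few lines of arithmetic. The figures follow the thread's approximations (420 J/(kg·°C) specific heat, 8 g/cc density, 800 °C rise); real alloys will vary:

```python
# Battery energy vs. energy needed to heat the steel block,
# using the approximate figures from this thread.

battery_j = 3.7 * 0.100 * 3600    # 3.7 V x 100 mAh, converted to joules
mass_kg = 8 * 8.0 / 1000          # 8 cc x ~8 g/cc density
needed_j = mass_kg * 420 * 800    # E = m * c * delta_T

print(f"battery: {battery_j:.0f} J, needed: {needed_j:.0f} J")
print(f"shortfall factor: {needed_j / battery_j:.1f}")
```

Even before accounting for any heat loss, the battery holds roughly one-sixteenth of the energy required.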
 

Chris65536

Joined Nov 11, 2019
270
Sorry about the bad explanation. I am trying to heat an 8 cc block of steel to 1500 °F. It can take as long as it takes to warm up. I want the plasma on and the steel heated as long as there is power input, so when I remove the power input, the plasma turns off and the metal cools down.
0.4 W of plasma won't do any better than 0.4 W of resistance heating. Even with no heat losses, I calculate your ~63 g steel cube would take over 15 hours to reach 1500 °F.
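A rough sketch of that time estimate. The inputs here are round assumptions (500 J/(kg·K) specific heat, a 794 K rise for 70 °F to 1500 °F, 0.4 W of heating, zero losses), so the exact hours will differ slightly from the post above:

```python
# Adiabatic time-to-temperature estimate: t = m * c * delta_T / P.
# All inputs are round assumed values, and heat losses are ignored.

mass_kg = 0.063        # ~63 g steel cube
c_j_per_kg_k = 500     # approximate specific heat of steel
delta_k = 794          # ~70 F -> 1500 F expressed as a kelvin rise
power_w = 0.4          # assumed heating power

t_s = mass_kg * c_j_per_kg_k * delta_k / power_w
print(f"{t_s / 3600:.1f} hours")  # 17.4 hours, with zero losses
```

In reality, losses to the air would dominate long before then, so the block would level off far below the target temperature rather than just taking longer.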
 

jpanhalt

Joined Jan 18, 2008
11,087
I believe those calculations are for adiabatic heating. They ignore heat losses, which the TS hasn't addressed, and the heat of fusion (additional heat for the phase transition from solid to liquid or vice versa; the direction of change can have an effect). I couldn't find a value for steel, as it varies with the alloy, but a value of about 242 kJ/kg for iron is in the literature.
 

WBahn

Joined Mar 31, 2012
29,978
Sorry about the bad explanation. I am trying to heat an 8 cc block of steel to 1500 °F. It can take as long as it takes to warm up. I want the plasma on and the steel heated as long as there is power input, so when I remove the power input, the plasma turns off and the metal cools down.
Thanks.
This is what I suspected -- you are not even in the realm of the physically possible. Let's do the math assuming ideal conditions (and you won't be anywhere close to them).

Depending on the kind of steel you are talking about, the specific heat will be somewhere in the range of 500 J/(kg·K), meaning it takes 500 J of energy to raise the temperature of one kilogram of steel by one kelvin (the same temperature rise as one degree Celsius). Since the melting point of most steels is roughly 2500 °F, there's no concern about having to supply the heat of fusion for the solid-to-liquid phase transformation.

The density of steel, which again varies with the specific kind you are talking about, is roughly 8 g/cc, so 8 cc would be 64 g.

You are wanting to raise the temperature from (assuming) room temperature of around 70 °F to 1500 °F, a change of 1430 °F which is a change of 794 K.

So the energy you need -- assuming no losses of any kind -- would be

E_needed = (0.064 kg)(500 J/(kg·K))(794 K) ≈ 25.4 kJ

Now let's consider how much energy is in that battery.

E_battery = (3.7 V)(100 mAh) = 370 mWh = 1.33 kJ

As you can see, even if every joule available went into heating the block of steel, you'd still be short of the energy needed by about a factor of 20.

So give up this idea and start looking for something else, because this dog ain't gonna hunt.
 

WBahn

Joined Mar 31, 2012
29,978
I believe those calculations are for adiabatic heating. They ignore heat losses, which the TS hasn't addressed, and the heat of fusion (additional heat for the phase transition from solid to liquid or vice versa; the direction of change can have an effect). I couldn't find a value for steel, as it varies with the alloy, but a value of about 242 kJ/kg for iron is in the literature.
What does the heat of fusion have to do with anything? The TS is trying to heat the steel to 1500 °F, which is WAY below the melting point of any steel (which tend to be in the 2500 °F range).
 