Controlling Microwave Oven Power

Thread Starter

MrAl

Joined Jun 17, 2014
11,472
Hello there,

I FINALLY got around to doing this experiment. I had been planning to try it for years and even talked about it now and then on sites like this one.

For those who know the Panasonic brand microwave ovens, some of them have what they call "Inverter Technology", which allows the oven power to be controlled in a linear, non-pulsed way. The inverter inside the oven lowers the power delivered to the magnetron when you select a lower power level. Unlike more conventional ovens, it does not PULSE the oven ON and OFF to achieve the lower power settings; it simply lowers the power reaching the magnetron, and that is what changes the power level.

In the past, however, I had more conventional microwave ovens that pulsed the power. One thing I noticed was that when the line voltage went low, the oven would take longer to cook something like a piece of chicken. Needless to say, I thought that varying the input voltage to a regular microwave oven might let me change the power levels without having the microwave pulse on and off.

Well, as it turns out, I got to try it yesterday. Turning the line voltage down with a variac lowers the power level of the oven and thus cooks the food more evenly. It's a very sensitive adjustment though, because you reach a point where changing the voltage just a little makes the input power go down quite a bit, so the adjustment really has to be made while monitoring the current or power into the oven. It worked pretty nicely though.

I don't have any measurements yet except for input power, but I can say that at one setting, an oven that normally draws 1100 watts input power drew just 600 watts, and cooked more slowly, at an input voltage of around 90 volts AC when the normal line voltage is around 120 volts AC.
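As a rough sanity check on those numbers, here is a hypothetical sketch that treats the oven as if input power simply scaled with the square of the line voltage. A real magnetron supply (transformer plus voltage doubler) is quite nonlinear, so this is only a first-order model, not how the circuit actually behaves:

```python
# Naive square-law check: if the oven behaved like a simple resistive
# load, input power would scale with (V / V_nominal)^2. A real
# magnetron supply is nonlinear, so this is only a rough sketch.

def scaled_power(p_nominal_w, v_nominal, v_actual):
    """Predicted input power under a pure V^2 scaling assumption."""
    return p_nominal_w * (v_actual / v_nominal) ** 2

predicted = scaled_power(1100.0, 120.0, 90.0)
print(f"Square-law prediction at 90 VAC: {predicted:.0f} W")  # ~619 W
```

Interestingly, the square-law estimate (about 619 W) lands near the measured 600 W, though the sensitive adjustment region described above suggests the real curve is much steeper in places.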

The reason for doing this is twofold.
First, the line voltage drops significantly around here for loads as high as 10 amps, so turning the current down to, say, 5 amps would be much better.
Second, and also important, the food cooks more evenly and comes out more tender. I believe that is because the food never gets hit with the full power of the oven every time it turns on, which tends to dry out the outer surface of the meat and sometimes the inside too.

So if you are curious about how well this works, give it a try. You'll need a decent variac, however: one that can handle the full input current of your microwave oven.
 

MSFTF

Joined Aug 11, 2017
33
I found your experiment interesting because I tried something similar with various other voltage-dropping methods. I saw a similar nonlinear relationship between input voltage and output power. Much of the reason is that the heater filament would be better off with a more constant voltage while only the high voltage is lowered, instead of both dropping together. But anything like that would need to be done very, very carefully.

What I ended up doing was going back to power pulsing, like you mentioned at first. The oven now plugs into something that can pulse the power until the mechanical timer times out after about six hours. The on-time and off-time are separately adjustable. Cooking is greatly improved; it can now do something more like slow cooking and simmer food more efficiently.
 

Thread Starter

MrAl

Joined Jun 17, 2014
11,472
Hi,

That sounds interesting too. I guess your oven did not have a power control setting of any kind then? Or did you need finer control, which I also found necessary for some things.

Yeah, the voltage adjustment is very picky, so you really need a variac. Ultimately the best control would come from a voltage-regulated AC source, such as an inverter, so you can set the exact voltage you want and then leave it. As my power line voltage varies, I get a little variation in the power setting; the variac offers no regulation, just voltage lowering. It's also necessary to monitor the input power or current.

I just tried another experiment a little while ago by turning the total input power down to 400 watts. Now, this oven normally draws 1100 watts, so you would think that only allowing 400 watts into the oven plug would not cook anything. But I just cooked a hamburger that way in about 5 minutes or so. Not bad, really.
 

MSFTF

Joined Aug 11, 2017
33
Hi, my oven had defrost, low, medium, and high. But operation on a factory setting didn't cook the way I needed, so I first went to manual control of the magnetron with a manually operated switch. Another switch turned on the general power to the oven. That setup was also prompted by the timer contacts wearing out and tacking themselves closed.

After adding the manual control, it took quite a bit of effort because I had to attend closely to the cooking process. That was also about the time I began insulating the microwave containers to retain heat and cut the power needed to cook by a large degree. The insulation greatly evened out the cooking process, conserved water in the food and container, and made food overheating more avoidable.

The second oven is the one I used the power pulser on. That was helpful because it can cycle on its own. Right now I have it adjusted for 5 seconds on and 37 seconds off, over a hundred times in sequence until the timer needs to be turned back up to 30 minutes. It also has a bypass switch to bring the food up to temperature, so the power pulser can take over from there with that switch turned back off.
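For reference, the average power implied by those pulser settings follows from the duty cycle. The full output power used here is only an assumed placeholder, and filament warm-up at the start of each on-period is ignored, so a real oven would average somewhat less:

```python
# Average power of a slow on/off pulser: duty cycle times full power.
# 600 W is a hypothetical full output power; ignoring filament
# warm-up makes this an upper bound on the real average.

def pulsed_average(p_full_w, t_on_s, t_off_s):
    """Return (duty cycle, average power) for an on/off pulser."""
    duty = t_on_s / (t_on_s + t_off_s)
    return duty, duty * p_full_w

duty, p_avg = pulsed_average(600.0, 5.0, 37.0)
print(f"duty = {duty:.1%}, average power = {p_avg:.0f} W")
```

With 5 s on and 37 s off the duty cycle is only about 12 percent, which fits the slow-cooking behaviour described above.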

I can see how sensitive reducing magnetron voltage can be, because I was starting with a 600 watt microwave on that earlier project. By the time I got the power down to just under about 400 watts, the efficiency began to drop sharply and the magnetron began to overheat. Getting an 1100 watt oven still working at 400 watts seems pretty impressive too.
 

recklessrog

Joined May 23, 2013
985
Hi, remember it is not wise to under-run the heater voltage in the magnetron. You would get more linear results if you keep the heater supply steady and just vary the HT. You could do this by using another old microwave transformer to power the heater and only varying the voltage to the one providing the HT. Be careful though: lots of volts at high amps are available and could cook you quicker than a chicken! Make sure you insulate any unused connections very well. DO NOT use a transformer that cannot handle the high voltage for the heater; if the insulation breaks down you will have fireworks. Please take care.
 

Thread Starter

MrAl

Joined Jun 17, 2014
11,472
Hi again,

Oh OK, well I noticed too that the available control on microwaves is very poor. When I want to simmer, I have trouble finding the right setting too.
I bought a Panasonic MW oven a while back thinking I was finally getting the real thing, because it had true variable power rather than pulsed power, meaning it varies the actual power to the magnetron. BUT to my surprise, they still only give you 10 power settings, which is ridiculous. That left me back at square one really, except for the more even cooking it can do with constant low power instead of pulsed high power emulating low power.

Yes, it's amazing that it works down to 400 watts, and today I tested it at 300 watts input power and it still works, though it cooks very, very slowly. I defrosted a pot pie earlier today :)
The only problem I see now is that because I am using a variac, there is still no regulation of the AC voltage, and here it varies somewhat even over short periods like 5 minutes. This means the power could vary by itself by as much as 50 watts. I am hoping that does not cause too much trouble, because to regulate it I would have to buy or build an AC inverter/converter, and I don't really want to have to do that.

But at least you know now that you can use a variac to lower your MW oven power to basically any level you want to set it at: 20 percent, 10 percent, 11 percent, 12 percent of full power, etc.
I got the variac originally for testing things. Cost is around 100 dollars USD for that one.
 

Thread Starter

MrAl

Joined Jun 17, 2014
11,472
Hi there,

Thanks for joining this thread as i am hoping to hear from others about this too.

I would not attempt to modify that oven in such an extreme way, but I happen to know that lowering the input voltage works pretty well, and I found out almost by pure chance of circumstance.
While using my ovens in the past, I noticed that the cooking power went down as the line voltage naturally dropped in the summer months. Around here our line can regularly drop to only 90 volts AC, so all my ovens in the past lost power if it went down too low, or just started cooking very slowly. So I noticed that way that lowering the input voltage lowers the cooking power, and there were no secondary effects such as early failure of the oven, at least for the regular type of oven with the pulsed (not continuous) low-power modes. I did not test the Panasonic model, because that has continuously variable power settings where it does not pulse the magnetron; it just lowers the power.
So after I got a variac some time back, I decided one day I would try the lower-voltage trick to see if it would work on a regular oven. It worked, and because the previous 2 or 3 models I used ran on low input voltage for many years, I figure it won't hurt this one too much to run it that way.

Now as far as a linear control goes, that would be nice, and I appreciate the separate heater supply idea, but I don't want to modify it that much. I would rather put up with the nonlinear control, and maybe one day I will build a voltage regulator that will keep the voltage fairly constant so I can keep a constant power setting for cooking.
I am still in the early stages of testing though, so I don't know how much the nonlinear power setting and the somewhat varying power with line changes will affect my cooking over, say, a few months. After a couple of weeks I'll have a better idea, so there is a chance I won't need anything better than this.

What I do know now is that it works, so I have an oven with continuously variable power level settings.
What's more, I don't have to press a power button 7 times just to get to power level 3, as I had to do with the Panasonic oven, and that was a real pain in the azz when I used it a lot.
 

Thread Starter

MrAl

Joined Jun 17, 2014
11,472
Spot on - it'll strip the emissive coating off the cathode. This even happens if you defrost a lot on an old iron-cored transformer type. The heater has thermal inertia; the voltage-doubling rectifier doesn't.
Hi,

I would think that a hotter element would degrade the coating faster than a cooler element, but perhaps you can offer an explanation of the process by which the coating degrades over time with normal current and with lower current.
You might also give some information on the timeline of this process, such as years versus decades, as a function of operating point.

On hearing this, my first thought is that it is a slow process depending on the operating current level, and the true analysis would come from a comparison, not an absolute measurement. The reason is that, in my experience as mentioned in this thread, within a certain range of operation the degradation has to be minimal, given the long-term use I have gotten in the past without significant early-failure problems. My guess is that the typical operating voltage I am using is still high enough to allow almost normal operation anyway. The ovens have to be able to work at voltages lower than a perfect nominal line voltage like 120 VAC, and 100 VAC is only about 17 percent lower, while 90 VAC is 25 percent lower. We also have to weigh the normal wear and tear against the lower-voltage wear and tear, and wonder if perhaps it might actually extend the life of a regular oven.
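Those percentages are simple to check:

```python
# How far below a 120 VAC nominal line a given voltage falls,
# as a percentage of nominal.

def percent_below(v_nominal, v_actual):
    return 100.0 * (v_nominal - v_actual) / v_nominal

for v in (100.0, 90.0):
    print(f"{v:.0f} VAC is {percent_below(120.0, v):.1f}% below 120 VAC")
```

This gives about 16.7 percent for 100 VAC and exactly 25 percent for 90 VAC.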

So if you can offer some explanation of the process that makes the coating degrade, that would help us understand this phenomenon a lot better.
 

recklessrog

Joined May 23, 2013
985
https://en.wikipedia.org/wiki/Hot_cathode
 

recklessrog

Joined May 23, 2013
985
www.chiark.greenend.org.uk/scopes/weyer.txt
 

recklessrog

Joined May 23, 2013
985
https://www.sweetwater.com/insync/cathode-stripping/
 

ian field

Joined Oct 27, 2012
6,536
It's beaten-track stuff for anyone who serviced CRT displays. The space charge around the cathode (or filament, if directly heated) depends on how hot it is. In rectifiers and such, the anode current sucked away an inadequate space charge and then set about the emissive coating. In lower-current applications, the coating could become "poisoned" by residual gases in what should have been a vacuum. Some genius at Philips launched a memo announcing that with stable SMPSUs, CRT heaters only needed 3.15 V; recovering poisoned cathodes kept me in steady work for a few years. There were plenty of CRT rejuvenators on the market, but you had to be careful not to strip the cathodes. My methods were a bit more subtle, and there was a root cause I could fix on most examples.
 

recklessrog

Joined May 23, 2013
985
Now, without going into enormous detail: in the heyday of cathode ray tubes in televisions, there were two main types of failure. One was where the tube had been used for so long that literally all the useful cathode coating had been used up, and there was no way short of re-gunning the tube to restore operation. The second type of failure is cathode poisoning, caused by build-up and damage from positive ions striking the cathode.
The PDF below describes the process.
This "layer of contamination" could often be broken down by applying a large voltage to the grid while at the same time over-running the heater to blast off the layer, revealing fresh cathode material underneath. When you under-run the heater and it is not hot enough to freely boil off electrons, the ions that are always present have an easier path to the cathode surface, causing contamination and lower output from the cathode. So the only case that accurately describes "stripping" is the first one, when there is no emissive material left.
Photomultiplier tubes should not be exposed to light, because the cathodes (hundreds of tiny coated tubes) will be stripped by photons knocking electrons off the coating.
 

Attachments

Thread Starter

MrAl

Joined Jun 17, 2014
11,472
Hello again Ian and Reckless,

I think what we may be seeing here is a difference in technology between the microwave oven magnetron and the regular cathode ray tube, and possibly in the normal operating points, such as the difference between 3 V and 5 V for the heater voltage.

There are several aspects of this problem.

First, turning down the heater voltage alone could be much different from turning down the entire operating power level of the magnetron. Turning down the heater voltage while trying to maintain the same total power means the heater does not get as hot BUT still has to operate the same way it did before. This could be more detrimental than turning down both the heater voltage and the total power.

Second, the designs of the tubes are a bit different, CRT versus magnetron, from what I have read. The magnetron uses a very different cathode coating than the CRT, a different chemical that resists "poisoning" better.

Third, the timeline is not yet clear. We don't know (yet) whether we are talking about an operating time of one month, one year, or ten years. Based on my experience, and that of at least one other reader I have now talked to, we have to consider a span of at least five years, maybe more. The supposedly detrimental effect of running at low voltages may take years to show up in a significant way, so other factors may start to weigh in, such as simply wanting a better oven after, say, five years. If the effects are not that significant after five years, then I would say many people won't care if the oven degrades a little faster.

I say these things based on what I have read on the web, in the links provided as well as through my own searches, and most of all on more than 20 years of experience with a few different brands of microwave ovens, plus another reader who reports long life at voltages even lower than I apply myself.

I still think it is a good idea to look into this possible failure mode though.
 

recklessrog

Joined May 23, 2013
985
If you are happy to take the risk of shortening the life of the magnetron, why not just use a triac-switched supply to the primary? It's a lot cheaper than a variac, but do use suitable filtering to prevent putting noise back onto the mains supply.

I cannot give precise information because, although old, the equipment is still in use by the military. One radar "magnetron" (actually a derivative, but similar enough) had its heater supplied from a constant-voltage regulating transformer. A variation of only 0.1 volt would cause serious damage in only a few hours of operation. Microwave cookers in the UK have to work from mains that can range from 200 volts to 250 volts. I've had the same oven for 10 years, 5 of which were when I lived on a fairly remote farm where the best we got was about 220 volts, and it was often only 205 volts. Where I am now has a substation 200 yards away, and my true-RMS meter shows 241 volts right at this moment. I can't say I notice any difference in the way the oven operates.
A good test would be to put a litre of water in a Pyrex jug, mixed from hot and cold water so it starts at 25 deg C. Run the oven for 1 min, measure the temperature again, and note the rise. Do the same again at a lower voltage and see what the temperature difference is, or how much longer it takes to reach the same temperature as the first result. That would be interesting to know. Do take care though.
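That water test translates directly into an output-power figure via the specific heat of water. A small sketch, assuming a lossless jug (it isn't quite, so this slightly underestimates true output power):

```python
# Estimate microwave output (cooking) power from a water-heating run:
# P = m * c * dT / t, with c ~ 4186 J/(kg*K) for water.
# Heat lost to the jug and the air is ignored, so the estimate is low.

C_WATER = 4186.0  # specific heat of water, J/(kg*K)

def calorimetry_power(mass_kg, delta_t_c, seconds):
    """Average output power needed to raise the water temperature."""
    return mass_kg * C_WATER * delta_t_c / seconds

# e.g. 1 litre (~1 kg) of water rising 10 C in 60 s:
print(f"{calorimetry_power(1.0, 10.0, 60.0):.0f} W")
```

The 10 C rise here is just an illustrative figure; plug in whatever rise the test actually produces at each line voltage.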
 

ian field

Joined Oct 27, 2012
6,536
Under-running the heater means less space charge; when the anode current takes all the space charge, it makes a start on the coating.

Under-running both LT and HT would possibly end in cathode poisoning.

My best guess is to use the same technique as defrost, but make the off periods short enough to minimise heater cooling.

A relay probably wouldn't last long; neither would a triac, unless you use a big mutha.
 

strantor

Joined Oct 3, 2010
6,798
It's a very sensitive adjustment though because you reach a point where just changing the voltage a little makes the input power go down quite a bit, so the adjustment really has to be made while monitoring the current or power into the oven.
In addition to what has already been mentioned about the nonlinearity, my experience indicates that MOTs (microwave oven transformers) saturate early. I have made a few welders in the past and found that when operating them at 2/3 voltage (80 V on a 120 V transformer), the no-load current draw was significantly lower. So the sharp drop in input power that you see might not translate to a sharp drop in output power; it might just translate to a much more efficient operating mode.
 
KISS

The other issue is that the filaments are initially cold, so you have to compensate for that. I get the feeling that they cool off quickly.

So 50% power for 10 s and 50% power for 20 minutes do not deliver the same effective power. Some service manuals have some of that info in them.

==

Off topic (a little)
Microwave probes have fallen out of favor, but I do like them. I have a microwave that needs some help getting operational once I can find the time, assuming I can get it working. This is a big micro/convection oven.

The jack needs some modifying. The plastic piece inside the jack has to be changed to a new material, I want to lower the conduction, and it would then accept standard 1/4-inch plugs. I basically have the parts to make those modifications. I do have an available probe that matches the R-T curve of the original thermistor, but the "phone jack" is weird: it has non-standard spacing.

The probe starts at about 115 F and goes to about 250 F or so, from memory. That scale starts too high. Suppose you want to warm milk for a baby or proof yeast; you can't. So I want to propose a modification where a setpoint of 115 is really 115 minus 80, or whatever it turns out to be. This would be activated by plugging in the sensor and hitting a button on the back. Remove the sensor and you're back to normal.

One part of the problem, then, might be a thermistor-type changer. So, you get the idea.
 

Thread Starter

MrAl

Joined Jun 17, 2014
11,472
Hello again,

I'll reply to the previous posts all in one reply post here...


recklessrog:
When you say use a triac, do you mean as in a lamp dimmer, where we change the duty cycle within each cycle, or do you mean pulsing it on and off like a regular microwave oven does?
One goal was to eliminate the long pulsing (like on for 20 seconds, off for 20 seconds), and the other issue is that I would be afraid to use a triac on each cycle (sub-cycle duty cycle) because it has to power a transformer and other electronics. Turning a transformer off mid-cycle would mean some back-EMF transients, which I don't like to see.
The difference you have seen in the past (a 0.1 V variation causing damage in only a few hours) must be somehow different, because like I said, I've used other microwave ovens at lower-than-usual voltage for years and they all lasted at least five years. This could be because of improvements in the coating material.
I have tested the output power at 400 watts input power for an oven that normally runs at 1100 watts input power. The result was about 200 watts output (cooking) power. This represents a drop in efficiency from about 63 percent at full input power to about 50 percent at lower input power. Amazingly, even 200 watts cooks OK, just slower.
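Those efficiency figures work out as follows; the 63 percent full-power figure implies roughly 690 watts of cooking power from 1100 watts in:

```python
# Efficiency = output (cooking) power / input power, using the
# measurements quoted in this thread.

def efficiency(p_out_w, p_in_w):
    return p_out_w / p_in_w

print(f"Reduced power: {efficiency(200.0, 400.0):.0%}")     # 50%
print(f"Implied full-power output: {0.63 * 1100.0:.0f} W")  # 693 W
```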

ianfield:
At present I cannot explain why the oven does not fail sooner, then. Like I said, the only thing I can think of is improvements in the coating material and a longer time scale, meaning it is somehow a slower effect than what you have seen in the past with other devices. Perhaps the difference is five years versus six years, for example, or even five years versus ten years. It could also be that the power derates slowly enough that the oven takes longer and longer to cook the same thing without us noticing it much over the years.
Also, the coating used in microwave ovens is supposedly less subject to cathode poisoning, from what I have now read.
Oh, you mean pulse the oven at a faster rate than they do? Like 1/2 second on, 1/2 second off for 50 percent? That's a very interesting idea. I was using a variac because, for one thing, I already had one, and also to eliminate transients.
I've used triacs to drive transformers before during experiments, but noticed that the transients can be large. It's been a long time since I did that though, back in the 1980s.

strantor:
Yes, I measured the output (cooking) power and came up with an estimate of 200 watts at 400 watts input power. That represents 50 percent efficiency, down from 63 percent at full input power. So the efficiency does drop at lower input power. That is typical for power devices, but I wasn't sure either, so I measured it. Tolerance on the measurement was about plus or minus 5 percent.

KISS:
Not sure what you mean by 50 percent for 10 seconds not being the same as 50 percent for 20 minutes. The ovens I have used in the past all do the same thing regardless of what time you set. You can measure the seconds on and seconds off pretty easily.
Yeah, I haven't seen a MW probe since maybe the 1980s. I guess you would have to make your own and use it to change the input power somehow. It would be interesting to be able to regulate the temperature of the food.
I also find that MW ovens really lack the features we have come to expect in electronic devices. It's like they don't improve over time; they just keep doing the same stupid stuff. Case in point: when I bought my Panasonic oven, which has an inverter inside for continuous power adjustment, I assumed you could adjust from 1 percent to 100 percent, but that's not the case. They STILL only give you TEN power levels! That's just plain dumb. The so-called safety features also make it harder to use overall.
For a custom temperature sensor, you'd have to figure out what could take the microwaves without arcing and whatnot.
 