Researching the history/compatibility of 120v DC/AC utility power

Thread Starter

DMahalko

Joined Oct 5, 2008
189
Yes, I made an error, a boo boo, a mistake on terminology. I admit it. I shall now step away and resign myself to washing cars for a living. :rolleyes:

"Linear DC is regulated DC, but not all regulated DC are linear DC. Bridge-rectified filtered DC is not regulated DC or linear DC."



I still think this is an interesting topic for exploration / discussion.

If a switched mode "AC only" power supply has rectification to DC, filter caps, then its own AC frequency generator before it hits the isolation transformer... then it will work on DC. DC of any polarity will "rectify" to DC and away we go.

SMPS are becoming popular everywhere now for energy efficiency and smaller, lighter designs, and almost everything with an SMPS can probably handle DC input at the same voltage as the AC RMS value without any difficulty.
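A toy numeric sketch of why a bridge front end doesn't care what it is fed, in Python. The diodes are idealized with an assumed 0.7 V forward drop (an illustrative figure, not from any datasheet):

```python
VF = 0.7  # assumed diode forward drop in volts (illustrative, not from a datasheet)

def bridge_out(v_in):
    """Instantaneous output of an idealized 4-diode bridge: at any moment
    two diodes conduct, so two forward drops are lost."""
    return max(abs(v_in) - 2 * VF, 0.0)

# AC input: both half-cycles come out positive.
print(f"{bridge_out(+170.0):.1f} {bridge_out(-170.0):.1f}")  # 168.6 168.6

# DC input of either polarity also comes out as positive DC,
# which is why the rectifier front end of an SMPS works on DC too.
print(f"{bridge_out(+120.0):.1f} {bridge_out(-120.0):.1f}")  # 118.6 118.6
```

Note that a DC bus fed from 120 V DC sits near 120 V, not the ~170 V peak that 120 V RMS AC would charge it to, so whether a given supply tolerates that depends on its input range.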



Yes, anything that depends on the specific properties of AC power, such as the frequency, won't work and will act as a resistive element on DC. Induction motors, synchronous motors, and shaded-pole motors simply won't run, and may burn up from overcurrent.

But it appears theoretically possible to build a dual-purpose AC/DC "resistive/inductive ballast": a wire-wound resistor that acts inductively on AC and resistively on DC. Though most inductive-ballast AC fluorescents likely don't bother.


A fume hood with forced-air exhaust is probably a good idea for initial testing.
 

Wendy

Joined Mar 24, 2008
23,429
You are making mistakes, and keep on making them, then dismiss them as trivial while expecting people to buy into your arguments, whatever those arguments are.

Not a sound way to convince people of your bona fides.

BTW, ripple is core to the SMPS design, unlike linear power supplies. With digital electronics it doesn't matter much, but with analog circuits ripple does matter. Most power devices, such as motors, don't care about ripple.

A dual-purpose ballast won't work well, and the reason is simple. As I have already said, reactive current limiting generates no heat, which is a direct boost to efficiency. The moment you add resistive elements there will be heat. Worse, the two elements (reactance and resistance) combine to decrease current even further. The combination is not linear; reactance with resistance has its own rules, but they do add. A ballast that behaves the same on AC and DC is not going to be very practical. You are taking a very simple component and replacing it with a lot of circuitry, to replace a working standard with what?
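The quadrature rule is easy to check numerically. A quick Python sketch, with the winding resistance and inductance values made up purely for illustration:

```python
import math

F = 60.0   # line frequency, Hz
V = 120.0  # supply voltage (RMS for the AC case)
R = 40.0   # illustrative winding resistance, ohms
L = 0.30   # illustrative inductance, henries

XL = 2 * math.pi * F * L   # inductive reactance, ohms
Z = math.hypot(R, XL)      # series R-L impedance: |Z| = sqrt(R^2 + XL^2)

i_ac = V / Z   # current the ballast passes on AC
i_dc = V / R   # on DC the reactance vanishes; only R limits current

p_ac = i_ac**2 * R  # heat dissipated in the winding on AC
p_dc = i_dc**2 * R  # heat dissipated on DC

print(f"XL = {XL:.1f} ohm, |Z| = {Z:.1f} ohm")
print(f"AC: {i_ac:.2f} A, {p_ac:.0f} W of heat")
print(f"DC: {i_dc:.2f} A, {p_dc:.0f} W of heat")
```

With these numbers the same winding passes roughly triple the current on DC and dissipates roughly nine times the heat, which is why a ballast sized for AC burns up on DC.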

If you want to learn the exceptions to that rule look up SMPS current regulators for LEDs. A common name for them is buck pucks.

Since we can convert DC to AC with ever-increasing efficiency, why not keep the old standards that work well and keep doing what we already do, which is convert DC sources into household AC?

Truthfully, I don't care one way or the other. You will note this attitude among several other folks here; it isn't worth arguing. I'm not even sure what this thread is about.

I find that when teaching, a little humility and willingness to learn from those who know more than I goes a long way. I am good with many things, but I don't claim to know everything, and I learn more that way. A lot of folks here fit that description, that is why this is one of the premier learning sites on the web. If you want to learn, teach.

My job here on AAC requires I keep an eye on things. Just a thought, why not actually define what you are trying to prove? The battle of the currents has been fought, the decision was made based on the merits of the case. That argument is over.

You have told us what you want to do. I'm more curious as to the why. People come here to learn something or help teach other people new things. If it is trolling it becomes my job, AAC is a flame free zone and I help keep it that way, but I get the feeling you have a point somewhere. I am very interested in hearing it.
 

Thread Starter

DMahalko

Joined Oct 5, 2008
189
The back-and-forth of this discussion reveals interesting details about DC vs AC as a "utility power source" that, to me, are worth exploring and understanding.



The general history is that DC lost the battle as a "utility voltage source" around 1890-1895, mainly because of a technicality of the era: DC/DC voltage conversion was not possible at the time without a huge spinning mechanical rotor, while AC/AC voltage conversion could be done with a transformer, with no moving parts and no maintenance.

Edison's plan was to generate power AT utility voltage, but power plants would then only be a few miles apart. Transformers allowed hundreds of miles between the plant and the consumer using high voltage at low amperage.

That is the specific reason Westinghouse and Tesla "won" the War of the Currents.



Now...... well it doesn't really matter any more. With a DC/DC voltage converter, which is basically a solid-state converter like a transformer, DC could again be used as a utility voltage source in homes.

In many cases a DC/DC converter is both smaller and more energy efficient than a transformer alone, and it doesn't have to draw power all the time the way a transformer does, which constantly runs current through the primary coil to maintain the magnetic field even when there is no load.

(I wonder how much power the average 25,000 VA AC power line transformer wastes when there is no customer load, that the power company has to just "eat" because it is an unmetered load. And which is repeated for EVERY SINGLE customer down the line, across thousands of customers.)
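A back-of-envelope sketch of that question in Python. The ~100 W core-loss figure for a 25 kVA unit is an assumption for illustration, not a measured value; real units vary widely with age and design:

```python
# Energy a pole transformer's no-load (core) loss burns in a year.
CORE_LOSS_W = 100.0    # assumed core loss for a 25 kVA unit, watts
HOURS_PER_YEAR = 8760

kwh_per_year = CORE_LOSS_W * HOURS_PER_YEAR / 1000.0
print(f"One transformer: {kwh_per_year:.0f} kWh/year")  # 876 kWh/year

# Scaled across many customers it adds up quickly.
customers = 10_000
print(f"{customers} transformers: {kwh_per_year * customers / 1e6:.2f} GWh/year")  # 8.76 GWh/year
```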

In 1895 an 800,000 volt DC transmission line would have been prohibitively expensive and high maintenance, if not downright impossible. Now with solid state technology, eh, it's not such a problem... and actually cheaper to operate than an AC transmission line: (page 3 of document)

http://www04.abb.com/global/seitp/seitp202.nsf/0/5392089edc1b3440c12572250047fd78/$file/800+kV+DC+technology.pdf



The circuit breaker discussion is interesting in that I have learned that residential-grade breakers have been made simpler and cheaper to cover AC only, vs that slight bit extra needed to handle DC arcs.

It is doubly interesting since AC breaker voltage is expressed in RMS terms, so an RMS-rated AC breaker potentially needs to interrupt an arc at higher peak voltage and current than a DC breaker of the same nominal rating. I need to explore this further.
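A quick Python sketch of the peak-versus-RMS numbers, along with the zero-crossing count that is generally cited as the reason AC arcs are easier to interrupt than DC arcs:

```python
import math

V_RMS = 120.0
v_peak = V_RMS * math.sqrt(2)  # instantaneous peak of a sine wave at 120 V RMS
print(f"{V_RMS:.0f} V RMS AC peaks at about {v_peak:.0f} V")  # ~170 V

# AC also passes through zero twice per cycle (120 times per second at
# 60 Hz), giving the arc a natural chance to extinguish; steady DC never
# crosses zero, so the breaker must force the arc out on its own.
zero_crossings_per_second = 2 * 60
print(zero_crossings_per_second)  # 120
```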

Fuses, meanwhile, are readily available with 10 kA breaking capacity for DC or AC, probably because it's hard to cheapen a component that is already just a strip of wire between two contacts, wrapped in ceramic.

From the history of circuit breakers posted in this thread, it looks like one of the reasons for developing circuit breakers for residential/commercial use is that the code writers wanted overcurrent protection to be solely in the hands of electricians to install and approve, rather than unreliable end users who would moronically stick a penny under a fuse that keeps blowing on an overloaded circuit.

If they could, the code writers would probably eliminate the ability for end users to use fuses completely, and force circuit breakers to be installed by professionals as the only option.

That is interesting stuff, and should probably be in a publicly accessible Internet encyclopedia somewhere. Though it would need more research and citations/references.



The discussion that the National Electric Code is not the be-all and end-all source for AC/DC electrical safety is also interesting. I acknowledge the Code is basically a "do it this way and don't ask why" book.

It does not concern the reader with the history and details of why the Code was written the way it is. It's simply a rulebook for non-intellectual laborers to follow that says "if voltage is more than X, then you do Y and Z, and nobody can sue you if the house burns down. You followed our rules without knowing why you followed them, and that's all that matters."

I am interested in the Why behind the electrical codes, as well as the history of those decisions, but apparently I'd need to shell out big bucks for NFPA membership to get access to those notes and details.

Still, analysis of why the NEC was written the way it is is possible. I wrote much of what has gone into this section:

National Electrical Code - Details of selected NEC requirements
http://en.wikipedia.org/wiki/National_Electrical_Code#Details_of_selected_NEC_requirements
 

takao21203

Joined Apr 28, 2012
3,702
It's simply a rulebook for non-intellectual laborers to follow that says "if voltage is more than X, then you do Y and Z, and nobody can sue you if the house burns down. You followed our rules without knowing why you followed them, and that's all that matters."
Your attitude is actually quite threatening. Even if you are only a bit curious. This is why some folks are afraid of curious people.
 

Thread Starter

DMahalko

Joined Oct 5, 2008
189
The argument of DC vs AC utility power is not really over. Probably just not reconsidered by Industry Professionals Whose Opinions Matter, in terms of modern solid-state technology.

The total wasted power of all AC power-line transformers combined globally is an interesting question. Even when no one is drawing any load, there is still likely significant power consumption keeping that waveform going in all the transmission lines, right down to customers' homes, including transmission-line current losses just to get power to those idle transformers.

Vs a DC transmission grid? No customer load means the line voltage rises to maximum and there is no power flow. A solid-state DC transmission grid, with small storage capacitors on the output side of the DC/DC converters to maintain the downstream line segments, would have an initial inrush to charge the capacitors, but then flow stops as the converters drop into power-save mode, until a load appears on the line.

People are up in arms about tiny unregulated wall-wart transformers consuming a few parasitic watts of power, and demand ones that turn off when no load is being drawn. Yet the biggest warts of all, the AC transmission-line transformers, are being overlooked? Odd.

It's going to be fun to see how long it takes for DC power to come full circle to, gee, we could cut lots of parasitic utility power costs if we switched to DC transmission for everything, right down to the end-user.

This might go well with that "smart grid" concept that keeps getting waved around.



As far as my own "mistaken assumptions" at the start of this thread: I acknowledge from the discussion here that residential AC breakers likely won't reliably interrupt DC at the same voltage as RMS AC, and that fuses with high AC/DC breaking capacity would be the way to go until reasonably priced 125 V DC circuit breaker alternatives can be found.

I misspoke about unregulated power supplies vs linear to user #12. For that, I apologize. However, my description still stands: the unregulated DC bridge output voltage is all over the place, depending on device load current and transformer output capability.

I see that a diode bridge in a switch-mode AC-only power supply may overheat when fed DC, since only two diodes remain active continuously, versus a 50% duty cycle for all four with AC input. However, the capability for DC input still exists, and the fix may be as minor as clipping some heatsink fins onto the existing diode bridge.
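A rough sketch of that conduction-loss shift in Python, assuming an illustrative 0.7 V forward drop and 2 A of load current (neither figure is from a real datasheet):

```python
# Illustrative numbers, not from any datasheet:
VF = 0.7      # assumed diode forward drop, volts
I_LOAD = 2.0  # assumed load current, amps

# AC input: each of the four diodes conducts roughly half the time.
p_per_diode_ac = VF * I_LOAD * 0.5   # ~0.7 W each, spread across four diodes

# DC input: one diagonal pair conducts continuously, the other pair never.
p_per_diode_dc = VF * I_LOAD * 1.0   # ~1.4 W each, concentrated in two diodes

# Total bridge dissipation is the same either way (two drops in series at
# any instant), but on DC it is concentrated in half the package, hence
# the possible need for extra heatsinking on the working pair.
print(f"AC: {p_per_diode_ac:.1f} W/diode, DC: {p_per_diode_dc:.1f} W/diode")
```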
 

WBahn

Joined Mar 31, 2012
30,076
I don't know if anyone has mentioned this, but one thing that is extremely widespread (several in virtually every home) that won't work on DC are the digital clocks. Why? Because virtually every clock that can be plugged in uses the 60Hz power line as its time base. That is what allows the cheapest clocks to keep extremely accurate time for years. The power utility tightly regulates the accumulated error that a clock will have and speeds up or slows down the generators several times a day in order to counter the natural dips and surges as loads come on and go off line.
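A quick Python sketch of how fast an uncorrected line-synchronized clock drifts. The 59.99 Hz daily average is an illustrative figure, not a real grid statistic:

```python
# A line-synchronized clock counts cycles and assumes each took 1/60 s.
NOMINAL_HZ = 60.0
actual_hz = 59.99          # assumed uncorrected daily average, for illustration
seconds_per_day = 86_400

cycles_per_day = actual_hz * seconds_per_day
clock_reading = cycles_per_day / NOMINAL_HZ   # what the clock thinks elapsed
drift = seconds_per_day - clock_reading
print(f"Clock loses {drift:.1f} s/day")  # 14.4 s/day
```

With frequency correction, the utility deliberately runs fast later to cancel that accumulated error, which is why cheap plug-in clocks stay accurate for years.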

For this and lots of other reasons, switching to DC would be a hugely disruptive technology and is not likely to happen anytime soon, no matter what its merits may or may not be. Now, over time that might change as more and more people and, particularly, communities make investments in off-grid DC capabilities. But I suspect that the best you are likely to see will be infrastructure changes that stop short of the last mile and will continue delivering AC to the end user, unless specifically requested otherwise.
 

Thread Starter

DMahalko

Joined Oct 5, 2008
189

WBahn

Joined Mar 31, 2012
30,076
The North American Electric Reliability Corporation is exploring dropping the AC utility frequency stability requirement.

Utility frequency - Long-term stability and clock synchronization
http://en.wikipedia.org/wiki/Utility_frequency#Long-term_stability_and_clock_synchronization

  1. Power-grid experiment could confuse electric clocks, retrieved 2011 July 6
"Is anyone using the grid to keep track of time?" McClelland said. "Let's see if anyone complains if we eliminate it."
Yes, I'm aware of that. Their approach is either very clever or very naive. For the overwhelming majority of end users, are they going to think to complain to the power company if their clocks stop keeping good time? So they will be able to say with a straight face that it must not matter, because hardly anyone complained!

But I can see how it will develop:

At some point, someone like Nightline will do an expose on how the clock industry, in order to line its pockets, is foisting cheap clocks on an unsuspecting public that use the power line variations to try to keep track of time and, as anyone with a brain should have known, those variations are going to, well, vary. That nearly everyone's clocks started going bad at the same time will be seen as proof of collusion among the Big Time companies. After a bunch of Congressional hearings, in which a bunch of actors will be called on to testify as expert witnesses because they had all played leading roles in movies about time travel, the Congress, in emergency session, will pass the Federal Intervention to Assure Stable Clock Output bill, which will be signed to much fanfare by the president with a bunch of children behind him that had been victims of Big Time when their TiVos didn't record the beginning of the Justin Bieber special.
 

Wendy

Joined Mar 24, 2008
23,429
One of the several reasons the War of the Currents turned out the way it did is that small power plants are not economical. That was not a consideration at the time, but they would be an ecological disaster today, and less useful now that we use power by the gigawatt. You cannot bring Hoover Dam to the city.

The whole point is that, compared to AC techniques, there are no equivalent DC techniques. They don't exist except at very small scales, and would have to be invented from scratch. If you were to try sending low-voltage DC, the losses would be unsustainable, unlike AC. So your whole house of cards falls down.

Transformers were not a minor invention. They allow the transmission of electricity for many hundreds of miles with losses that are considered reasonable. We may need to rethink some assumptions long term, but that is still true. If anything like a room-temperature superconductor is invented, which we are a long way from if it ever happens, then maybe there will be a rethink, but that is science fiction at the moment.

BTW, those spinning rotors for DC/DC conversion in the pre-electronics era? Want to bet they converted the electricity to AC, then used a transformer? Cut out the middlemen, and all the associated losses.

The fact is that for central power generation DC will not work, even today with all our technology. Transformers are really simple and can be completely recycled. They are cheap. They are green. You want to replace that with some very expensive electronics that are none of the above. For local power generation, such as windmills and solar cells, DC makes sense, but the standards have been set. That makes anything else a very hard sell indeed. And there are no cost benefits, which would be the only reason to do this.

I can imagine a proposed changeover: the huge number of appliances replaced or (less likely) retrofitted in every home going through it, and the fires and damage done in places that fell through the cracks. It would be easier to replace a government than to implement this. Doing something like this is not minor, and like I said, there is no benefit to it. You are blithely glossing over many facts in this thought experiment; just because you ignore them doesn't mean they won't bite you where it hurts.

A likely scenario for a slow conversion is one where local energy is harvested by solar cells (or some other technology). However, the odds are still pretty high you would convert your surplus to AC and sell it back to the power company. Transformers work both ways, and storing excess production is an intractable problem. What you would likely end up with is appliances slowly converted to dual use, something that simply does not exist today.
 

Thread Starter

DMahalko

Joined Oct 5, 2008
189
The whole point is compared to AC techniques, there is no equivalent DC techniques. They don't exist except in very small scales, and would have to be invented from scratch. If you were to try sending low voltage DC the losses would be unsustainable, unlike AC. So your whole house of cards fall down.
I am not saying run low "utility voltage 120/240v" DC everywhere. I am saying use high voltage DC the same as high voltage AC is used now, and the pole transformer becomes a step up or step down DC/DC converter.

Whether AC or DC, a high transmission voltage permits low amperage so the transmission wires can be tiny and relatively inexpensive, but carry high wattage.

120 V RMS AC or DC @ 100,000 A = 12 megawatts; 300,000 V RMS AC or DC @ 40 A = 12 megawatts
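The I²R effect behind that equivalence can be sketched in Python. The 5-ohm line resistance is an arbitrary illustration; only the ratio between the two cases matters:

```python
# Same 12 MW delivered through the same line; only the voltage differs.
P = 12e6      # watts delivered
R_LINE = 5.0  # ohms, an assumed line resistance for illustration

for volts in (120.0, 300_000.0):
    amps = P / volts
    loss = amps**2 * R_LINE
    # At 120 V the nominal I^2*R "loss" dwarfs the 12 MW delivered,
    # which is exactly why low-voltage long-haul transmission is a non-starter.
    print(f"{volts:>9,.0f} V -> {amps:>9,.0f} A, line loss {loss:,.0f} W")
```

The loss ratio between the two cases is (100,000 / 40)², or 6.25 million to one, for AC and DC alike.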
 

Wendy

Joined Mar 24, 2008
23,429
I am not saying run low "utility voltage 120/240v" DC everywhere. I am saying use high voltage DC the same as high voltage AC is used now, and the pole transformer becomes a step up or step down DC/DC converter.
Pick one; you will not have both. Power flows through such a setup only one way, so you can't sell juice back to the power company, and it is very expensive compared to a transformer. It will also be much less reliable.

Whether AC or DC, a high transmission voltage permits low amperage so the transmission wires can be tiny and relatively inexpensive, but carry high wattage.

120v RMS AC or DC @ 100,000 amps = 300,000 volts RMS AC or DC @ 40 amps = 12 megawatts
Transformers, unlike power electronics, scale up and down nicely. Most electronics have fairly rigid voltage and current ratings; neither scales up well, and electronics are much more delicate. A transformer is likely to survive electrical abuse such as a lightning strike.

In short, this is not an arbitrary decision made by fat cats, there are strong technical reasons these standards exist.
 

strantor

Joined Oct 3, 2010
6,798
In many cases the DC/DC converter is both smaller and more energy efficient than just a transformer by itself, and it doesn't have to draw power all the time like a transformer, which constantly keeps running power through the primary side of the coil to maintain the magnetic field even if there's no load.
What kind of DC/DC converter? Have you seen inside a big one? The DC/DC converter I'm imagining, one large enough for the job, has a transformer inside it. So, basically the same AC transformer, plus a few additional power wasters and unnecessary conversions.

(I wonder how much power the average 25,000 VA AC power line transformer wastes when there is no customer load, that the power company has to just "eat" because it is an unmetered load. And which is repeated for EVERY SINGLE customer down the line, across thousands of customers.)
LOL, that's cute, to think the power company would "eat" anything. They pass the buck onto the customers.
http://www.copper.org/applications/electrical/energy/trans_ad.html

Many transformer units have actual conductor watt losses that are three to four times core watt losses. Such conductor losses can range from a low of near 130 watts to 350 watts or more for a 25 kVA unit.
So, worst-case scenario:
true power = apparent power
conductor loss = 350 W
conductor loss = 3 × core loss, so core loss = 350 W / 3 ≈ 120 W
total loss = 350 W + 120 W = 470 W
efficiency = 100 − (470 W / 25,000 W) × 100 = 98.12%
IIRC that's around what Bill_Marsden already stated.
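That arithmetic can be reproduced in a few lines of Python (the post above rounds the ~117 W core loss up to 120 W, which is why it lands at 98.12% rather than 98.13%):

```python
# Worst-case figures from the copper.org quote: 350 W conductor loss,
# core loss taken as one third of that, 25 kVA at unity power factor.
conductor_loss = 350.0          # watts
core_loss = conductor_loss / 3  # ~117 W
total_loss = conductor_loss + core_loss
rating_w = 25_000.0             # 25 kVA with true power = apparent power

efficiency = 100.0 * (1 - total_loss / rating_w)
print(f"total loss ~{total_loss:.0f} W, efficiency {efficiency:.2f}%")  # ~467 W, 98.13%
```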

The total wasted power of all AC powerline transformers combined globally is an interesting question. When no one is drawing any load at all there is still likely significant power consumption keeping that waveform going in all transmission lines, right down to the customers homes. This includes transmission line current losses to actually get power to those idle transformers.
The question I find more interesting is: when are any of these line transformers ever idle? When are people ever not using power? Never. Your argument only holds water when the users are not using power, which is never, so your argument holds water, well, never. When you compare the amount wasted by 97-98% efficient transformers to what's used by end users (≠ 0), the amount wasted is minimal, insignificant.

This also marks the second time you've brought up the wasted power of "keeping the waveform going." What on earth are you talking about? The waveform is just polarity reversal; it doesn't take any power to reverse polarity.

Vs a DC transmission grid? No customer load means the voltage rises to maximum on the line and there is no power flow. A solid-state DC transmission grid, with small storage capacitors on the output side of DC/DC converters to maintain the downstream line segments, would have an initial flow to charge up capacitors but then the flow stops when the DC/DC transmission converters go into power-save mode, until a load appears on the line.
Again, when will the customers ever draw no power? They always will, and with DC they would always be drawing it through DC/DC converters that are less than 97-98% efficient.


What's the most efficient DC/DC converter you know of that is big enough for the job? Even if it doesn't have a 'transformer' inside it, it has to have some kind of coil; there is no way around that.
 

Thread Starter

DMahalko

Joined Oct 5, 2008
189
Now you're just arguing for arguing's sake. No, I have not personally seen a transmission line DC/DC regulator. Happy now?

I guess this 800,000v DC transmission line must be a myth. Efficient solid state electronics large enough to make that work probably don't actually exist.
http://www.siemens.com/press/en/pre...spicture/pictures-photonews/2010/pn201001.php


"Keeping the waveform going", means powering the input side of an AC transformer, whether or not there's a load on the output side. Power is needed to keep the AC magnetic field constantly cycling through the input coil, and it doesn't stop regardless whether there's a load or not. The losses from this are cumulative going back to the power generation, which has to keep wobbling the magnetic field of thousands of distant transformers regardless of whether there's a load on the output side of those transformers.

Small-scale DC-to-AC inverters have similar losses, since the inverter oscillator and power transistors must run constantly to maintain an AC output waveform whether or not there is an actual load.


Yes, people freely waste power with the grid as it is now, because there's really no incentive to conserve with the flat-rate "dumb grid." If the cost per kWh actually swung up and down with utility generation load, conserving might actually make some cents.


Efficient bidirectional DC/DC conversion for grid-tie generation may be possible. Has anyone tried developing anything like this yet? Probably not.


Since this DC-transmission aspect of the discussion is an area where nobody here is an expert (and few people anywhere likely are), I'm inclined to just let this part go idle.

Moving along..
 

strantor

Joined Oct 3, 2010
6,798
I doubt anybody here has seen a DC/DC converter of that size. My point was that a DC/DC converter of that size would likely be nothing more than your standard line transformer with switching semiconductors on either side of it.

The biggest DC/DC converter I can think of is the type used in EV conversions, which takes HV DC from the car's battery bank and steps it down to 12 V for accessories: headlights, radio, fans, etc. EV builders often use separate 12 V batteries to provide this power in order to avoid using these surprisingly inefficient converters.

I think you're going on the mistaken assumption that DC can be stepped down more efficiently than AC in power applications. You said (I'm going to dig through and quote it) that DC/DC converters are more efficient than transformers; I can't think of one that's >97% efficient, and if there is one, I can almost guarantee it is a very low-power device, not suited for power distribution.
 

strantor

Joined Oct 3, 2010
6,798
Now you're just arguing for arguing's sake.
For the record, I'm arguing for the sake of your future wiki article being correct. But you seem to prefer it biased over correct. I thought writers of such articles were supposed to be objective, scientific, and unbiased. So far, all I've seen is you engaging in verbal combat with people who are trying to steer you in the right direction.
 

takao21203

Joined Apr 28, 2012
3,702
Actually, Wikipedia is not a crystal ball and not a research facility. They have a No Original Research policy.

He (the OP) is interested in researching the history of DC/AC, but also argues that DC systems should somehow be favoured from now on.

That disqualifies the material as a Wikipedia article.

I don't know how strictly they enforce that over there; I am not using Wikipedia.

And even if he wants to do research, and even if he favours DC systems, he should write a neutral article about the history, or not write it at all.

Generating 120 V DC from 120 V AC is trivial*.
And many appliances actually will not work as intended. Microwave ovens won't work at all.

*I mean, what's the point of knowing the whole N.E.C. and throwing around terms like 800,000-volt DC/DC converters and kiloamps, while not knowing what a Variac is (for instance), which would be appreciated for getting 120 V DC (and not 170 V).
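The 170 V figure follows from the √2 peak-to-RMS ratio of a sine wave; a quick Python check (diode drops and ripple ignored):

```python
import math

# Rectifying 120 V RMS into a filter capacitor charges it toward the
# AC peak, not the RMS value.
v_dc_from_mains = 120.0 * math.sqrt(2)
print(f"straight rectification: ~{v_dc_from_mains:.0f} V DC")  # ~170 V

# Dialing a Variac down to ~85 V RMS puts the rectified peak near 120 V DC.
v_rms_needed = 120.0 / math.sqrt(2)
print(f"Variac setting for 120 V DC: ~{v_rms_needed:.1f} V RMS")  # ~84.9 V
```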

And then he introduces more subjects, like "non-intellectual labourers." They may exist, and not only in World of Warcraft. Usually in the morning the boss will come up with some cable, and it has to be used no matter what, so they don't even look up the N.E.C.

In my experience these people are called "peons" (http://en.wikipedia.org/wiki/Peon), not "non-intellectual labourers," which would be politically biased. As for WoW, I have not yet seen much reference to the class struggle there, and it's very unlikely they will add it.
 

Wendy

Joined Mar 24, 2008
23,429
To add to this: you have come to a nest of geeks, and some of us are experts. You seem not to like the answers you are given and are trying to work around them; this is bias.

Even if superconductors were perfected tomorrow, your world could not exist; too much infrastructure would have to be replaced. What would happen is that the superconductors would be incorporated; they work fine for AC too.

You are not talking billions of dollars for the changeover, but trillions. I have shown a scenario where it could happen; it would have to be slow, as the appliance upgrades were implemented. I still rate this as a low-probability scenario; it is far more likely DC power generation would simply have a power inverter attached (a technology fully developed and implemented, no invention or start-ups required).
 

WBahn

Joined Mar 31, 2012
30,076
Even if superconductors were perfected tomorrow your world could not exist, too much infrastructure would have to be replaced. What would happen is the superconductors would be incorporated, they work fine for AC too.
Actually, superconductors do not like AC very much, and AC losses can be pronounced. So there would be a trade-off between lossless DC transmission, with losses at the front and back ends for the AC/DC and DC/AC conversions, and transmission losses in an AC system without the conversion losses. My guess is that long haul will favor DC, but sufficiently short haul would opt for AC.
 