Derivation of fundamental electrical units

Thread Starter

paleotechman

Joined Feb 2, 2014
3
I've been pounding the web trying to untangle the historical derivations of fundamental electrical units. They all seem to be interconnected, so if you want to start at the beginning to establish the meaning of a volt, a coulomb, a joule, etc., where do you begin? We have standards, yes, but where did they come from? How were they established? Does anyone have a line on this historical development?

Please don't post about the standards unless you can take it back to the beginning. For example, don't tell me that the volt is established as x volts using a Weston Cell. That's easy. How was that established?

If the question is not clear, please ask clarifying questions.

Thanks.
 

Thread Starter

paleotechman

Joined Feb 2, 2014
3
Eric,
That these units are named for scientists who worked with these concepts is common knowledge. If you had their tools, how would you go about establishing the standards for the units? For example, the volt is _defined_ as the emf that will cause a current of 1 ampere to flow through a resistance of 1 ohm. With no ohm or ampere to begin with, how do you standardize the volt? Why is a coulomb about 6.241 × 10^18 electrons?
You have to start somewhere. Do you understand that you haven't answered my question? Some definition is fundamental to beginning this process and allows everything else to be defined. Where does it start and how does it proceed?
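A quick numeric check of that last figure: the electron count is just the reciprocal of the elementary charge. A Python sketch, using the modern CODATA/SI value of e (which of course the pioneers did not have):

```python
# The coulomb-to-electron count is the reciprocal of the elementary
# charge (exact by definition since the 2019 SI revision).
e = 1.602176634e-19  # charge of one electron, in coulombs

electrons_per_coulomb = 1 / e
print(f"{electrons_per_coulomb:.4g}")  # ≈ 6.241e+18
```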
 

Papabravo

Joined Feb 24, 2006
21,225
You might start with the wiki on Coulomb's law, which didn't take long to find.

http://en.wikipedia.org/wiki/Coulomb's_law

All physical quantities can be reduced to arbitrary units of mass, length, time, and charge. Constants of proportionality appear or disappear (by being equal to 1) based on the selection of fundamental units. Volts, amperes, and ohms are not fundamental units.

Dimensional analysis is a fundamental technique of problem solving which can tell you quickly if you've got the formula correct.
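Papabravo's point can be sketched in a few lines of Python: track each quantity as exponents of (mass, length, time, charge) and check that a formula balances. The representation here is purely illustrative, not any standard library:

```python
# Minimal dimensional-analysis sketch: each quantity is a tuple of
# exponents of (mass, length, time, charge).
def dims(M=0, L=0, T=0, Q=0):
    return (M, L, T, Q)

def mul(a, b):  # multiplying quantities adds exponents
    return tuple(x + y for x, y in zip(a, b))

def div(a, b):  # dividing quantities subtracts exponents
    return tuple(x - y for x, y in zip(a, b))

CHARGE  = dims(Q=1)
CURRENT = div(CHARGE, dims(T=1))   # ampere = coulomb / second
ENERGY  = dims(M=1, L=2, T=-2)     # joule  = kg·m²/s²
VOLT    = div(ENERGY, CHARGE)      # volt   = joule / coulomb
OHM     = div(VOLT, CURRENT)       # ohm    = volt / ampere

# Ohm's law V = I·R must balance dimensionally:
assert VOLT == mul(CURRENT, OHM)
print(OHM)  # (1, 2, -1, -2): kg·m²·s⁻¹·C⁻²
```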
 

studiot

Joined Nov 9, 2007
4,998
All physical quantities can be reduced to arbitrary units of mass, length, time, and charge.
The current international standard (SI, which grew out of MKS) uses the ampere as the fundamental unit and the coulomb as the derived one.

You should also add temperature and a light unit to this list.
 

MrChips

Joined Oct 2, 2009
30,808
Google SI units. Top hits are:

http://en.wikipedia.org/wiki/International_System_of_Units

http://physics.nist.gov/cuu/Units/units.html


Ampere

Original (1881): A tenth of the electromagnetic CGS unit of current. The [CGS] electromagnetic unit of current is that current which, flowing in an arc 1 cm long of a circle 1 cm in radius, creates a field of one oersted at the centre.

Current (1946): The constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed 1 m apart in vacuum, would produce between these conductors a force equal to 2×10−7 newtons per metre of length.
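As a sanity check on the 1946 definition, the force per metre between two parallel wires follows from F/L = μ0·I1·I2/(2π·d). A Python sketch using the pre-2019 exact value of μ0:

```python
import math

# Force per metre between two long parallel wires carrying currents
# i1, i2 (amperes) a distance d (metres) apart:
#     F/L = mu0 * i1 * i2 / (2 * pi * d)
mu0 = 4 * math.pi * 1e-7  # N/A², exact before the 2019 SI revision

def force_per_metre(i1, i2, d):
    return mu0 * i1 * i2 / (2 * math.pi * d)

# With I = 1 A in each wire at d = 1 m, this reproduces the
# 2×10⁻⁷ N/m figure in the 1946 definition.
print(f"{force_per_metre(1.0, 1.0, 1.0):.1e}")  # 2.0e-07
```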
 

Thread Starter

paleotechman

Joined Feb 2, 2014
3
Papabravo, dimensional analysis is a valuable concept, but I don't see how Coulomb's law gets me from fundamental units to the desired derived units. I suspect that the derivation of the ampere is the secret, as Coulomb's experiments require a measured charge -- something we haven't derived yet in this discussion.

studiot, the SI has also added the mole as the unit of amount of substance, and uses the ampere as fundamental and charge as derived. Understanding the "why" of this decision gets to my original question. I'm sure the answer is out there.

MrChips, the first definition of the ampere is not particularly helpful unless the means to measure the magnetic field in oersteds is first established based on fundamental units. The second definition of current (1946) suggests that some sort of magnetic balance approximating the ideal conditions could provide a practical derivation. The NIST link goes deeper, to http://physics.nist.gov/cuu/Units/current.html, which gives definitions of all seven base units at the present time.

Interestingly, the Wiki article you linked got me to the article on the history of the metric system, which says: "During this period, the metre was redefined in terms of the wavelength of the waves from a particular light source, and the second was defined in terms of the frequency of radiation from another light source. By the end of the 20th century, work was well under way to redefine the ampere, kilogram, mole and kelvin in terms of the basic constants of physics. It is expected that this work will be completed by 2014."

It will be interesting to see how these other four basic units are redefined.
 

nsaspook

Joined Aug 27, 2009
13,275
My personal feelings lean more in this direction (not that it matters to the standards people): amperes are an 'operational definition', not a 'fundamental' characteristic of electrical energy.
http://en.wikipedia.org/wiki/Operational_definition#Electric_current

It is the charge which is "real," while the current is a rate; a flow; an abstract concept.
http://amasci.com/miscon/fund.html

In most semiconductor processes, we treat electron volts and charge as the fundamental characteristics of electrical energy that must be controlled with high precision when modifying material lattice properties. The actual current flow, while important, is usually a matter of simple ohmic heating control or the optimization of time-related effects.
 

studiot

Joined Nov 9, 2007
4,998
The point is that the early units of electricity were introduced before the subject was sufficiently well understood.

See the experiments of Cavendish for instance.

http://en.wikipedia.org/wiki/Henry_Cavendish


During the second half of the 19th century, scientists collectively cleaned things up, and we eventually moved into the current MKS system in the second half of the twentieth century.

Several systems were used and superseded in the intervening century.
 

ericgibbs

Joined Jan 29, 2010
18,849
It could be argued that all our current 'electrical laws' are only 'provisional' laws and are based on early empirical test results.

These provisional laws will be updated as we expand our knowledge of physics and the ability to design more accurate measuring equipment.

For the time being the present 'laws' are the ones we have to consider as the standard.

E
 

studiot

Joined Nov 9, 2007
4,998
It could be argued that all our current 'electrical laws' are only 'provisional' laws and are based on early empirical test results.

These provisional laws will be updated .........
The OP asked about past history, not the future.
 

Kermit2

Joined Feb 5, 2010
4,162
Current was derived from magnetism. Unit strength of a magnetic pole is that strength which imparts an acceleration of 1 centimetre per second per second to a mass of 1 gram -- that is, a force of one dyne.

At one centimetre radius from a pole of unit strength 4pi lines of magnetic force emerge. 4pi times the strength of a magnet gives the value of magnetic flux.

The intensity of the magnetic force at a distance from the magnetic pole equals the flux divided by 4pi times the distance squared.

Electric current produces magnetic flux. Unit electric current in a conductor produces a field intensity of 4pi. The unit of electric current is defined as the current which produces a field intensity of 4pi per square centimetre. The practical unit, the AMP, is made one tenth of this value. Unit resistance and unit voltage were defined from this unit current and unit magnetic pole strength. The unit value of e.m.f. is called the abvolt. The VOLT is defined as the abvolt times 100,000,000 (10^8). Unit resistance is that value which, when unit e.m.f. is present, causes unit current to flow. 1,000,000,000 (10^9) times unit resistance is the practical unit of resistance, called the OHM.
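For reference, the CGS-EMU "ab-" units in this chain relate to the practical units by fixed powers of ten. A quick Python sketch (the conversion factors are the standard ones; the check is illustrative):

```python
# Standard conversions between CGS-EMU ("ab-") units and practical units:
ABAMPERE = 10.0   # 1 abampere = 10 A
ABVOLT   = 1e-8   # 1 abvolt   = 1e-8 V
ABOHM    = 1e-9   # 1 abohm    = 1e-9 ohm

# Ohm's law holds in either system, so the conversions must be
# consistent: (1 abvolt) / (1 abampere) should equal 1 abohm.
assert abs(ABVOLT / ABAMPERE - ABOHM) < 1e-24

print("1 A =", 1 / ABAMPERE, "abampere")  # 0.1
print("1 V =", 1 / ABVOLT, "abvolt")      # 100000000.0
```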
 

ericgibbs

Joined Jan 29, 2010
18,849
hi studiot,
The point I was trying to make to the OP is that 'historically' most of the laws we use were derived as the result of empirical experiments and as such are provisional.

So his search for why such electrical definitions came into use cannot be answered in absolute terms.

If you look at the answers posted by members so far, we are all giving 'examples' and not specific answers.

This is not a criticism of their posts in any way; I don't believe it's possible to give a definitive answer to the OP.

Eric
 

studiot

Joined Nov 9, 2007
4,998
The first book on electricity and (magnetism) was published in 1600

De Magnete

W Gilbert 1600

He introduced Field Theory

The next development was Franklin's 'transferable elastic medium' theory in 1749.

This was followed by the experiments of Ampère, Faraday and Cavendish, who between them established the inverse square law experimentally.

The electron was not identified until 1897 (Thomson).
 

Papabravo

Joined Feb 24, 2006
21,225
It seems relatively straightforward to start with fundamental units of length, mass, time and charge and look at the components of basic physical relationships derived either theoretically or empirically and understand how measured quantities can be understood in terms of those fundamental units. At least it is self-evident to me, but your mileage may vary.

As counterintuitive as it might seem, the ampere and the volt can be understood in terms of flow and of potential energy doing work on charges -- thus length, mass, time, and charge.
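A toy numeric sketch of that reduction (the numbers are made-up example values): a volt is a joule per coulomb and an ampere is a coulomb per second, so volts times amps gives joules per second, i.e. watts.

```python
# "Potential energy doing work on charges", with illustrative numbers:
charge = 2.0      # coulombs moved
potential = 3.0   # volts (joules per coulomb) across which it moves
time = 4.0        # seconds taken

energy = charge * potential   # J  (C · J/C)
current = charge / time       # A  (C / s)
power = energy / time         # W  (J / s)

# P = V·I falls out of the definitions:
assert abs(power - potential * current) < 1e-12
print(energy, current, power)  # 6.0 0.5 1.5
```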
 

LvW

Joined Jun 13, 2013
1,759
... historical derivations of fundamental electrical units.
... the meaning of a volt, a coulomb, a joule, etc.

If the question is not clear, please ask clarifying questions.
Yes - to me, it is not quite clear whether your historical interests are mainly in electrical quantities (standards, base units) or in working electric circuits (inventors, corresponding equations).
 

WBahn

Joined Mar 31, 2012
30,060
I think the OP is asking a perfectly reasonable and interesting question. It has a definite answer, but that answer may be lost to the passage of time.

The ampere became the fundamental unit instead of charge for practical reasons -- we can measure current to a high degree of accuracy and precision, which is something that we cannot do very readily with charge (though we are much, much better at it today than a couple hundred years ago).

What the OP is basically asking (and he can correct me if I'm wrong) is to imagine taking a trip back in time to when work with voltages and currents and charges was in its infancy. If someone did an experiment in which they applied an electric potential to some object, how did they describe the amount of potential they applied so that someone else could reproduce the experiment? They couldn't say that they applied 4.2V because the volt hadn't been invented yet.

Also, the OP isn't asking about what they could have done or should have done, but rather what they actually did.

Going from an increasingly unreliable memory of what I read an increasing number of decades ago, one of the earliest attempts to come up with a measure of voltage was to apply the voltage to a muscle from a frog's leg that was connected between a fixed point and a spring and measure the tension that resulted as indicated by how far a spring was stretched. It is my understanding that this is where the notion of "tension" (as in "high tension lines") came from.

Of course, this was not a very accurate or precise measure, but it was something.

The history of measurement is generally not about figuring out a way to measure what we want to know, but rather figuring out a way to convert what we want to know into something we already know how to measure.
 

WBahn

Joined Mar 31, 2012
30,060
By the end of the 20th century, work was well under way to redefine the ampere, kilogram, mole and kelvin in terms of the basic constants of physics. It is expected that this work will be completed by 2014."
It will be interesting to see how these other four basic units are redefined.
It's my understanding that, for quite some time, the only fundamental quantity still defined by a physical artifact has been the kilogram.

The challenge isn't so much coming up with a definition. The challenge is coming up with a definition that is useable -- meaning that it has to be something that can be reproduced with a sufficiently high degree of both accuracy and precision.

I seem to recall that the ampere was (or perhaps is going to be) replaced by the volt because we can produce a voltage with an extremely high degree of accuracy and precision using Josephson junctions exposed to microwave radiation at a known frequency, which would give it a resolution comparable to the definitions we already have for length and time.
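A rough sketch of how a Josephson standard pins down the volt: the junction's quantized voltage steps are V = n·f/K_J, where K_J = 2e/h is the Josephson constant (about 483597.8 GHz/V). The drive frequency below is just a typical order of magnitude, not any particular standard's value:

```python
# Josephson constant from the (exact, post-2019) SI defining constants:
e = 1.602176634e-19  # elementary charge, C
h = 6.62607015e-34   # Planck constant, J·s

K_J = 2 * e / h      # Josephson constant, Hz per volt

f = 70e9             # microwave drive frequency, Hz (illustrative)
n = 1                # step number
V = n * f / K_J      # voltage of the n-th Shapiro step

print(f"K_J = {K_J:.6e} Hz/V")   # 4.835978e+14
print(f"step voltage = {V*1e6:.3g} µV")
```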
 
Just giving a rough overview of the origin of terms as I was taught them by an EM physics professor I was very fond of...

When "charge" was first discovered, most scientists' goals were simply to one-up each other with how much of the stuff they could cram into a space (electrostatics). Essentially, they were all just playing with DIY capacitors. At the time, they thought "charge" was some sort of magical electrical fluid, because you had to rub stuff together to get it to move around. Ben Franklin decided that there were two kinds, calling them positive and negative. I think it was a British or Irish guy who determined (by using statically charged oil drops) that it was quantized, i.e. found that all "charge" was a multiple of a fixed value (the electron's, though they didn't quite know that).

Then people started playing with current. Faraday used his own measure of charge, now called a "faraday" I think, that was based on Avogadro's number (a mole). With current also came playing with magnetic fields, then shortly after, flux from a rate of change of current.

When they decided to actually assign "fundamental" units, current became the base because it is pretty much the easiest to measure. You can't exactly count electrons, so you can't watch how fast they move, or how hard they are pushing each other. What you can do, though, is directly measure the effect of current with a high degree of accuracy using even the most basic of tools (did it in a 30-minute lab with some small weights). You put two wires with current running antiparallel and see how much force they throw around. After that, pretty much everything was derived from there. A coulomb is an amp running for 1 second, volts can be found by using amps to pump coulombs into a capacitor, power by seeing how hot volt·amps make a wire get, and so on.
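That derivation chain can be sketched with made-up numbers -- start from a measured current and everything else follows:

```python
# Deriving the other practical units from a measured current
# (illustrative values only):
current = 2.0      # A, measured from the force between parallel wires
seconds = 3.0      # duration the current flows

charge = current * seconds        # C: "an amp running for 1 sec"
capacitance = 0.5                 # F, of the capacitor being charged
voltage = charge / capacitance    # V: pumping coulombs into a capacitor
power = voltage * current         # W: volts times amps (the wire heating)

print(charge, voltage, power)  # 6.0 12.0 24.0
```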
 

Metalmann

Joined Dec 8, 2012
703
"...was to apply the voltage to a muscle from a frog's leg that was connected between a fixed point and a spring and measure the tension that resulted as indicated by how far a spring was stretched. It is my understanding that this is where the notion of "tension" (as in "high tension lines") came from."



That's the way we were taught when I was a kid.
Doing that experiment in class brought some screams/gasps from the girls. ;)
Wish I would have learned more about electricity back then, instead of starting out again at 76.:D
 