[Solved] The progress of EE, how did we get high precision from lower precision and faster from slower?

Thread Starter

ballsystemlord

Joined Nov 19, 2018
160
Hello,
Now I'm sure this info is out there someplace (please no wikipedia links, they're not a good source as wikipedia themselves admit), but I've not yet found it.
When resistors were first made, we got three classifications of tolerance for those carbon resistors: 20%, 10%, and 5%. Likewise with oscilloscopes, the originals were limited to a few kHz AFAIK.
So how did they manage to improve things? I mean, without a precise reference point, how do EEs know if a part is more precise or less precise?
If you only have oscilloscopes capable of handling a few kHz, how do you judge that your new oscilloscope design can accurately measure a higher BW than that?

Thanks!
 

KeithWalker

Joined Jul 10, 2017
3,093
There are things called electrical standards which are based on very basic physical measurements. From these can be derived the standard volt, amp, and other electrical units. The heating effect of the power can be measured and used to accurately determine the power of an AC source at any frequency. Working from the basic standards lets us establish the absolute accuracy of electrical measurements.
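Here's a rough Python sketch of that heating-effect (AC-DC transfer) idea. The resistor value, DC reference, and test waveform below are made up purely for illustration:

```python
import math

# Idea behind AC-DC (thermal) transfer: the power dissipated in a resistor
# depends only on the RMS voltage, so an AC source that produces the same
# heating as a known DC reference has an RMS value equal to that DC voltage.
# All numbers here are invented for illustration.

R = 100.0      # ohms, the heater resistance
V_dc = 5.000   # volts, a DC reference we trust

# Average power delivered by the DC reference
P_dc = V_dc**2 / R

# Sample one cycle of an "unknown" sine wave and compute its average power
V_peak = 5.000 * math.sqrt(2)
samples = [V_peak * math.sin(2 * math.pi * k / 1000) for k in range(1000)]
P_ac = sum(v**2 / R for v in samples) / len(samples)

# If the two heating effects match, the AC RMS value equals the DC reference.
V_rms = math.sqrt(P_ac * R)
print(f"P_dc = {P_dc:.4f} W, P_ac = {P_ac:.4f} W, inferred RMS = {V_rms:.4f} V")
```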
 

Papabravo

Joined Feb 24, 2006
21,225
Temperature is another example where, by definition, water freezes at 0 °C and boils at 100 °C. Temperature measuring devices can be matched to these two known conditions.
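As a rough Python sketch of that two-point idea (the raw sensor readings are invented for illustration):

```python
# Two-point calibration: given an uncalibrated sensor's raw readings at two
# known reference conditions (melting ice and boiling water at standard
# pressure), fit a straight line that maps raw readings to the known scale.
# The raw readings below are invented for illustration.

raw_at_0C = 412.0     # sensor output with the probe in an ice bath
raw_at_100C = 1730.0  # sensor output in boiling water

# Linear map: temperature = gain * raw + offset
gain = (100.0 - 0.0) / (raw_at_100C - raw_at_0C)
offset = 0.0 - gain * raw_at_0C

def to_celsius(raw):
    """Convert a raw sensor reading to degrees Celsius using the two-point fit."""
    return gain * raw + offset

print(to_celsius(412.0))    # ~0.0
print(to_celsius(1071.0))   # ~50.0 (halfway between the two raw readings)
```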

WRT oscilloscopes, we can fabricate very accurate oscillators and measure the scope's performance against those devices.
 

WBahn

Joined Mar 31, 2012
30,058
Temperature is another example where, by definition, water freezes at 0 °C and boils at 100 °C. Temperature measuring devices can be matched to these two known conditions.
Actually, that is not how temperature is defined (anymore) and that, in and of itself, is a perfect example of what the TS is getting at.

Given the ability to manufacture and/or measure something to a certain level of accuracy/precision, there are techniques whereby we can use those to manufacture and/or measure something to a somewhat better level of accuracy.

For instance, if I have the ability to measure dimensions to the nearest millimeter, I can use those to build a machine (such as a mill or a lathe) that I can use to produce measurement tools that can measure better than that. How much better, I don't know, but it doesn't have to be much better. Let's say the uncertainty is reduced by just 10% each time; then after repeating this 22 times, I have tools that can measure to 0.1 mm. I'm pretty sure the improvement per generation is significantly better than that.
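As a quick sanity check of that argument, assuming a flat 10% improvement per generation:

```python
# If each new generation of tooling only shaves 10% off the measurement
# uncertainty, repeated generations still get from 1 mm down to 0.1 mm
# in a couple dozen steps.

uncertainty_mm = 1.0
generations = 0
while uncertainty_mm > 0.1:
    uncertainty_mm *= 0.9   # each generation assumed 10% better than the last
    generations += 1

print(generations, round(uncertainty_mm, 4))   # 22 generations, ~0.0985 mm
```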

One key method that is used is the ensemble. For an easy-to-visualize example, imagine taking 1 kΩ resistors that have a 20% manufacturing tolerance and obtaining lots of them from different places. The goal is to randomize the errors. Now take one hundred of them and put them into a 10x10 grid. The result is an ensemble whose combined value is, to a good approximation, the mean value of that set of resistors (which is nominally still 1 kΩ). If the errors have been sufficiently randomized by getting them from different lots from different manufacturers, the mean of those hundred resistors would be expected to be within about 2% of the nominal value (the 20% tolerance divided by the square root of 100). If you create an ensemble that is 20x20, you would expect it to be within about 1%.
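Here is a small Monte Carlo sketch of that idea in Python. The uniform error model (and the particular seed) is an assumption for illustration, but it shows the error of the mean shrinking roughly as one over the square root of N:

```python
import random
import statistics

# Average many independent 20%-tolerance resistors; the error of the mean
# shrinks roughly as 1/sqrt(N). The uniform error distribution is assumed.

random.seed(1)
NOMINAL = 1000.0   # 1 kOhm
TOL = 0.20         # 20% manufacturing tolerance

def ensemble_mean(n):
    """Mean value of an ensemble of n randomly-erroring resistors."""
    parts = [NOMINAL * (1 + random.uniform(-TOL, TOL)) for _ in range(n)]
    return statistics.mean(parts)

for n in (1, 100, 400):
    errors = [abs(ensemble_mean(n) - NOMINAL) / NOMINAL for _ in range(2000)]
    print(f"N = {n:3d}: typical error of the mean ~ {statistics.mean(errors):.2%}")
```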

Even if you have systematic errors in the distribution, those would affect two ensembles the same way. So if you created two ensembles of 400 resistors each and used those to form a voltage divider, you now have a divider that is formed by two resistors that are matched to each other within +/-1%.
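A similar sketch for the systematic-error point. The +5% bias applied to every part is an arbitrary assumption, but notice how little it moves the divider ratio formed from the two ensembles:

```python
import random

# Give every resistor the same +5% systematic bias on top of its random error.
# The bias shifts both ensembles identically, so the voltage-divider ratio
# formed from the two ensembles barely moves. Numbers are invented.

random.seed(2)
NOMINAL, TOL, SYSTEMATIC = 1000.0, 0.20, 0.05

def ensemble_value(n):
    """Mean resistance of n parts that all share the same systematic bias."""
    return sum(NOMINAL * (1 + SYSTEMATIC + random.uniform(-TOL, TOL))
               for _ in range(n)) / n

r_top = ensemble_value(400)
r_bottom = ensemble_value(400)
ratio = r_bottom / (r_top + r_bottom)
print(f"top = {r_top:.1f} ohm, bottom = {r_bottom:.1f} ohm, ratio = {ratio:.4f}")
# Both ensembles come out ~5% high in absolute terms, yet the ratio stays
# very close to the ideal 0.5.
```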

Time is another example in which an ensemble is used. There is no single clock that defines what time it is. Instead, more than 300 clocks located at more than 60 timing laboratories around the world are used to obtain a weighted average (the more precise clocks are weighted more heavily). The result is a time measurement that is better than even the best individual clock is capable of, and that gives us a way to measure the performance of the next generation of clocks that is better still -- at which point we use those newer, better clocks in the ensemble to again have a better measurement than any individual one of them can provide.
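A toy version of that idea using simple inverse-variance weighting. The clock qualities are invented, and the real timescale algorithms (such as the one behind TAI) are far more elaborate than this:

```python
import random
import statistics

# Combine clocks of different quality with inverse-variance weights; the
# weighted average scatters less than even the best single clock.

random.seed(3)
TRUE_OFFSET = 0.0                    # the "true" value we are trying to read
sigmas = [1.0] * 200 + [0.3] * 100   # 200 mediocre clocks, 100 good ones

def ensemble_estimate():
    readings = [random.gauss(TRUE_OFFSET, s) for s in sigmas]
    weights = [1.0 / s**2 for s in sigmas]
    return sum(w * r for w, r in zip(weights, readings)) / sum(weights)

trials = [ensemble_estimate() for _ in range(2000)]
print("best single clock sigma :", 0.3)
print("ensemble sigma          :", round(statistics.stdev(trials), 4))
```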

So measurement accuracy and precision improve over time and the standards that define them evolve accordingly.

Pretty much throughout history, but particularly in the last 500 to 1000 years, there have been people whose entire lives were devoted to improving our ability to measure things (the field known as metrology). In the last couple of centuries these efforts have become much more formalized, and today significant resources are put into national measurement laboratories as well as corporate capabilities because of the scientific, technical, and economic impacts involved.
As our ability to make measurements improves, we run into the problem that the standards against which the measurements are made now have too much uncertainty in them, so we redefine the standard to make it more accurate and able to be replicated more precisely for reference purposes.

Temperature is as good an example as any. It's fine to declare that the Celsius temperature scale is defined such that 0 °C and 100 °C are the freezing and boiling points, respectively. But how accurately can we measure when water freezes or when it boils? Back when our measurement limits were a good fraction of a degree Celsius, this was good enough. But if we can measure to a thousandth or a millionth of a degree, this definition is useless. This is why the definition shifted to the triple point of water, which was defined to be exactly 273.16 K. If we can measure the temperature at that point (which does not involve detecting a dynamic phase change, but rather a very static and stable condition) to 0.1 K, our realization of the kelvin is good to roughly one part in 2700. We then define a one-degree change in Celsius to be equal to a change of one kelvin and define the zero of the Celsius scale to be exactly 273.15 K in order to park the 0 °C and 100 °C points acceptably close to the original definition.
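For what it's worth, the arithmetic behind that last claim:

```python
# The triple point of water is *defined* as exactly 273.16 K, so the
# fractional uncertainty of a realization of the kelvin is just
# (how well you can measure that one point) / 273.16.
measurement_uncertainty_K = 0.1
print(measurement_uncertainty_K / 273.16)   # ~3.7e-4, i.e. about 1 part in 2700
```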

But then there is the question of which water gets used. At first, it was just distilled water that had a purity level above a certain amount (I think that was determined by getting its conductivity below some threshold, but I'm not sure). However, at some point our ability to make measurements reached the point where the measured triple point of a sample of water was affected to a discernible degree (no pun intended) by differences in the isotopic make-up of the water. So then came Vienna Standard Mean Ocean Water, which was originally defined as water collected from a variety of fixed ocean locations around the world and then purified. But even that eventually had too much uncertainty, so they defined the specific ratios of isotopes that the water had to have.

But that also had its limits. Plus, there is a push to remove all physical artifacts from the definitions. In addition to making the definitions exact, this also has the benefit that we could, in theory, transmit our definitions to some newly discovered alien civilization and they could reproduce our system of measures. On a more mundane level, it means that any laboratory anywhere can (if they have the equipment) produce measurement standards, and that we are no longer constrained to things like the meter being defined as the distance between two scratches on a particular platinum/iridium bar, held at the melting point of ice, that happened to be located in France (and this was the definition of the unit of length until 1960). The definition of the unit of mass, the kilogram, was a physical hunk of metal that had been determined to be changing over time. It was finally replaced by a definition based on fundamental constants less than five years ago.

So now the kelvin is defined in terms of the Boltzmann constant, which was assigned an exact value chosen such that the triple point of water is still 273.16 K to within the present limits of metrology. At this point, the definition of temperature is completely independent of any physical properties of water (or anything else).
 

MrSalts

Joined Apr 2, 2020
2,767
Hello,
Now I'm sure this info is out there someplace (please no wikipedia links, they're not a good source as wikipedia themselves admit), but I've not yet found it.
When resistors were first made, we got three classifications of tolerance for those carbon resistors: 20%, 10%, and 5%. Likewise with oscilloscopes, the originals were limited to a few kHz AFAIK.
So how did they manage to improve things? I mean, without a precise reference point, how do EEs know if a part is more precise or less precise?
If you only have oscilloscopes capable of handling a few kHz, how do you judge that your new oscilloscope design can accurately measure a higher BW than that?

Thanks!
42 MHz FM radio has been around since 1936, long before oscilloscopes were available that could visualize those signals. So demonstrating the capabilities of a faster and faster scope has been easy, because we've always had signals available that are much faster than the scope can measure.
 