First Digital Phone Call

fernan82

Joined Apr 19, 2014
26
Timing is intrinsic to binary comms too.

Many IR remote binary comms use a long period for a 1 bit, and a short period for the zero bit, with an even longer pause delineating the individual bytes. The entire decoding system is based on timing.
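A decoder for that kind of scheme really is just timing classification. As a sketch (the thresholds and the pulse representation are illustrative, not taken from any real IR remote protocol):

```python
# Hypothetical pulse-width decoder: classifies on-pulse durations (in ms)
# into bits, with a long gap marking a byte boundary. Thresholds are
# illustrative only, not from any real IR protocol.
def decode_pulses(pulses):
    """pulses: list of (duration_ms, is_gap) tuples, in order received."""
    bytes_out, bits = [], []
    for duration, is_gap in pulses:
        if is_gap:
            if duration > 10 and bits:       # long pause = end of byte
                bytes_out.append(bits)
                bits = []
            continue                          # short gaps just separate bits
        bits.append(1 if duration > 1.5 else 0)  # long pulse = 1, short = 0
    if bits:
        bytes_out.append(bits)
    return bytes_out
```

Note that the data coming out is still plain binary; the two gap lengths only delimit it, which is the point being argued here.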

But the data and comms are still officially binary. It doesn't become "trinary" because there are pauses between the binary data blocks.

Other standard binary data comms send a data block of 8 binary bits, separated by a pause, and data packets of X bytes separated by an even longer pause. If you include the 2 pause lengths in the base (as WBahn has done) then you would say it is now base 4 or "quaternary". But it's not, it is still binary data.

And I don't see a difference between separating bytes and packets with two different pauses, and separating letters and words with two different pauses. It does not increase the base of the data.
My opinion is that in those cases the data itself may be binary, but the transmission is not, unless the delays are fixed, like the delays between bits in a digital transmission or between fixed-width frames. Otherwise the delays carry information and are part of the transmission code, so it's not binary.
 

Wendy

Joined Mar 24, 2008
23,429
My opinion.

It is on / off. That is binary. Anything else is just protocols.

Working for Collins Radio I was seeing late model (at the time) analog telephone gear in the mid 80's. I know VOIP existed around then, and we had plenty of digital gear that probably carried phone channels, but technology is like culture, sometimes it takes a while for changes to occur.
 

THE_RB

Joined Feb 11, 2008
5,438
... But, as already pointed out, that clearly is not good enough for you since you are demanding that I must either chase your red herring or concede. It is obviously not acceptable to you for us to just agree to have different viewpoints. Why is that?
Of course it is acceptable for us to have different viewpoints. And that's the most reasonable thing you have said so far, because previously you insisted that I was wrong. That is not "viewpoints"; that is a hardball attitude that you were right and I was wrong.

As for WHY I continued the discussion, it is because if morse is to be classified as > base2 then it has far reaching repercussions. Many modern binary comms standards have matching characteristics to morse, so from your viewpoint they must also be > base2.

So either modern binary comms are NOT binary (because they have binary data and framing elements increase the base), or morse IS binary (because it has binary data and framing elements do not increase the base).

Those are the two "viewpoints" refined down to simplicity.

NSAspook said:
...
As I said before Morse is a non-binary digital code (designed to have rhythmic timing or beat that human brains can easily follow with training) that can be represented using binary OOK (on/off keying) modulation. The fact that it can be translated into binary OOK KEYING symbols does not change the fundamental requirements needed to send information in 'Morse' CODE.
...
Now that looks like a backpedal to me. You are now saying morse is binary if it is represented by symbols, but it is not binary if it is sent with timed keying?

So all other modern binary comms that are sent using timed periods to differentiate between the binary 0 and 1 data, and periods for framing, they are no longer binary comms?

Morse has 36 possible binary data blocks (A-Z and 0-9) which are transmitted in any way we like, with zero or more framing elements to separate data blocks.

ASCII has 128 possible binary data blocks which are transmitted in any way we like, with zero or more framing elements to separate data blocks.

NSAspook said:
...
IMO those of us that see Morse as a non-binary code know the difference between coding methods and keying methods.
Coding methods;
Morse is ENCODED by substitution, using 36 binary data blocks to represent the 36 characters. Its encoding is definitely binary.

Keying methods;
Morse has been sent with different periods. It has been sent using a high tone for a 1 bit, and a low tone for a 0 bit, with the same bit period. And pauses between. It was still morse. Did that different way of keying morse change the base?

It has been sent with only one period length between data blocks. Only three elements: the 2 data bits and a framing period. It has also been represented in text, using 3 characters ("-", ".", framing). Did that change the base to base 3 because they only used 3 symbols? Was it still morse?

Morse is this (Morse code chart image):

Like ASCII and other binary data standards it uses only two binary symbols to represent all transmitted characters. Framing can be added to binary comms as needed or desired. It's external to the data and external to the encoding. It is framing or formatting. It does not change the base of the data!

How is the encoding in the diagram above different from ASCII binary? About the only significant difference is that morse uses variable-length data blocks. ASCII is always 7 binary bits; morse is 1 to 6 binary bits (to speed transmission).

If you think having variable data block length stops it being binary, I can find examples of modern binary comms that also use variable data block length to speed transmission.
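To illustrate the framing view argued above, here is a sketch that renders a message as a pure on/off (binary) keying stream, using the standard Morse unit scheme: dot = 1 unit on, dash = 3 units on, with 1-, 3-, and 7-unit off periods for element, letter, and word framing. Only a few letters are included; the table would extend the same way.

```python
# Sketch: Morse as binary on/off keying (1 = key down, 0 = key up).
# Marks: dot = 1 unit on, dash = 3 units on. Framing: 1 unit off between
# elements, 3 units off between letters, 7 units off between words.
MORSE = {'E': '.', 'T': '-', 'A': '.-', 'N': '-.', 'S': '...', 'O': '---'}

def to_keying(text):
    """Return the on/off timing stream for the message as a 1/0 string."""
    words = []
    for word in text.upper().split():
        codes = ['0'.join('1' if e == '.' else '111' for e in MORSE[c])
                 for c in word]
        words.append('000'.join(codes))      # 3-unit gap between letters
    return '0000000'.join(words)             # 7-unit gap between words
```

For example, `to_keying('ET')` yields `1000111`: a 1-unit mark, a 3-unit letter gap, then a 3-unit mark. Whether the framing gaps count as extra symbols is exactly the disagreement in this thread.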

I'm not sure why people think morse has different rules than every other binary comms type. I think it's probably a hangover from analysing morse in the old days, before binary comms were so widespread in so many forms and we had better-defined rules.
:)
 
Last edited:

JoeJester

Joined Apr 26, 2005
4,390
Morse is ENCODED by substitution, using 36 binary data blocks to represent the 36 characters. Its encoding is definitely binary.
Your example illustrates 41 "DATA BLOCKS".

None of this is germane to the OP's question about the first digital telephone call.

We could have started with smoke signals, worked our way through radio-telegraphy, and all the secure phone schemes out there.

Was it the first "digital phone" call or the first "digital" phone call? I lost track in these five pages.

I'd go with whoever designed the rotary phone as the first "digital" phone call.
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,315
Your example illustrates 41 "DATA BLOCKS".

I'd go with whoever designed the rotary phone as the first "digital" phone call.
I had the misfortune to rebuild one of the old rotary military phone exchanges with the Strowger step-by-step relays for rotary phones. The mechanical logic was pretty impressive, as was the amount of oil and special tools needed to keep them running at full speed.
https://www.youtube.com/watch?v=xZePwin92cI
 

Thread Starter

djsfantasi

Joined Apr 11, 2010
9,163
My understanding of AnalogKid's question was the first "digital phone" call, i.e., transmission of voice digitally. But you are right; the question is vague on that point, and since it was identified as a trivia question, that makes me wonder.
 

atferrari

Joined Jan 6, 2004
4,771
I had the misfortune to rebuild one of the old rotary military phone exchanges with the Strowger step by step relays for rotary phones. The mechanical logic was pretty impressive as was the amount of oil and special tools needed to keep them running at full speed.
https://www.youtube.com/watch?v=xZePwin92cI
Around 1970, I briefly visited a central office here in Buenos Aires that still used the pulse system. What impressed me the most was the noise (a big central with many subscribers).

I heard that technicians used to carry nail files to clean the contacts. It seemed a rather crude method to me.
 
Last edited:

nsaspook

Joined Aug 27, 2009
13,315
Around 1970, I briefly visited a central office here in Buenos Aires that still used the pulse system. What impressed me the most was the noise (a big central with many subscribers).

I heard that technicians used to carry nail files to clean the contacts. It seemed a rather crude method to me.
I doubt that; a nail file would ruin a wiping contact on the first use. We used a contact burnishing tool:
http://www.pkneuses.com/www.pkneuses.com/cont.htm

We had to test our rebuilt exchange with number-pulse test inputs on all lines to verify the calling matrix. What a racket that made! Using digital for switching and encoding calls was a no-brainer.
 

AnalogKid

Joined Aug 1, 2013
11,056
Holy Self-Clocked, Binary-Level, Pulse-Width Modulated, Return-Zero Data Stream, Batman!

The thing with Morse Code is that there is a temporal component to the data stream, basically a 3-state PWM, and it requires a phase-locked detector. The PLL that locks on to and extracts the clock and tracks the temporal component is the wetware listening to the radio. Think about designing and testing an analog PLL circuit that has a 10:1 or 15:1 capture range. Ouch.
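A software stand-in for that wetware PLL might crudely estimate the dot unit from the shortest mark seen, then bucket each mark and space against it. This is a toy classifier, not a real clock-recovery loop, and the thresholds are nominal Morse ratios (dash ≈ 3 units, letter gap ≈ 3, word gap ≈ 7):

```python
# Toy Morse timing classifier: estimates the dot unit from the shortest
# mark, then buckets marks (dot/dash) and spaces (element/letter/word gaps).
def classify(durations):
    """durations: list of (length, is_mark) tuples. Returns a symbol string."""
    unit = min(d for d, is_mark in durations if is_mark)  # crude clock estimate
    out = []
    for d, is_mark in durations:
        if is_mark:
            out.append('.' if d < 2 * unit else '-')
        elif d >= 5 * unit:
            out.append(' / ')     # word gap (nominally 7 units)
        elif d >= 2 * unit:
            out.append(' ')       # letter gap (nominally 3 units)
        # ~1-unit element gaps carry no symbol and are dropped
    return ''.join(out)
```

A real decoder (or listener) has to track the unit adaptively as the sender's speed drifts, which is where the wide-capture-range PLL analogy bites.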

And it wasn't a trick question. I meant a phone call as in what a phone call was decades ago: pick up the handset, signal an operator, get a connection over a wire pair, talk; and digital as in a numeric quantization of a continuous function. You know...digital.

In the 90's (1990's to be clear in this thread), Analog Devices started working analog function history into their seminars and app notes. From this we know that an analog multiplexer for telephony was patented in the late 1800's. They figured out that it had to switch faster than voice freqs or the audio went to crap, but it was decades later that Claude did the math.

What started all of this was stuff I picked up from one of the Analog Devices blurbs, that PCM was patented in the 20's, and the first digital phone call was between Roosevelt and Churchill, and it was some form of PCM.

ak
 

JoeJester

Joined Apr 26, 2005
4,390
What started all of this was stuff I picked up from one of the Analog Devices blurbs, that PCM was patented in the 20's, and the first digital phone call was between Roosevelt and Churchill, and it was some form of PCM.
Wiki supports NSAspook's assertion:

The first transmission of speech by digital techniques was the SIGSALY encryption equipment used for high-level Allied communications during World War II. In 1943, the Bell Labs researchers who designed the SIGSALY system became aware of the use of PCM binary coding as already proposed by Alec Reeves. In 1949 for the Canadian Navy's DATAR system, Ferranti Canada built a working PCM radio system that was able to transmit digitized radar data over long distances.[8]
 

fernan82

Joined Apr 19, 2014
26
There is an earlier patent for wireless (so does not count as a valid answer) that uses an early version of spread spectrum. See Wikipedia article on Austrian/American actress Hedy Lamarr.

She developed a frequency hopping protocol to control torpedoes.
She did not develop a protocol. She had the idea of using the same technology used in player pianos (basically the same technology used in music-playing jewelry boxes) to toggle switches for changing frequencies, and then filed a patent. I'm not sure it was ever used. It also wasn't digital (though technically it can be used in digital systems). BTW, she was my neighbor for a while.
 