When did the computer become a digital computer?

Thread Starter

Werapon Pat

Joined Jan 14, 2018
35
As far as I know, the computer has been built with these technologies in sequence: relay --> vacuum tube --> transistor.
But when did it become a digital computer? I wonder.
 

jpanhalt

Joined Jan 18, 2008
11,087
I don't think your timeline is accurate; it depends on your source and how you define "computer." It can be argued that a digital computer existed before an analogue computer (compare the abacus with the Antikythera mechanism, https://en.wikipedia.org/wiki/Computer). Of course, terms such as computer versus calculator need to be carefully defined. How do you distinguish between a calculator and a computer?

Have you tried Google to get your answer?
 

WBahn

Joined Mar 31, 2012
29,979
As far as I know, the computer has been built with these technologies in sequence: relay --> vacuum tube --> transistor.
But when did it become a digital computer? I wonder.
You need to carefully define what you mean by "digital", "computer", and "digital computer" (noting that the third does not necessarily follow from simply combining the first two).

The first is fairly straightforward, as it usually involves distinguishing between working with discrete quantities versus continuous quantities. But the second gets really messy really quickly. What does and does not qualify as a "computer" for your purposes? The original use of the term "computer", after all, referred to humans who performed computations. Then it referred to either humans or machines that did so, and later just to machines. So when talking about the history of computers, you need to be clear whether you mean to apply a definition contemporary to the period in question or a modern one. Similarly, does a computer need to be programmable to satisfy your inquiry, or merely able to carry out an algorithm?
 

MaxHeadRoom

Joined Jul 18, 2013
28,619
One major contributor to the digital side was the Englishman George Boole, whose algebra of logic, now called Boolean algebra, is basic to the design of digital computer circuits.
Unfortunately it saw little use until after his death and the creation of modern computer systems.
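To make the connection concrete, here's a quick Python sketch (my own toy illustration, nothing from Boole himself) of a one-bit full adder built purely from Boolean operations, the same expressions a hardware designer would wire up as gates:

# Toy one-bit full adder: each "gate" is just a Boolean expression.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                          # sum bit: XOR of the three inputs
    carry_out = (a & b) | (carry_in & (a ^ b))    # carry logic: AND/OR of the inputs
    return s, carry_out

# Exhaustive check against ordinary integer addition.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            s, cout = full_adder(a, b, c)
            assert s + 2 * cout == a + b + c
print("full adder agrees with integer addition on all 8 input combinations")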
Max.
 

atferrari

Joined Jan 6, 2004
4,764
One major contributor to the digital side was the Englishman George Boole, whose algebra of logic, now called Boolean algebra, is basic to the design of digital computer circuits.
Unfortunately it saw little use until after his death and the creation of modern computer systems.
Max.
IIRC, it was used in the implementation of phone exchanges by circa 1870.
 

BR-549

Joined Sep 22, 2013
4,928
Both came together. It totally depends on your measuring device or unit. If your measuring reference is discrete.....like a grain of sand or a drop of water.......the output will be an integral multiple of that discrete unit.

But if your input can be continuously variable......like the length of a string or part of a rotation......then the output will be continuously variable and not discrete.

It's not when it came....it's how you want to calculate. Watching a sundial shadow is not digital. Sand in an hourglass is.
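If it helps, here's the same idea as a small Python sketch (my own illustration): the shadow's angle is continuously variable, but counting discrete grains against it makes the measurement digital.

# A continuously variable quantity: an idealized sundial shadow angle.
def shadow_angle(hours_since_sunrise):
    return 180.0 * hours_since_sunrise / 12.0    # degrees over a 12-hour day

# A discrete reference: one-minute "grains" of sand. Measuring against
# it turns the continuous quantity into a whole number of grains.
def grains_elapsed(hours_since_sunrise, grain_minutes=1):
    return int(hours_since_sunrise * 60 // grain_minutes)

t = 3.7   # hours after sunrise, continuously variable
print(shadow_angle(t))     # 55.5 degrees (analog reading)
print(grains_elapsed(t))   # 222 grains (digital reading)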
 

cmartinez

Joined Jan 17, 2007
8,220
If you like to read books of fiction with an element of historical truth in them, I suggest you take a look at Cryptonomicon, by Neal Stephenson.

It's one of the greatest epic computer adventures ever written. The book is quite enjoyable, if a bit outdated.
 

bogosort

Joined Sep 24, 2011
696
One major contributor to the digital side was the Englishman George Boole, whose algebra of logic, now called Boolean algebra, is basic to the design of digital computer circuits.
Unfortunately it saw little use until after his death and the creation of modern computer systems.
Max.
Boole designed his system as a calculus of logic, what today we call propositional, or zeroth-order, logic. At the time (mid 19th century), it was notable because it was the first fleshed-out algorithmic system of logical thought. Though Leibniz had played around with similar ideas, he never took it far enough to see its true implications. Up until Boole, logic was decidedly Aristotelian: a qualitative set of syllogisms rather than a formal system of thought. Though Boole's presentation was somewhat controversial at the time, the potential for "mechanizing truth" was clear to everyone, even if they lacked any notion of the computers of the far future. In short, Boole's system revolutionized and revitalized logic by bringing it under the scope of mathematics. In turn, mathematics was revolutionized through the introduction of formal logic and set theory.

Interestingly, though, Boole's system wasn't an algebra (in the technical sense of a vector space equipped with a multiplication of vectors). It wasn't even a well-defined structure: Boole didn't provide any axioms; his addition was defined only for disjoint classes, so he rejected the idempotent law x + x = x that is a hallmark of what today we call boolean algebra (though his "index law" xx = x survives); and he used several mathematically and logically nonsensical constructions, such as x/y. Today, of course, boolean algebra is well-known and well-defined, though Boole wouldn't recognize much of it. We now know that there are infinitely many boolean algebras, vastly different-looking, with the canonical example being the two-element algebra ({0, 1}, ∧, ∨, ~); every finite boolean algebra is isomorphic to the powerset algebra of some finite set. For example, for some set S, the set of all subsets of S -- i.e., the powerset of S -- is a boolean algebra, with the empty set as zero and S itself as the unit. As another example, the divisors of a square-free number -- a positive integer that is the product of distinct primes -- form a boolean algebra, with greatest common divisor as ∧ ("meet") and least common multiple as ∨ ("join").
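For the curious, here's a small Python sketch (my own spot-check, not a proof) verifying the square-free example for n = 30:

from math import gcd
from itertools import combinations

n = 30   # 2 * 3 * 5, square-free
divisors = [d for d in range(1, n + 1) if n % d == 0]

def lcm(a, b):
    return a * b // gcd(a, b)

def complement(d):       # the "not" of this algebra
    return n // d

# The zero element is 1 and the unit is n; spot-check the defining laws.
for a, b in combinations(divisors, 2):
    assert gcd(a, complement(a)) == 1                                  # a MEET (not a) = zero
    assert lcm(a, complement(a)) == n                                  # a JOIN (not a) = unit
    assert complement(gcd(a, b)) == lcm(complement(a), complement(b))  # De Morgan
print(sorted(divisors))  # [1, 2, 3, 5, 6, 10, 15, 30]: mirrors the powerset of {2, 3, 5}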

Boolean algebras are surprisingly fertile mathematical structures: they are intimately related to boolean rings (with "XOR addition" replacing "OR addition"). Every boolean algebra also carries a natural partial order (x ≤ y exactly when x ∧ y = x), under which it forms a boolean lattice. And of course boolean circuits have tremendous importance in information theory and complexity theory, to say nothing of their role in digital computing!
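And here's a quick Python sketch (again my own illustration) of the boolean-ring correspondence: over {0, 1}, AND is ring multiplication, XOR is ring addition, and OR and NOT fall out of them.

# Over {0, 1}: AND is ring multiplication, XOR is ring addition.
AND = lambda x, y: x & y
XOR = lambda x, y: x ^ y
OR  = lambda x, y: XOR(XOR(x, y), AND(x, y))   # x + y + xy recovers OR
NOT = lambda x: XOR(x, 1)                      # 1 + x recovers NOT

for x in (0, 1):
    for y in (0, 1):
        assert OR(x, y) == (x | y)
        assert NOT(x) == 1 - x
        assert XOR(x, x) == 0    # each element is its own additive inverse
        assert AND(x, x) == x    # idempotent multiplication: Boole's index law xx = x
print("ring operations reproduce the boolean algebra on {0, 1}")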
 

MaxHeadRoom

Joined Jul 18, 2013
28,619
Boole designed his system as a calculus of logic,........etc etc ............, to say nothing of their role in digital computing!
I did not know that! :)

It always surprises me how so many of the best-known names and innovators of the past (Boole, Faraday, John Harrison, etc.) had no formal education or training and were basically self-taught.
Max.
 