digital computer

Discussion in 'General Electronics Chat' started by jerry05, Jun 24, 2013.

1. jerry05 Thread Starter New Member

How does a computer convert its logic into letters of the alphabet that we can see?

2. MrChips Moderator

Hmm, good question.

In order to answer this properly one would have to cover a lot of ground in digital electronics and computer basics.

Information is stored and manipulated on the computer using bits. A single bit can store only one piece of information, ON or OFF, for example DAY vs NIGHT.

When we put 8 bits together, there are 256 possible combinations, 2x2x2x2x2x2x2x2 = 256. Hence we can store 256 different values, such as 256 colors or 256 letters of the alphabet.
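The doubling in that multiplication is easy to check for yourself. A quick Python sketch (my own illustration, not part of the original post):

```python
# Each added bit doubles the number of possible combinations.
combos = 1
for bit in range(8):
    combos *= 2
print(combos)   # 256 possible values in one 8-bit byte
```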

We know that there are only 26 letters in the alphabet, so what else can we store? Well, we have to remember there are lower- and upper-case letters, so that makes 52. Then there are 10 numerals and a whole bunch of punctuation marks.

128 different codes are all we really need. That leaves another 128 for whatever we wish, such as special graphics symbols.
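You can tally those character counts with Python's standard `string` module (a sketch of my own, just to confirm the arithmetic):

```python
import string

letters = string.ascii_letters   # 26 lower case + 26 upper case
digits = string.digits           # 0-9
punct = string.punctuation       # the usual punctuation marks

total = len(letters) + len(digits) + len(punct)
print(len(letters), len(digits), len(punct), total)   # 52 10 32 94
```

94 printable symbols (95 counting the space), comfortably inside 128 codes.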

But what about other languages: Greek, Cyrillic, Arabic, Chinese, Japanese, etc.?
To accommodate these, computer systems have resorted to 16-bit codes, which can represent 65,536 different symbols.

The next step is how do we go from an 8-bit code to a letter on the screen?
The letter "A" is represented by code 65.
When a computer system encounters code 65, it goes to a look-up table (similar to looking for a specific flash card) that contains the shape of the requested character. The computer then "draws" the image of the character on the screen. How the character is drawn depends on the type of display. For example, a 2x16 LCD character display would use a different scheme than a PC's LCD screen.
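Here is a toy version of that look-up table idea in Python. The 5x5 bit pattern for "A" is made up for illustration; real font tables work the same way, just with larger bitmaps and an entry for every code:

```python
# A tiny, hypothetical font table: character code -> rows of bits
# (1 = pixel on, 0 = pixel off).
FONT = {
    65: [0b01110,   # code 65 = "A"
         0b10001,
         0b11111,
         0b10001,
         0b10001],
}

def draw(code):
    """Look up the code and 'draw' its bitmap, one row at a time."""
    for row in FONT[code]:
        print(''.join('#' if row & (1 << (4 - col)) else ' '
                      for col in range(5)))

draw(65)   # prints a small letter A made of '#' characters
```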

Ramussons likes this.
3. CVMichael Senior Member

In a computer, absolutely everything is stored as bits... 8 bits make a byte (0 to 255). When you need larger numbers, you just put 2 bytes together, or 4 bytes, and so on...
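Putting two bytes together just means the first byte counts in units of 256. A quick sketch of my own in Python:

```python
# Two 8-bit bytes side by side form a 16-bit value:
# value = high_byte * 256 + low_byte
high, low = 0x12, 0x34
value = high * 256 + low
print(value)   # 4660, i.e. 0x1234

# Python can do the same combination directly from raw bytes:
print(int.from_bytes(bytes([high, low]), 'big'))   # 4660
```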

In the old days, fonts were stored as an array/table of bitmap images. So when you type the letter "A" on your keyboard, the keypress gets translated into the number 65 inside the computer. The computer then looks into the array of images, picks the 65th image, and draws it where the cursor is... This mapping is called the ASCII table.

Nowadays fonts are represented mathematically and drawn on the display using vector graphics. This way, when you change the font size or "zoom in" on a font, it will not look pixelated, because the computer recalculates/redraws the characters every time. These are called "TrueType" fonts.
See this: http://www.howstuffworks.com/question460.htm
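The math behind those vector outlines is mostly quadratic Bezier curves: each curved stroke is defined by a start point, a control point, and an end point, and the computer can compute a point anywhere along the curve at any size. A minimal sketch (the coordinates are made up, not from any real glyph):

```python
def quad_bezier(p0, p1, p2, t):
    """Point on a quadratic Bezier curve at parameter t in [0, 1]."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return (x, y)

# One hypothetical curved stroke: start, control point, end.
start, control, end = (0, 0), (5, 10), (10, 0)
for t in (0.0, 0.5, 1.0):
    print(quad_bezier(start, control, end, t))
```

Because the curve is recomputed from these few points at whatever resolution is needed, it stays smooth at any zoom level.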

Also, lately text is stored as Unicode. It uses 16 bits (2 bytes = 65,536 total characters) to store the information. This gives a whole lot more room for other symbols and for languages like Chinese, etc... See the "Character Map" application on your computer.
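You can see those 16-bit codes directly in Python: every character has a Unicode number ("code point"), and the UTF-16 encoding stores each of these characters in exactly 2 bytes:

```python
# Each character's Unicode number, and its 2-byte UTF-16 encoding.
for ch in ('A', 'Ω', '中'):
    print(ch, ord(ch), ch.encode('utf-16-be'))
# 'A' is still 65 -- the first 128 Unicode codes match ASCII.
```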

Last edited: Jun 24, 2013
bug13 likes this.
4. PackratKing Well-Known Member

... And all this at the speed of light... or close...