How do computers convert the ASCII codes for a number into the correct Binary number

Thread Starter

ThatComputerGuy

Joined Jun 13, 2018
8
hi,

I'm currently designing a CPU in Logisim. I'm far from done, but I just needed to know this:
how does a computer turn "23" (00110010 00110011 in ASCII) into the binary number for 23 (10111),
and how does this work the other way around?
 

Thread Starter

ThatComputerGuy

Joined Jun 13, 2018
8
hi TCG,
Is this a college assignment?
E
Not really. I'm almost in the last year of secondary education, and I want to study something involving digital logic after that, so this is just something I do in my spare time. I do have to do a project in the final year (next year for me, I hope :) ), and it has to relate to what you want to study after secondary education, so I could also use my design for that project :)
 

MrChips

Joined Oct 2, 2009
30,720
You take one step at a time.

"2" is transmitted as 0011 0010. This is also the binary representation of the integer value 50.

First you test that the received character is within the acceptable range for numerals, i.e. from 48 to 57.
Subtract the base value 48 from the received character to get the numerical value of the digit.

You do the same for "3" which is transmitted as 0011 0011.
(And you do the same thing for every subsequent character received.)

You create a holding register R that is initialized to zero.
When you receive a character that has been validated as a numeral and converted to its numerical value as described above, you multiply R by 10 and add the new digit value.

R will contain your final result.

The reverse conversion is similar, with the steps done in reverse order.
I can describe this if you are not able to do so.
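The steps above can be sketched in C (a minimal sketch; the function name ascii_to_int is mine, not from the thread):

```c
/* ASCII digit string -> binary value, following the steps above. */
int ascii_to_int(const char *s)
{
    int r = 0;                        /* holding register R, initialized to zero */

    while (*s >= '0' && *s <= '9') {  /* range check: ASCII 48..57 */
        r = r * 10 + (*s - '0');      /* multiply R by 10, add the digit value */
        s++;
    }
    return r;
}
```

The loop stops at the first non-numeral character, which doubles as the validity test.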
 

dl324

Joined Mar 30, 2015
16,846
I'm currently designing a CPU in Logisim. I'm far from done, but I just needed to know this:
how does a computer turn "23" (00110010 00110011 in ASCII) into the binary number for 23 (10111),
and how does this work the other way around?
The computer doesn't do it; the program consuming the data does it.

One way to convert an ASCII character for a number to a binary number is to use atoi() (in C).

Another way is to subtract the ASCII value for zero ('0') from the ASCII value of the digit character.
Code:
  char n;
  int num;
  ...
  num = n - '0';
If you want to do the conversion the other direction, you can use sprintf() to put it in a variable, or printf() to print it to stdout or fprintf() to a file.
 

MrChips

Joined Oct 2, 2009
30,720
The computer doesn't do it; the program consuming the data does it.

One way to convert an ASCII character for a number to a binary number is to use atoi() (in C).

Another way is to subtract the ASCII value for zero from the value of some other number.
Code:
  char n;
  int num;
  ...
  num = n - '0';
If you want to do the conversion the other direction, you can use sprintf() to put it in a variable, or printf() to print it to stdout or fprintf() to a file.
The problem with atoi( ) and sprintf( ) and all similar library functions is that the user may not have a clue as to what is actually going on behind the scenes. It is a much better learning exercise to do each conversion manually, with pen and paper, especially if one intends to implement this in hardware.
 

MrChips

Joined Oct 2, 2009
30,720
btw, ASCII-to-Decimal conversion and Decimal-to-ASCII conversion are classic first-month exercises in many programming courses (in both low-level and high-level languages).

In our introductory programming courses, the use of library functions is prohibited.
 

Thread Starter

ThatComputerGuy

Joined Jun 13, 2018
8
You take one step at a time.

"2" is transmitted as 0011 0010. This is also the binary representation of the integer value 50.

First you test that the received character is within the acceptable range for numerals, i.e. from 48 to 57.
Subtract the base value 48 from the received character to get the numerical value of the digit.

You do the same for "3" which is transmitted as 0011 0011.
(And you do the same thing for every subsequent character received.)

You create a holding register R that is initialized to zero.
When you receive a character that has been validated as a numeral and converted to its numerical value as described above, you multiply R by 10 and add the new digit value.

R will contain your final result.

The reverse conversion is similar, with the steps done in reverse order.
I can describe this if you are not able to do so.
Yes, I think I need help with the reverse process. Could you maybe give an example?
 

MrChips

Joined Oct 2, 2009
30,720
yes, i think i need help with the reverse process, could you maybe give an example?
In the reverse process, Decimal-to-ASCII, you need to break the value down into its decimal digits, i.e. units of powers of 10.
There are two ways to do this: either divide repeatedly by 10, or divide by 10000, 1000, 100 and 10 in turn.
In the latter case, the digits are extracted in the correct order.
In the former case, the digits come out in reverse order, so you have to save them somewhere; in ASM, one can push them onto a stack.
I like the former.

Divide by 10.
Save the remainder on a LIFO stack.
If the quotient is zero, you're done, else repeat.

Now pop a digit off the stack and add 48 to convert it to ASCII.
Repeat until the stack is empty.
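That divide-by-10 loop, sketched in C (the function name and the fixed-size stack are my choices; 10 digits is enough for a 32-bit value):

```c
/* Integer -> ASCII digit string by repeated division by 10.
   Remainders come out least significant digit first, so they are
   pushed on a small stack and popped back off in the right order. */
void int_to_ascii(unsigned val, char *out)
{
    char stack[10];
    int top = 0;

    do {
        stack[top++] = (char)(val % 10 + '0');  /* remainder + 48 -> ASCII */
        val /= 10;
    } while (val != 0);                         /* quotient zero: done */

    while (top > 0)                             /* pop: reverses the order */
        *out++ = stack[--top];
    *out = '\0';
}
```

Note the do/while: it guarantees at least one digit, so an input of 0 still produces "0".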
 

joeyd999

Joined Jun 6, 2011
5,237
I do it in two steps. First, binary to BCD, then BCD to ASCII.

Here's my PIC code for the first part:

Code:
;*******************************************************
;** BIN2BCD -- Convert temp2:0 to BCD in lcdbcd[4]    **
;*******************************************************

bin2bcd    clrn    lcdbcd,4    ;preclear BCD result
    movlf    bitcnt,24    ;24 bits (3 bytes) to convert

;double current bcd value

b2blp    movf    lcdbcd,w
    addwf    lcdbcd,w
    daw
    movwf    lcdbcd

    movf    lcdbcd+1,w
    addwfc    lcdbcd+1,w
    daw
    movwf    lcdbcd+1

    movf    lcdbcd+2,w
    addwfc    lcdbcd+2,w
    daw
    movwf    lcdbcd+2

    movf    lcdbcd+3,w
    addwfc    lcdbcd+3,w
    daw
    movwf    lcdbcd+3

    rlcf    temp0,f        ;rotate out the high bit
    rlcf    temp1,f
    rlcf    temp2,f
    skpnc
    incf    lcdbcd,f    ;if 1, add it in to the bcd value
  
    djnz    bitcnt,b2blp    ;do for all bits

    return

;**********  End BIN2BCD  ***************
And the second part:

Code:
;***********************************************************
;** TXBCDA -- Transmit BCD in LCDBCD[4] to RS232 in ASCII **
;***********************************************************

txbina    lfsr    1,lcdbcd+3
    movlf    bitcnt,4

prstxlp    swapf    indf1,w
    rcall    txascii
    movfw    postdec1
    rcall    txascii

    djnz    bitcnt,prstxlp

    return

txascii    andlw    0x0f            ;mask low byte
    addlw    '0'            ;make ascii
    rcall    tx2byt            ;send ascii

    return
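In C terms, the nibble-unpacking that txbina and txascii perform looks roughly like this (a sketch; the function name is mine, and it writes to a string instead of transmitting):

```c
#include <stdint.h>

/* Unpack 8 packed BCD digits (most significant nibble first) into ASCII. */
void bcd2ascii(uint32_t bcd, char out[9])
{
    for (int i = 0; i < 8; i++)      /* high nibble first, as txbina does */
        out[i] = (char)(((bcd >> (4 * (7 - i))) & 0x0F) + '0');  /* mask, add '0' */
    out[8] = '\0';
}
```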
 

joeyd999

Joined Jun 6, 2011
5,237
Here are a couple of links to 16-bit BIN2BCD and 17-bit BIN-to-BCD conversions. The latter is important if you want the full 5-digit decimal range (up to 99,999), which needs 17 bits.
I should have pointed out that the code I posted above converts 24 bits to 8 packed BCD digits and then to 8 ASCII digits, handling values up to (2^24)-1, or 16,777,215. It can easily be extended for more bits/digits.
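The double-the-BCD-value-and-daw loop in the BIN2BCD code above is essentially the classic "double dabble" (shift-and-add-3) technique; a C sketch of the same idea, assuming a 24-bit input and 8 packed BCD digits held in one 32-bit word:

```c
#include <stdint.h>

/* 24-bit binary -> 8 packed BCD digits via double dabble:
   before each left shift, add 3 to any BCD nibble >= 5 so the
   doubled digit carries correctly into the next nibble. */
uint32_t bin2bcd24(uint32_t bin)
{
    uint32_t bcd = 0;

    for (int i = 23; i >= 0; i--) {
        for (int n = 0; n < 8; n++) {           /* decimal-adjust each nibble */
            if (((bcd >> (4 * n)) & 0xF) >= 5)
                bcd += (uint32_t)3 << (4 * n);
        }
        bcd = (bcd << 1) | ((bin >> i) & 1);    /* shift in the next bit */
    }
    return bcd;
}
```

Reading the result in hex gives the decimal digits directly, e.g. an input of 16,777,215 yields 0x16777215.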
 