# Re: Machines and people

Paddy Hackett wrote:

>In view of this could anybody tell me how we get from the stage of bits
>to the letters of say the English language. In short how do bits,
>Boolean Algebra lead to letters such as a,b,c,..etc.

If I understand what you're asking ....

Computer manufacturers have gotten together and agreed on a standard (or
standards), for example, ASCII.

This standard specifies that a certain sequence of bits stands for the
character 'A'; another sequence stands for the character 'a'. Still
another sequence stands for the character (but not the number) '5'.
(Strictly, ASCII defines 7-bit codes, though they're almost always
stored in an 8-bit byte.) There's nothing magic about these sequences;
it's just what the computer designers have agreed on.
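To make that concrete, here's a small Python sketch that prints the agreed-upon ASCII code for a few characters, both as a decimal number and as an 8-bit pattern:

```python
# Each character maps to an agreed-upon numeric code (ASCII);
# format(code, '08b') shows that code as an 8-bit binary string.
for ch in ['A', 'a', '5']:
    code = ord(ch)  # the numeric code the standard assigns to ch
    print(ch, code, format(code, '08b'))
# Prints:
# A 65 01000001
# a 97 01100001
# 5 53 00110101
```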

As far as representing numbers (values, not characters), those
designers have agreed on another standard (binary representation,
generally). This is a mathematical "formula" that is easily implemented
in electronic circuits.
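The binary rule works in both directions, as this quick sketch shows:

```python
# Read a bit pattern as an unsigned binary (base-2) number, and go back.
value = int('10111101', 2)   # 128 + 32 + 16 + 8 + 4 + 1
print(value)                 # 189
print(format(189, '08b'))    # '10111101'
```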

As far as representing computer instructions ("add these two numbers",
"move this number to that location", etc), the CPU designer (Intel, AMD,
etc) has their own code. For example, the code sequence '10111101' might
mean 'double the number in the AX register' (it's been a long time since
I've looked at this stuff, so I'm just making up numbers at random -
don't accept this as authoritative).
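In the same spirit as the made-up opcode above, here's a toy decoder for an invented instruction set. The opcodes and mnemonics are fabricated for illustration only; real Intel/AMD encodings are far more involved:

```python
# A toy opcode table for a made-up instruction set. Real CPUs do this
# in hardware, but the principle is the same: a lookup from bit
# patterns to meanings that the designer chose by fiat.
OPCODES = {
    0b10111101: 'double AX',   # invented, echoing the example in the text
    0b00000001: 'add',         # invented
    0b00000010: 'move',        # invented
}

def decode(byte):
    """Return the (made-up) meaning assigned to this bit pattern."""
    return OPCODES.get(byte, 'unknown instruction')

print(decode(0b10111101))   # 'double AX'
```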

So, depending on the context, the sequence '10111101' might mean 189, or
it might mean 'double the number in the AX register', or it might mean
a character, such as '½' in the Latin-1 character set (plain ASCII only
covers 7-bit codes, so 189 falls in an 8-bit extension).
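The same byte, read under different agreed-upon contexts (using Latin-1, where byte 189 happens to be '½'):

```python
# One byte, different readings, depending on which convention applies.
b = 0b10111101
print(b)                             # as a number: 189
print(bytes([b]).decode('latin-1'))  # as a Latin-1 character: '½'
# As a CPU instruction, it would mean whatever the CPU designer
# happened to assign to this opcode.
```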

It's important to remember that the symbol '5' is not the value five;
it's a _symbol_ that represents five things. Thus, the value 5 is
stored differently in binary (00000101) than the symbol '5' (ASCII
code 53, or 00110101).
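A two-line check of that distinction:

```python
# The value five versus the symbol '5': different bit patterns.
print(format(5, '08b'))         # the value:  00000101
print(format(ord('5'), '08b'))  # the symbol: 00110101 (ASCII code 53)
```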

The basic answer to your question is that the bits get to be English
letters because the computer manufacturers have agreed that a certain
sequence of bits stands for a certain English letter. No magic; just
consensus.

--
K
