binary code

0div0

Senior member
Mar 18, 2000
262
0
0
I'm just starting a computer programming class and we were talking about binary code. How exactly does the computer know the difference between a number and a letter, since every letter is assigned a certain number (ASCII, right?)? My teacher couldn't answer, but I know someone on here will.
 

br0wn

Senior member
Jun 22, 2000
572
0
0
The computer doesn't know the difference.

It only sees binary (it executes binary), if that's what you mean.

If you are talking about computer languages, every number and letter will eventually be converted to assembly code by the compiler, and this assembly code will be converted into machine code, or binary, by the assembler (sometimes that is part of the compiler).

 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
I'm not entirely clear on what you mean.

Do you mean: since everything a computer deals with, be it numbers, text or graphics, is represented by a series of numbers, how does it tell the difference?

The answer is by the context in which it is used.

Take the following simple program for example:

Let A = 10
Let B = "ABC"

In this case the computer stores 10 in A, and also stores the fact that A is storing a number.

The computer stores the numbers 65, 66, 67 in B, and stores that B is a series of letters.

This way, if you ask the computer to display the contents of A or B, it can display it correctly.
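A quick C sketch makes the idea concrete (my own illustration, not part of the original example): the very same bit pattern prints as a number or as a letter depending on the context the program supplies.

#include <stdio.h>

int main(void)
{
    unsigned char x = 65;   /* just the bit pattern 0100 0001 */
    printf("%d\n", x);      /* treated as a number: prints 65 */
    printf("%c\n", x);      /* treated as a letter: prints A  */
    return 0;
}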
 

konichiwa

Lifer
Oct 9, 1999
15,077
2
0
I think what you're missing is that ASCII is the middle-man, so to speak. The actual processor only understands binary: 1s and 0s. Every number, letter, graphic, movie and anything else on your computer is made up of a bunch of 1s and 0s (for on and off).

ASCII is a translator of sorts between binary and standard text. Each character is assigned an ASCII value, which is then represented in binary for the computer to understand.
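Here is a small C sketch of that chain (illustrative only): a character literal in C already is its ASCII value, and the loop prints the binary the processor actually sees.

#include <stdio.h>

int main(void)
{
    unsigned char c = 'A';              /* ASCII 65 */
    printf("'%c' = %d = ", c, c);
    for (int bit = 7; bit >= 0; bit--)  /* print the 8 bits, high to low */
        putchar(((c >> bit) & 1) ? '1' : '0');
    putchar('\n');                      /* prints: 'A' = 65 = 01000001 */
    return 0;
}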
 

br0wn

Senior member
Jun 22, 2000
572
0
0
ASCII is a 7-bit code, usually stored in 8-bit bytes, that assigns a number to each character.

letters :
A = 65 (in ASCII) = 0100 0001 (in binary)
B = 66 (in ASCII) = 0100 0010 (in binary)
...
Z = 90 (in ASCII) = 0101 1010 (in binary)
...
a = 97 (in ASCII) = 0110 0001 (in binary)
b = 98 (in ASCII) = 0110 0010 (in binary)
...
z = 122 (in ASCII) = 0111 1010 (in binary)

numbers :
0 = 48 (in ASCII) = 0011 0000 (in binary)
1 = 49 (in ASCII) = 0011 0001 (in binary)
...
9 = 57 (in ASCII) = 0011 1001 (in binary)

So all letters and digits are characters, and they are converted into binary (which is what the computer understands) according to the ASCII convention.
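One detail worth noticing in that table: the character '9' is not the number 9. A small C sketch of the difference (my example), including the classic c - '0' trick for converting between them:

#include <stdio.h>

int main(void)
{
    char c = '9';      /* the character: ASCII code 57 */
    int  n = c - '0';  /* the number 9 (57 - 48)       */
    printf("character %c = code %d, value %d\n", c, c, n);
    return 0;
}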

 

Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
"How exactly does the computer know the difference between a number and a letter since every letter is assigned a certain number(ASCII right?)?"

He's asking how the computer differentiates between, say, the number 9 and the character '9' that you type on your keyboard.

I've often wondered the same thing. None of these answers really addressed his question, it seems... maybe you read over the question too quickly?
 

etech

Lifer
Oct 9, 1999
10,597
0
0
Mark R answered the question.
It depends on the context, i.e. on the program code. I'm going back quite a few years to 6802 assembly, so forgive any errors in detail; the codes are just made up, as I'm too lazy to go find the books.

program
55 program code - print next byte
65 ASCII code for A - prints an A on the printer
34 program code - jump to the address in the next two bytes
65 high byte of address
00 low byte of address
23 program code - get keyboard input into register B


Made-up code 55 tells the processor to print the next byte. In this case it's a 65, so the printer would print an A.

Made-up code 34 tells the processor to jump to program address 6500. Notice the 65 is now treated as a number.

Made-up code 23 will get the next character out of the keyboard buffer and store it in register B; if you type an A on the keyboard, a 65 will be stored in register B.

Before anyone jumps on it, I know that this assembly did not have a print code and there are some other errors, but it makes for a quick and rough example to get the idea across.
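To see the same thing run, here is a toy interpreter in C that mirrors etech's made-up codes (my sketch; op codes 55 and 34 are his inventions, and the "jump" just prints its target instead of jumping, to keep the toy simple):

#include <stdio.h>

int main(void)
{
    unsigned char program[] = { 55, 65, 34, 65, 0 };
    int pc = 0;

    while (pc < 5) {                /* 5 = length of the program */
        switch (program[pc]) {
        case 55:  /* print next byte: 65 comes out as the letter A */
            printf("print: %c\n", program[pc + 1]);
            pc += 2;
            break;
        case 34:  /* next two bytes form an address: 65 is now a number */
            printf("jump to address %d%02d\n", program[pc + 1], program[pc + 2]);
            pc += 3;
            break;
        default:
            pc += 1;
        }
    }
    return 0;
}

The byte 65 appears twice in the program; the op code in front of it is the only thing that decides whether it comes out as the letter A or as part of the address 6500.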





 

br0wn

Senior member
Jun 22, 2000
572
0
0
Soccerman,

the Operating System (which is in turn a program itself) will take care of the difference between numbers and letters, if that's what you mean.

The computer only sees machine code, or binary code.

All numbers and letters will be converted into binary.

So the string "ABCD" is represented by 4 8-bit bytes plus 1 byte for the null that marks the end of the string.

"ABCD" is 0100 0001 0100 0010 0100 0011 0100 0100 0000 0000
This binary code is stored on disk or in memory. Whenever it is needed, it is loaded (the OS takes care of this); for example, it will load 5 8-bit bytes from location 100 (if that is the place where it was stored).
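A small C sketch of that layout (mine; C happens to use exactly this null-terminated convention):

#include <stdio.h>

int main(void)
{
    const char *s = "ABCD";
    for (int i = 0; i <= 4; i++)            /* 4 letters plus the null byte */
        printf("byte %d = %d\n", i, s[i]);  /* prints 65 66 67 68 0 */
    return 0;
}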

A number will be converted into binary code as well.

So the OS should take care of this difference; it knows when something is a number and when it is a letter (if it doesn't know, then the OS is dumb or the programmer is dumb).
For programmers: when you program, you do know whether an input is a number or a letter, right? (Well, at least you expect it to be one or the other; otherwise errors are reported when you compile the program.) That type information in your source code is passed to the compiler. The compiler then generates machine code; at that stage nothing is stored that says whether something is a number or not, but everything has already been laid out correctly.

Keys on the keyboard are eventually mapped into binary codes, or they may already be binary from the start (I'm not sure exactly how a keyboard works, but this is my guess), since whenever you type something on the keyboard, signals (in binary) are sent to the OS via an interrupt.

Just remember that everything you type CAN'T be understood by the computer directly; it needs a program (the OS) to translate it into a language the machine understands.

Everything you do (type) on the computer is translated into machine code. First it is translated into instructions (assembly code); then these instructions are translated into binary (machine code), and the machine uses this machine code to execute.

Oh, by the way, I should note that these machine codes differ from one architecture to another.
 

RossGr

Diamond Member
Jan 11, 2000
3,383
1
0
The fact is, the "computer" cannot tell the difference. It is all the same to the CPU (hardware). It is up to the software to use the CPU (hardware) to perform a set of tasks. These tasks are what we call a program. It is the program that determines which sequence of 1s and 0s is a number and which is a letter. The CPU really does not "know" or even "care" what the data it transfers and manipulates finally does or means.

In fact, in C, you could store a character as a letter, then later pick it up and use the number (see the ASCII code responses) as an integer, adding it to another number to get something that could then be accessed as a letter again.
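A quick sketch of exactly that round trip in C (my example):

#include <stdio.h>

int main(void)
{
    char c = 'A';             /* stored as a letter...                    */
    int  n = c + 1;           /* ...picked up as the number 65, plus 1    */
    printf("%d\n", n);        /* accessed as an integer: prints 66        */
    printf("%c\n", (char)n);  /* accessed as a letter again: prints B     */
    return 0;
}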
 

lowtech1

Diamond Member
Mar 9, 2000
4,644
1
0
Because of the parity/start/end bits, the processor knows when a character starts or ends; therefore it can input/output a character or a number.

A word length can be anywhere between 8 (2 nibbles or 1 byte), 16 (MSB + LSB, or a 16-bit word), or 32 bits (a 32-bit double word).

ASCII word length = parity bit (start/end bit, odd/even parity error correction) + 7 bits = 8 bits = 1 byte = word length.

And in some applications the eighth, most significant bit (MSB) is used as a data bit instead, extending ASCII to 256 characters.

0~9 = 00110000~00111001

A~Z = 01000001~01011010

a~z = 01100001~01111010

The entire binary range for ASCII = 00000000~01111111 (the leading "0" is used for the parity/start/end bit).

As you can see, an ASCII word is exactly 8 bits in length, so the processor knows where a character/function starts and ends; and since the numbers have their own codes between 00110000~00111001 (10 decimal codes) in the pool of 00000000~01111111 (128 codes), the processor can (kind of) differentiate between a character and a number.

PS. If your teacher couldn't tell you how the computer differentiates between a character and a number, then he shouldn't be teaching programming. Or try to see if you can switch to another class with a different prof, or another school. Unless your question wasn't clear.
 

etech

Lifer
Oct 9, 1999
10,597
0
0
The parity and start/end bits are not used for data storage. They are normally only used in serial transmission of data.
Some common data formats:
7-bit data - 1 start, 7 data, parity bit, 1 stop bit
7-bit data - 1 start, 7 data, no parity, 2 stop bits
8-bit data* - no start bit, 8 data, parity, 1 stop
8-bit data - 1 start, 8 data, no parity, 1 stop

And here also, both computers must be set to the same format for them to understand each other.

We could also get into Modbus (RTU and ASCII), Gray code, baud rates, RS-232 and RS-485 and really have some fun.

* I think that one is correct in that it does not have the start bit but does have the stop bit. Someone correct me if I reversed it, though.
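For anyone curious how that choice is made in practice, here is a POSIX termios sketch (mine, not etech's; /dev/ttyS0 is an assumed device name) that selects the common "8 data, no parity, 1 stop" framing. As noted above, both ends of the line must be set the same way:

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);
    if (fd < 0)
        return 1;

    struct termios tio;
    tcgetattr(fd, &tio);
    cfsetispeed(&tio, B9600);   /* baud rate, receive  */
    cfsetospeed(&tio, B9600);   /* baud rate, transmit */
    tio.c_cflag &= ~PARENB;     /* no parity bit */
    tio.c_cflag &= ~CSTOPB;     /* 1 stop bit    */
    tio.c_cflag &= ~CSIZE;
    tio.c_cflag |= CS8;         /* 8 data bits   */
    tcsetattr(fd, TCSANOW, &tio);

    close(fd);
    return 0;
}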
 

OneEng

Senior member
Oct 25, 1999
585
0
0
etech has given the correct answer. I will simply elaborate.

When you take an assembly class, you will learn about these things, but here is an early intro.

When any CPU has power applied to it, it wakes up and starts executing code at a fixed location called the boot vector. The format of the code is just a bunch of numbers. The operation is the first number; people refer to this first number as the op code. The x86 processor has quite a lot of these op codes to do any number of things the processor can do: you can move memory, add two numbers together, jump to a different program pointer, branch depending on something, etc. Following the op code are 1 or more numbers representing the data for the op code. For instance, if the op code is a move (mov in x86 assembly), then the data would be the addresses for the source and destination.

Some of the data may well have values that lie in the ASCII mapping range. The only way the computer knows how to interpret these numbers as characters is if the op code before the data expects the data to be text. There is nothing stored in the code that explicitly says "the next byte represents a character".
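A concrete case of that ambiguity (my example, not OneEng's): the byte 0x41 is the letter 'A' when an instruction treats it as data, but decoded as an op code on 32-bit x86 the very same byte means "inc ecx". Nothing in the byte itself says which; its position in the instruction stream decides.

#include <stdio.h>

int main(void)
{
    unsigned char byte = 0x41;
    printf("as data:   '%c' (%d)\n", byte, byte);   /* prints: 'A' (65)      */
    printf("as opcode: inc ecx (on 32-bit x86)\n"); /* same bits, other role */
    return 0;
}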

The start, stop and parity bits are used only for transfers on serial lines and are not relevant to this discussion.
 