(Binary Coded Decimal) The storage of numbers in the computer whereby each decimal digit is converted into a binary number and stored in a single 8-bit byte. For example, a 12-digit decimal number would take 12 bytes. BCD uses more storage for numbers than binary encoding (see below). Prior to 8-bit bytes in the 1960s, BCD encoding used 6-bit characters. See binary numbers and byte.
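The following is a minimal sketch, not part of the original entry, of how unpacked BCD storage works: each decimal digit is coded in binary and placed in its own 8-bit byte. The function name to_bcd_bytes and the sample value are illustrative, not a standard API.

# Illustrative sketch of unpacked BCD: one decimal digit per 8-bit byte.
def to_bcd_bytes(number: int) -> bytes:
    """Store each decimal digit of `number` in a separate byte."""
    return bytes(int(digit) for digit in str(number))

value = 123456789012          # a 12-digit decimal number (hypothetical example)
bcd = to_bcd_bytes(value)

print(len(bcd))               # 12 -> one byte per decimal digit
print(list(bcd))              # [1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2]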
BCD and Binary
The BCD method codes each decimal digit in binary and stores it in its own byte. The binary method converts the entire decimal number into a binary number. In the binary method, for example, a 1 in the ninth bit from the right represents the value 256, because bit values double from right to left (256, 128, 64, 32, 16, 8, 4, 2, 1).
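The sketch below, assuming the same hypothetical 12-digit value as above, compares the storage each method needs: BCD uses one byte per decimal digit, while the binary method needs only enough bytes to hold the converted binary number.

value = 123456789012

bcd_size = len(str(value))                   # 12 bytes as BCD (one per digit)
binary_size = (value.bit_length() + 7) // 8  # 5 bytes as a binary number

print(bcd_size, binary_size)   # 12 5 -> binary encoding uses less storage

# Bit values double from right to left, so the ninth bit from the right
# is worth 256 (256, 128, 64, 32, 16, 8, 4, 2, 1).
print(2 ** 8)                  # 256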
How Numbers Are Stored
BCD is one of four primary ways numbers are stored in the computer.