Definition: hex


(HEXadecimal) Hexadecimal means 16, and the base-16 numbering system is used as shorthand for representing binary numbers. Each half byte (four bits) is assigned a hex digit or letter, as shown in the following chart along with its decimal and binary equivalents. Hex values are identified with an "h" suffix or a dollar sign prefix; thus $A7, A7h and A7H all stand for hex A7. See hex chart and hex editor.

    Base
    16   10   2
    Hex  Dec  Binary
    0    0    0000
    1    1    0001
    2    2    0010
    3    3    0011
    4    4    0100
    5    5    0101
    6    6    0110
    7    7    0111
    8    8    1000
    9    9    1001
    A   10    1010
    B   11    1011
    C   12    1100
    D   13    1101
    E   14    1110
    F   15    1111

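As a minimal illustration of how the chart is applied, the Python sketch below (byte_to_hex is a hypothetical helper written for this example, not a standard library routine) splits a byte into its two four-bit halves and looks up the hex digit for each half:

  # Split a byte into its high and low nibbles and map each to a hex digit.
  HEX_DIGITS = "0123456789ABCDEF"

  def byte_to_hex(value):
      high = (value >> 4) & 0xF   # upper four bits
      low = value & 0xF           # lower four bits
      return HEX_DIGITS[high] + HEX_DIGITS[low]

  print(byte_to_hex(0b10100111))  # prints "A7"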


 Hex A7
  = decimal 167 (10 x 16 + 7 x 1)
    or
  = binary 10100111 (128 + 32 + 4 + 2 + 1)

 Hex A000
  = decimal 40,960 (10 x 4096)
    or
  = binary 1010000000000000 (32768 + 8192)
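
The same conversions can be checked with Python's built-in int() and bin() functions; a quick sketch:

  # Convert hex A7 and A000 to decimal and binary.
  print(int("A7", 16))         # 167
  print(bin(int("A7", 16)))    # 0b10100111
  print(int("A000", 16))       # 40960
  print(bin(int("A000", 16)))  # 0b1010000000000000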


Interpreting Hex
Just as each position in a decimal number is worth 10 times the position to its right, each position in a hexadecimal number is worth 16 times the position to its right. Hex is a shorthand that makes binary code, the building blocks of digital systems, readable to humans. Very often, programmers have to get down to the nitty gritty and examine the actual data along with the counters and other bookkeeping values used in programs. See binary values.
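
The positional rule can be sketched in Python as follows; hex_to_decimal is an illustrative helper for this entry, not a library function:

  # Expand a hex string digit by digit: each position is worth 16 times
  # the position to its right.
  def hex_to_decimal(text):
      total = 0
      for ch in text:
          total = total * 16 + int(ch, 16)
      return total

  print(hex_to_decimal("A7"))    # 167   (10 x 16 + 7)
  print(hex_to_decimal("A000"))  # 40960 (10 x 4096)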