# Definition: hex

(HEXadecimal) Meaning 16, hex is the base-16 numbering system used as shorthand for binary. Two hex digits represent one eight-bit binary character, or byte, which is the common unit of storage and memory.

Each half byte (four bits), also called a "nibble," is assigned a hex digit or letter as in the following chart. Hex values are identified with an "h" suffix, a dollar sign or a "0x" prefix; thus \$A7, A7h, A7H and 0xA7 all stand for hex A7. "Hex" is Greek for six; hence, hexadecimal means six and ten. See byte, octal, hex chart and hex editor.
```
Base 2   Base 16  Base 10
Binary     Hex      Dec

0000       0        0
0001       1        1
0010       2        2
0011       3        3
0100       4        4
0101       5        5
0110       6        6
0111       7        7
1000       8        8
1001       9        9
1010       A       10
1011       B       11
1100       C       12
1101       D       13
1110       E       14
1111       F       15
```
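Because each hex digit covers exactly one nibble, converting between binary and hex is a matter of splitting a byte in half. The following sketch (in Python, chosen for illustration) does this for the A7 byte worked out below:

```python
# A byte is two nibbles; each nibble maps to exactly one hex digit.
value = 0b10100111            # binary literal for the byte 1010 0111

high_nibble = value >> 4      # 1010 -> decimal 10 -> hex A
low_nibble = value & 0x0F     # 0111 -> decimal  7 -> hex 7

print(f"{value:08b} -> {high_nibble:X}{low_nibble:X}")  # 10100111 -> A7
print(hex(value))             # 0xa7
print(int("A7", 16))          # 167
```

Shifting right by four bits isolates the high nibble, and masking with 0x0F (binary 00001111) isolates the low one.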

```
Hex A7
= decimal 167 (10 x 16 + 7 x 1)
or
= binary 10100111 (128 + 32 + 4 + 2 + 1)

Hex A000
= decimal 40,960 (10 x 4096)
or
= binary 1010000000000000 (32768+8192)
```
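The arithmetic above is just positional expansion: each hex digit is weighted by a power of 16, exactly as each decimal digit is weighted by a power of 10. A minimal sketch of that expansion, with `hex_to_dec` as a hypothetical helper name:

```python
# Positional expansion: multiply the running total by the base (16)
# and add each digit's value, left to right.
def hex_to_dec(s):
    digits = "0123456789ABCDEF"
    total = 0
    for ch in s.upper():
        total = total * 16 + digits.index(ch)
    return total

print(hex_to_dec("A7"))    # 167    (10 x 16 + 7 x 1)
print(hex_to_dec("A000"))  # 40960  (10 x 4096)
```

This mirrors the worked examples: A7 is 10 sixteens plus 7 ones, and A000 is 10 in the 4096s (16^3) place.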

## Interpreting Hex
Where each position in a decimal number is a power of 10, each position in a hexadecimal number is a power of 16. Hex is a shorthand that makes binary code, the building blocks of digital systems, readable to humans. Very often, programmers have to get down to the nitty gritty and inspect the actual data, along with counters and other bookkeeping mechanisms used in programs. See binary values.
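Inspecting raw data usually means viewing it as a hex dump: each byte printed as two hex digits. A minimal sketch, assuming an arbitrary sample string:

```python
# Minimal hex-dump sketch: show each raw byte as two hex digits.
data = b"Hello, hex!"          # arbitrary sample bytes for illustration

hex_part = " ".join(f"{b:02X}" for b in data)
print(hex_part)                # 48 65 6C 6C 6F 2C 20 68 65 78 21
```

This is the same layout hex editors and debuggers use, which is why fluency in reading pairs of hex digits as bytes is so useful.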