A contraction of the term "binary digit," the smallest piece of information used by a computer, consisting of either the number 1 or 0. A computer processor is essentially a large collection of transistors; the computer understands information only in terms of these transistors being switched on or off (in other words, an electrical connection either being present or absent). In binary, then, "1" represents "on," while "0" represents "off." Combinations of bits form larger units that convey greater amounts of information. Eight bits make one byte, which is commonly the amount used to represent a single character or letter. One thousand bytes (or, more precisely, 2^{10}, or 1,024, bytes) is one kilobyte, while one million bytes (or, more precisely, 2^{20}, or 1,048,576, bytes) is one megabyte. Increasingly common computer data storage sizes are gigabytes (2^{30} bytes) and terabytes (2^{40} bytes).
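The relationships above can be sketched in a few lines of Python (the variable names here are illustrative, not standard):

```python
# One byte is eight bits; a single character such as "A" fits in one byte.
byte_value = ord("A")               # the character "A" stored as the number 65
bits = format(byte_value, "08b")    # the same number as eight binary digits
print(bits)                         # 01000001
assert len(bits) == 8               # eight bits per byte

# Larger units are successive powers of 2 in this traditional convention.
kilobyte = 2**10    # 1,024 bytes
megabyte = 2**20    # 1,048,576 bytes
gigabyte = 2**30
terabyte = 2**40
print(kilobyte, megabyte)           # 1024 1048576
```

Each binary digit in the string doubles the number of values the unit can represent, which is why the unit sizes grow by powers of 2 rather than round powers of 10.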

The term digital refers to any information conveyed using digits; in this case, the digits are 1 and 0.