Bit

The bit is a basic unit of information in information theory, computing, and digital communications. The name is a portmanteau of binary digit.[1]

In information theory, one bit is typically defined as the information entropy of a binary random variable that is 0 or 1 with equal probability,[2] or the information that is gained when the value of such a variable becomes known.[3][4] As a unit of information, the bit has also been called a shannon,[5] named after Claude Shannon.
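
As a concrete check of this definition, here is a minimal Python sketch (the function name binary_entropy is illustrative) that computes the entropy H(p) = −p·log2(p) − (1−p)·log2(1−p) of a binary variable; at p = 0.5 it comes out to exactly one bit.

```python
import math

def binary_entropy(p):
    """Entropy, in bits, of a binary variable that is 1 with probability p:
    H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome conveys no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0  -- equal probabilities yield exactly one bit
print(binary_entropy(0.9))  # ~0.47 -- a biased variable conveys less than one bit
```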

As a binary digit, the bit represents a logical value, having only one of two values. It may be physically implemented with a two-state device. These state values are most commonly represented as either 0 or 1, but other representations such as true/false, yes/no, +/−, or on/off are possible. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
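
A minimal Python sketch of this point: the same two-state value can be read under any labeling convention. The label pairs below are illustrative examples, not a fixed set.

```python
# One two-state value; the labels attached to the states are pure convention.
bit = 1

# Illustrative label pairs (low-state label, high-state label).
labelings = {
    "binary digit": ("0", "1"),
    "boolean":      ("false", "true"),
    "yes/no":       ("no", "yes"),
    "sign":         ("-", "+"),
    "switch":       ("off", "on"),
}

for name, (low, high) in labelings.items():
    print(f"{name}: {high if bit else low}")
```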

The symbol for the binary digit is either simply "bit", as recommended by the IEC 80000-13:2008 standard, or the lowercase character b, as recommended by the IEEE 1541-2002 and IEEE Std 260.1-2004 standards. A group of eight binary digits is commonly called one byte, but historically the size of the byte is not strictly defined.
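
To make the eight-bits-per-byte convention concrete, the short sketch below unpacks one byte into its individual binary digits, most significant bit first; the example value is arbitrary.

```python
byte = 0b01000001  # 65, the ASCII code for "A"

# Shift and mask to read each of the eight bit positions, MSB first.
bits = [(byte >> i) & 1 for i in range(7, -1, -1)]
print(bits)  # [0, 1, 0, 0, 0, 0, 0, 1]
```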

History

The encoding of data by discrete bits was used in the punched cards invented by Basile Bouchon and Jean-Baptiste Falcon (1732), developed by Joseph Marie Jacquard (1804), and later adopted by Semen Korsakov, Charles Babbage, Hermann Hollerith, and early computer manufacturers like IBM. Another variant of that idea was the perforated paper tape. In all those systems, the medium (card or tape) conceptually carried an array of hole positions; each position could be either punched through or not, thus carrying one bit of information. The encoding of text by bits was also used in Morse code (1844) and early digital communications machines such as teletypes and stock ticker machines (1870).

Ralph Hartley suggested the use of a logarithmic measure of information in 1928.[6] Claude E. Shannon first used the word bit in his seminal 1948 paper A Mathematical Theory of Communication.[7] He attributed its origin to John W. Tukey, who had written a Bell Labs memo on 9 January 1947 in which he contracted "binary information digit" to simply "bit". Vannevar Bush had written in 1936 of "bits of information" that could be stored on the punched cards used in the mechanical computers of that time.[8] The first programmable computer, built by Konrad Zuse, used binary notation for numbers.
