Definition
Image caption: This 2.5-inch hard drive can hold 500 GB (i.e., 500 billion bytes) of data.
The term gigabyte is commonly used to mean either 1000^{3} bytes or 1024^{3} bytes. The latter binary usage originated as compromise technical jargon for byte multiples that needed to be expressed in a power of 2 but lacked a convenient name. Because 1024 (2^{10}) is approximately equal to 1000 (10^{3}), which roughly corresponds to the SI multiple, the SI prefix came to be used for binary multiples as well.
In 1998 the International Electrotechnical Commission (IEC) published standards for binary prefixes, requiring that the gigabyte strictly denote 1000^{3} bytes and gibibyte denote 1024^{3} bytes. By the end of 2007, the IEC Standard had been adopted by the IEEE, EU, and NIST, and in 2009 it was incorporated in the International System of Quantities. Nevertheless, the term gigabyte continues to be widely used with the following two different meanings:
Base 10 (decimal)
- 1 GB = 1000000000 bytes (= 1000^{3} B = 10^{9} B)
Based on powers of 10, this definition uses the prefix giga- as defined in the International System of Units (SI). This is the definition recommended by the International Electrotechnical Commission (IEC).^{[2]} It is used in networking contexts and for most storage media, particularly hard drives, flash-based storage,^{[3]}^{[4]} and DVDs, and is also consistent with the other uses of the SI prefix in computing, such as CPU clock speeds or measures of performance. The file manager of Mac OS X version 10.6 and later is a notable example of this usage in software, reporting file sizes in decimal units.^{[5]}
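Under the decimal definition, converting a raw byte count to gigabytes is a straight division by 10^{9}. A minimal Python sketch (the helper name `bytes_to_gb` is illustrative, not from any particular tool) shows how a decimal-reporting file manager would label a drive advertised as 500 GB:

```python
def bytes_to_gb(n_bytes: int) -> float:
    """Convert a byte count to gigabytes using the SI (decimal) definition."""
    return n_bytes / 10**9  # 1 GB = 1,000,000,000 bytes

# A "500 GB" drive advertises exactly 500 billion bytes:
print(bytes_to_gb(500_000_000_000))  # 500.0
```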
Base 2 (binary)
- 1 GiB = 1073741824 bytes (= 1024^{3} B = 2^{30} B).
The binary definition uses powers of 2, consistent with the binary architecture of computers.
This usage is widely promulgated by some operating systems, such as Microsoft Windows in reference to computer memory (e.g., RAM). This definition is synonymous with the unambiguous unit gibibyte.
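The gap between the two definitions explains a familiar discrepancy: a drive sold as "500 GB" (500 x 10^{9} bytes) appears to hold only about 465.66 "GB" in software that actually divides by 2^{30}. A short Python sketch (constant and function names are illustrative) makes the arithmetic concrete:

```python
GB = 10**9    # gigabyte, SI decimal definition
GIB = 2**30   # gibibyte, binary definition (1,073,741,824 bytes)

def bytes_to_gib(n_bytes: int) -> float:
    """Convert a byte count to gibibytes (the binary 'gigabyte')."""
    return n_bytes / GIB

# The same 500 * 10**9 bytes, reported under each definition:
drive_bytes = 500 * GB
print(round(bytes_to_gib(drive_bytes), 2))  # ~465.66
```

The roughly 7.4% difference at the giga scale (1024^{3} / 1000^{3} ≈ 1.0737) is why capacity figures from drive vendors and from binary-reporting operating systems disagree.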