Units of information

From Wiki @ Karl Jones dot com
Latest revision as of 19:18, 21 April 2016

In computing and telecommunications, a unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels.

Description

In information theory, units of information are also used to measure the information contents or entropy of random variables.
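As a brief illustration of measuring information content in bits, the following Python sketch (not part of the original article) computes the Shannon entropy of a discrete random variable; the function name is our own:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less than 1 bit per toss.
print(shannon_entropy([0.9, 0.1]))
# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

The unit comes from the base of the logarithm: base 2 gives bits, base e would give nats.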

Bit and byte

The most common units are:

  • The bit, the capacity of a system which can exist in only two states
  • The byte (or octet), which is equivalent to eight bits
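A quick Python sketch (ours, not the article's) of what these two units imply: a system of n bits can distinguish 2^n states, so one byte distinguishes 256 values.

```python
BITS_PER_BYTE = 8  # a byte (octet) groups eight bits

# Each added bit doubles the number of distinguishable states.
distinct_values = 2 ** BITS_PER_BYTE
print(distinct_values)  # 256
```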

Multiples of these units can be formed with the SI prefixes (power-of-ten prefixes) or the newer IEC binary prefixes (power-of-two prefixes).
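The distinction matters in practice: the SI prefixes scale by powers of 1000, the IEC prefixes by powers of 1024, and the gap grows at each step. A small Python sketch (our own illustration, using the standard prefix values) makes the difference concrete:

```python
# SI (decimal) prefixes versus IEC (binary) prefixes, in bytes.
SI = {"kB": 10**3, "MB": 10**6, "GB": 10**9}
IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30}

for (si, dec), (iec, binv) in zip(SI.items(), IEC.items()):
    print(f"{si} = {dec:>13,} B    {iec} = {binv:>13,} B")

# The relative gap widens with each step: a gigabyte is about
# 6.9% smaller than a gibibyte.
gap = 1 - SI["GB"] / IEC["GiB"]
print(f"{gap:.1%}")
```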

Information capacity is a dimensionless quantity, because it refers to a count of binary symbols.

See also

  • Information
  • Information theory
  • Claude Shannon

External links