The inspiration for adopting the word ''entropy'' in information theory came from the close resemblance between Shannon's formula and known formulae from [[statistical mechanics]].
 
In [[statistical thermodynamics]] the most general formula for the thermodynamic [[entropy]] {{math|''S''}} of a [[thermodynamic system]] is the [[Gibbs entropy]],
:<math>S = - k_\text{B} \sum p_i \ln p_i \,,</math>
where {{math|''k''<sub>B</sub>}} is the [[Boltzmann constant]], and {{math|''p''<sub>''i''</sub>}} is the probability of a [[Microstate (statistical mechanics)|microstate]]. The [[Entropy (statistical thermodynamics)|Gibbs entropy]] was defined by [[J. Willard Gibbs]] in 1878 after earlier work by [[Ludwig Boltzmann|Boltzmann]] (1872).<ref>Compare: Boltzmann, Ludwig (1896, 1898). ''Vorlesungen über Gastheorie'', 2 volumes, Leipzig. English version: ''Lectures on Gas Theory''. Translated by Stephen G. Brush (1964), Berkeley: University of California Press; (1995) New York: Dover. {{isbn|0-486-68455-5}}</ref>
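
The correspondence can be illustrated numerically. The following is a minimal Python sketch (the three-outcome distribution is an illustrative assumption, not taken from the article) that evaluates the Gibbs entropy and the Shannon entropy of the same probability distribution; when the Shannon entropy is measured in bits, the two quantities differ only by the constant factor {{math|''k''<sub>B</sub> ln 2}}.

<syntaxhighlight lang="python">
import math

# Illustrative three-outcome distribution (an assumption for this example).
p = [0.5, 0.25, 0.25]

k_B = 1.380649e-23  # Boltzmann constant in joules per kelvin

# Gibbs entropy  S = -k_B * sum_i p_i ln p_i   (units: J/K)
S = -k_B * sum(p_i * math.log(p_i) for p_i in p if p_i > 0)

# Shannon entropy  H = -sum_i p_i log2 p_i     (units: bits)
H = -sum(p_i * math.log2(p_i) for p_i in p if p_i > 0)

# The two formulas differ only by the constant factor k_B * ln 2,
# i.e.  S = k_B * ln(2) * H
assert math.isclose(S, k_B * math.log(2) * H)
</syntaxhighlight>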