
Why the Shannon and Hartley entropies are ‘natural’

Published online by Cambridge University Press: 01 July 2016

J. Aczél, University of Waterloo, Ontario
B. Forte, University of Waterloo, Ontario
C. T. Ng, University of Waterloo, Ontario

Abstract

The following properties of entropies, as measures of expected information, seem natural. The amount of information expected from an experiment does not change if we add outcomes of zero probability (expansibility). The expected information is symmetric in the (probabilities of the) outcomes. The information expected from a combination of two experiments is at most the sum of the amounts of information expected from the individual experiments (subadditivity); equality holds if the two experiments are independent (additivity).
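
In standard notation (a sketch only; the paper's precise axiom system appears in the full text), an entropy is a sequence of symmetric functions $H_n(p_1,\dots,p_n)$ of finite probability distributions, and the two entropies of the title are

$$H^{S}_n(p_1,\dots,p_n) = -\sum_{k=1}^{n} p_k \log_2 p_k \quad (0\log_2 0 := 0), \qquad H^{0}_n(p_1,\dots,p_n) = \log_2 \#\{k : p_k > 0\}.$$

If $(r_{jk})_{j\le m,\,k\le n}$ is the joint distribution of two experiments with marginals $p_j=\sum_k r_{jk}$ and $q_k=\sum_j r_{jk}$, then expansibility, subadditivity and additivity read

$$H_{n+1}(p_1,\dots,p_n,0)=H_n(p_1,\dots,p_n),$$
$$H_{mn}(r_{11},\dots,r_{mn}) \le H_m(p_1,\dots,p_m)+H_n(q_1,\dots,q_n),$$

with equality in the latter whenever $r_{jk}=p_j q_k$.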

In this paper it is shown that the linear combinations of the Shannon and Hartley entropies, and only these, have the above properties. The Shannon and Hartley entropies are also characterized individually.
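
As a quick numerical illustration (this sketch is not from the paper; the joint distribution and the helper names shannon and hartley are invented here), subadditivity and additivity can be checked directly in Python:

import numpy as np

def shannon(p):
    """Shannon entropy -sum p_k log2 p_k; terms with p_k = 0 contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def hartley(p):
    """Hartley entropy: log2 of the number of outcomes with positive probability."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.count_nonzero(p > 0))

# Joint distribution r[j, k] of two experiments, with marginals p and q.
r = np.array([[0.3, 0.1],
              [0.2, 0.4]])
p, q = r.sum(axis=1), r.sum(axis=0)

for H, name in [(shannon, "Shannon"), (hartley, "Hartley")]:
    print(f"{name}: H(joint) = {H(r.ravel()):.4f} <= "
          f"H(p) + H(q) = {H(p) + H(q):.4f}")

# Independence (r = outer(p, q)) turns subadditivity into additivity.
r_ind = np.outer(p, q)
print("Shannon additive under independence:",
      np.isclose(shannon(r_ind.ravel()), shannon(p) + shannon(q)))

For the joint distribution above this prints H(joint) = 1.8465 <= 1.9710 in the Shannon case and 2.0000 <= 2.0000 in the Hartley case (all four joint probabilities are positive, so the Hartley bound is already tight here), and replacing r by the product of its marginals makes the Shannon inequality an equality as well.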

Type: Research Article
Copyright: © Applied Probability Trust 1974


References

Aczél, J. (1964) Some unsolved problems in the theory of functional equations. Arch. Math. (Basel) 15, 435–444.
Aczél, J. (1968) On different characterizations of entropies. Probability and Information Theory. Proc. Internat. Symp. McMaster Univ., Canada, 1968. Springer-Verlag, Berlin, Heidelberg and New York. pp. 1–11.
Aczél, J. (1970) Problems 6 (P51, P52, P53, P54, P51S1). Aequationes Math. 4, 242–243.
Aczél, J. and Daróczy, Z. (1963) Sur la caractérisation axiomatique des entropies d'ordre positif, y comprise l'entropie de Shannon. C. R. Acad. Sci. Paris 257, 1581–1584.
Aczél, J. and Daróczy, Z. (1974) On Measures of Information and their Characterizations. Academic Press, New York.
Aczél, J. and Forte, B. (1970) A system of axioms for the measure of the uncertainty. Notices Amer. Math. Soc. 17, 202.
Daróczy, Z. (1969) On the Shannon measure of information. (In Hungarian.) Magyar Tud. Akad. III. Oszt. Közl. 19, 9–24. English translation in Selected Translations in Mathematical Statistics and Probability, Vol. 10, 193–210. Inst. of Math. Stat. and Amer. Math. Soc., Providence, R. I., 1972.
Daróczy, Z. (1970) Generalized information functions. Information and Control 16, 36–51.
Faddeev, D. K. (1956) On the concept of entropy of a finite probabilistic scheme. (In Russian.) Uspechi Mat. Nauk 11, No. 1 (67), 227–231.
Forte, B. (1971) Functional inequalities in information theory. Aequationes Math. 6, 102–103.
Forte, B. (1973) Why Shannon's entropy. Convegno Inform. Teor., Ist. Naz. Alta Mat., Roma, 1973; Symposia Math. Vol. XI, Academic Press, New York, 1974.
Hartley, R. V. (1928) Transmission of information. Bell System Tech. J. 7, 535–563.
Jaglom, A. M. and Jaglom, I. M. (1957–1969) Probability and Information. (In Russian.) GITTL, Moscow, 1957; 2nd ed., Fizmatgiz, Moscow, 1960. French translation, Probabilité et Information. Collection Sigma Vol. 17, Dunod, Paris, 1969.
Kannappan, Pl. and Ng, C. T. (1973) Measurable solutions of functional equations related to information theory. Proc. Amer. Math. Soc. 38, 303–310.
Kátai, I. (1967) A remark on additive arithmetical functions. Ann. Univ. Sci. Budapest. Eötvös Sect. Math. 12, 81–83.
Kendall, D. G. (1963) Functional equations in information theory. Zeit. Wahrscheinlichkeitsth. 2, 225–229.
Lee, P. M. (1964) On the axioms of information theory. Ann. Math. Statist. 35, 414–418.
Rényi, A. (1960) On measures of entropy and information. Proc. 4th Berkeley Symp. Math. Statist. and Prob. 1960, I, 547–561. Univ. of California Press, Berkeley.
Rényi, A. (1965) On the foundations of information theory. Rev. Int. Statist. Inst. 33, 1–14.
Shannon, C. E. and Weaver, W. (1949) The Mathematical Theory of Communication. Univ. of Illinois Press, Urbana.
Tverberg, H. (1958) A new derivation of the information function. Math. Scand. 6, 297–298.