Methodological notes


Entropy and information of open systems


Yu.L. Klimontovich
Lomonosov Moscow State University, Faculty of Physics, Leninskie Gory 1, build. 2, Moscow, 119991, Russian Federation

Of the two definitions of 'information' given by Shannon and employed in communication theory, one is identical to Boltzmann's entropy and is in fact a measure of statistical uncertainty. The other involves the difference between the unconditional and conditional entropies and, if properly specified, allows a measure of information to be introduced for an open system, depending on the values of the system's control parameters. Two classes of systems are identified. For systems of the first class, an equilibrium state is possible and the law of conservation of information and entropy holds: at equilibrium such systems have zero information and maximum entropy, while in self-organization processes information increases as the system moves away from equilibrium. For systems of the second class, an equilibrium state is impossible; for these, a so-called 'norm of chaoticity' is introduced, two kinds of self-organization processes are considered, and the concept of information is defined accordingly. These definitions of information are applied to classical and quantum physical systems, as well as to medical and biological systems.
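In standard notation (a sketch of the quantities the abstract refers to, not reproduced from the paper itself), the first Shannon definition coincides with the Boltzmann entropy of a distribution p_i,

    H = -\sum_i p_i \ln p_i ,

a measure of statistical uncertainty. The second is the difference between the unconditional and conditional entropies,

    I(X;Y) = H(X) - H(X \mid Y) ,

which is nonnegative and vanishes when knowledge of Y does not reduce the uncertainty of X. For the first class of systems described above, the stated conservation law can be written as

    I + S = S_{\max} = \text{const} ,

so that I = 0 and S = S_max at equilibrium, and I grows at the expense of S as the system self-organizes away from equilibrium.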

PACS: 03.65.Bz, 03.67.−a, 05.65.+c, 89.70.+c
DOI: 10.1070/PU1999v042n04ABEH000568
URL: https://ufn.ru/en/articles/1999/4/e/
Citation: Klimontovich Yu L "Entropy and information of open systems" Phys. Usp. 42 375–384 (1999)

Original: Klimontovich Yu L "Entropiya i informatsiya otkrytykh sistem" Usp. Fiz. Nauk 169 443–452 (1999); DOI: 10.3367/UFNr.0169.199904e.0443
