Entropy and information of open systems

Lomonosov Moscow State University, Faculty of Physics, Leninskie Gory 1 build. 2, Moscow, 119991, Russian Federation

Of the two definitions of 'information' given by Shannon and employed in communication theory, one is identical to Boltzmann's entropy and is in fact a measure of statistical uncertainty. The other involves the difference between unconditional and conditional entropies and, if properly specified, allows the introduction of a measure of information for an open system that depends on the values of the system's control parameters. Two classes of systems are identified. For systems of the first class, an equilibrium state is possible and the law of conservation of information and entropy holds. At equilibrium, such systems have zero information and maximum entropy. In self-organization processes, information increases as the system moves away from the equilibrium state. For systems of the second class, an equilibrium state is impossible. For these, a so-called 'norm of chaoticity' is introduced, two kinds of self-organization processes are considered, and the concept of information is defined accordingly. These definitions of information are applied to classical and quantum physical systems, as well as to medical and biological systems.
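For reference, the two Shannon quantities named above, and the conservation law stated for the first class of systems, can be sketched in standard notation; the symbols H, I, and S_max below are a conventional choice and are not taken from the paper itself:

\begin{align}
  H(X) &= -\sum_i p_i \ln p_i
    && \text{entropy: a measure of statistical uncertainty} \\
  I(X;Y) &= H(X) - H(X \mid Y)
    && \text{difference of unconditional and conditional entropies} \\
  S + I &= S_{\max} = \text{const}
    && \text{conservation of entropy and information (first class)}
\end{align}

The last relation makes the abstract's statement explicit: at equilibrium S = S_max and hence I = 0, while in self-organization S decreases below S_max and the information I grows correspondingly.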