We present some new results that relate information to chaotic
dynamics. In our approach the quantity of information is measured by the
Algorithmic Information Content (Kolmogorov complexity) or by a sort of
computable version of it (Computable Information Content) in which the information
is measured by using a suitable universal data compression algorithm.
We apply these notions to the study of dynamical systems by considering the
asymptotic behavior of the quantity of information necessary to describe their
orbits. When a system is ergodic, this method provides an indicator that equals
the Kolmogorov-Sinai entropy almost everywhere. Moreover, if the entropy is
null, our method gives new indicators that measure the unpredictability of the
system and allow various kinds of weak chaos to be classified. Indeed, this
is the main motivation of this work: except in particular cases, the behavior
of a 0-entropy dynamical system is far from being completely predictable.
In fact there are 0-entropy systems that exhibit a sort of weak chaos, where
the information necessary to describe the orbit behavior increases with time
faster than logarithmically (the periodic case) but more slowly than linearly
(the positive-entropy case). We also believe that this method is useful for
classifying 0-entropy time series; to support this point of view, we present
theoretical and experimental results in specific cases.
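As a minimal illustration of the compression-based idea (a sketch only, not the paper's algorithm: here `zlib` stands in for a "suitable universal data compression algorithm", and the map, seed, and symbol partition are illustrative assumptions), one can compare how the compressed length of a symbolic orbit grows for a periodic sequence versus the logistic map at r = 4:

```python
import zlib

def logistic_orbit_symbols(x0, n, r=4.0):
    """Symbolic orbit of the logistic map: '0' if x < 1/2, else '1'."""
    x, out = x0, []
    for _ in range(n):
        out.append('0' if x < 0.5 else '1')
        x = r * x * (1.0 - x)
    return ''.join(out)

def info_content(s):
    """Compressed size in bits: a computable stand-in for the AIC of s."""
    return 8 * len(zlib.compress(s.encode()))

n = 20000
chaotic = logistic_orbit_symbols(0.3, n)  # positive-entropy case
periodic = '01' * (n // 2)                # 0-entropy, periodic case

# The chaotic orbit needs roughly one bit per symbol; the periodic one
# compresses to a nearly constant description, so its rate tends to 0.
print('chaotic rate :', info_content(chaotic) / n)
print('periodic rate:', info_content(periodic) / n)
```

Under ergodicity, the per-symbol rate of the chaotic orbit approximates the Kolmogorov-Sinai entropy of the system (here log 2, i.e. 1 bit per symbol), while the periodic orbit's rate vanishes; weakly chaotic systems would sit between these two growth regimes.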
Mathematics Subject Classification: 37M25.