Schedule: Oct 12, 2001
Predictability, Complexity and Learning
Dr. Ilya Nemenman, ITP
We define the predictive information I_pred(T) as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation time T: I_pred(T) can remain finite, grow logarithmically, or grow as a fractional power law. If the time series allows us to learn a model with a finite number of parameters, then I_pred(T) grows logarithmically with a coefficient that counts the dimensionality of the model space. In contrast, power-law growth is associated, for example, with the learning of infinite-parameter (or nonparametric) models such as continuous functions with smoothness constraints. There are connections between the predictive information and measures of complexity that have been defined both in learning theory and in the analysis of physical systems through statistical mechanics and dynamical systems theory. Further, in the same way that entropy provides the unique measure of available information consistent with some simple and plausible conditions, we argue that the divergent part of I_pred(T) provides the unique measure for the complexity of the dynamics underlying a time series. Finally, we discuss how these ideas may be useful in different problems in physics, statistics, and biology.
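As an illustrative sketch (not part of the talk itself), the "I_pred remains finite" case can be checked numerically for a binary Markov chain. With past and future windows of equal length T, the mutual information reduces to I_pred(T) = 2 S(T) - S(2T), where S(T) is the entropy of a length-T block; all names and parameters below (simulate_markov, p_flip, the chosen chain) are assumptions made for the example, and block entropies are simply estimated from empirical window frequencies:

```python
# Sketch: estimate I_pred(T) = 2*S(T) - S(2T) for a symmetric binary
# Markov chain (stay with prob 1 - p_flip, flip with prob p_flip).
# For a Markov chain, I_pred(T) should saturate at a finite value
# (analytically 1 - h(p_flip) bits, roughly 0.53 bits for p_flip = 0.1).
import random
from collections import Counter
from math import log2

def simulate_markov(n, p_flip=0.1, seed=0):
    """Simulate n steps of a symmetric two-state Markov chain."""
    rng = random.Random(seed)
    x = [0]
    for _ in range(n - 1):
        x.append(x[-1] ^ (rng.random() < p_flip))  # flip with prob p_flip
    return x

def block_entropy(x, T):
    """Empirical entropy (in bits) of overlapping length-T windows."""
    counts = Counter(tuple(x[i:i + T]) for i in range(len(x) - T + 1))
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

def predictive_information(x, T):
    """I(past; future) with equal window lengths T: 2*S(T) - S(2T)."""
    return 2 * block_entropy(x, T) - block_entropy(x, 2 * T)

x = simulate_markov(200_000)
for T in (1, 2, 4, 8):
    print(T, round(predictive_information(x, T), 3))
```

The printed values stay essentially constant as T grows, the signature of a finite-memory process; logarithmic or power-law growth of I_pred(T), as discussed in the abstract, requires processes whose effective description keeps gaining parameters with longer observation.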

