Convergence of Discrete MDL for Sequential Prediction

Abstract: This study investigates the properties of the Minimum Description Length (MDL) principle for sequence prediction, focusing on a two-part MDL estimator chosen from a countable class of models. The setting covers the important case of universal sequence prediction, where the model class corresponds to all algorithms on some fixed universal Turing machine; this correspondence is made precise via enumerable semimeasures, so the models are stochastic.
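For reference, the two-part MDL estimator at the heart of the analysis can be written schematically as follows. The notation (model class M, prior weights w_nu) is the common one for this construction and may differ from the paper's exact symbols.

```latex
% Two-part MDL estimator over a countable class \mathcal{M} of
% (semi)measures with prior weights w_\nu (schematic notation):
% maximizing w_\nu \nu(x) is the same as minimizing the two-part
% code length, i.e. model cost plus data cost.
\nu^{x}
  \;=\; \arg\max_{\nu \in \mathcal{M}} \; w_{\nu}\, \nu(x)
  \;=\; \arg\min_{\nu \in \mathcal{M}} \; \bigl[ -\log w_{\nu} \,-\, \log \nu(x) \bigr]
```

Here -log w_nu is the code length needed to describe the model and -log nu(x) the code length of the data under that model; in the universal case one takes w_nu = 2^{-K(nu)}, with K(nu) the prefix complexity of nu.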

The researchers prove convergence theorems similar to Solomonoff's theorem on universal induction, which also holds for general Bayes mixtures. However, the bound characterizing the convergence speed of MDL predictions is exponentially larger than the corresponding bound for Bayes mixtures. The researchers also observe that there are at least three different ways of using MDL for prediction; one of them has weaker prediction properties and converges only if the MDL estimator stabilizes. They establish sufficient conditions for this stabilization to occur.
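Schematically, and suppressing constants, the gap between the two bounds can be displayed side by side. The squared-difference form below is the familiar statement of Solomonoff's bound; the paper's exact distance measure and constants may differ.

```latex
% Bayes mixture \xi = \sum_\nu w_\nu \nu: Solomonoff-type bound,
% linear in the complexity of the true environment \mu.
\sum_{t=1}^{\infty} \mathbb{E} \sum_{a}
  \bigl( \xi(a \mid x_{<t}) - \mu(a \mid x_{<t}) \bigr)^{2}
  \;\le\; \ln w_{\mu}^{-1} \;=\; K(\mu) \ln 2
% The analogous cumulative error for MDL predictions is bounded
% only by a term of order w_{\mu}^{-1} = 2^{K(\mu)}, exponentially larger.
```

With universal weights, the mixture bound grows linearly in the complexity K(mu) of the true environment, while the MDL bound grows exponentially in it.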

Finally, the study presents some immediate consequences for complexity relations and randomness criteria. It concludes that the MDL principle, applied to sequence prediction with countable classes of arbitrary semimeasures, still yields convergent predictions, but with markedly weaker guarantees than the corresponding Bayes mixture.
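To make the prediction variants concrete, here is a minimal, self-contained Python sketch. The finite Bernoulli class, the uniform prior weights, and the names Bernoulli, mdl_estimator, static_prediction, and dynamic_prediction are illustrative assumptions, not the paper's code or its exact definitions of the prediction variants.

```python
# A minimal illustrative sketch of two-part MDL sequence prediction.
# Assumptions (not from the paper): a *finite* Bernoulli model class
# stands in for the countable class of semimeasures, and uniform
# prior weights w_nu stand in for 2^(-codelength of nu).

class Bernoulli:
    """nu(x) for a Bernoulli(p) measure over binary strings."""
    def __init__(self, p):
        self.p = p

    def prob(self, x):
        ones = sum(x)
        return self.p ** ones * (1 - self.p) ** (len(x) - ones)

def mdl_estimator(models, weights, x):
    """Two-part MDL: pick the nu maximizing w_nu * nu(x), i.e. the
    model minimizing the code length -log w_nu - log nu(x)."""
    return max(zip(models, weights),
               key=lambda mw: mw[1] * mw[0].prob(x))[0]

def static_prediction(models, weights, x, a):
    """'Static' variant: predict the next symbol a with the single
    estimator selected on the observed prefix x."""
    nu = mdl_estimator(models, weights, x)
    return nu.prob(x + [a]) / nu.prob(x)

def dynamic_prediction(models, weights, x, a):
    """'Dynamic' variant: re-run model selection on each possible
    continuation x+a (one of the several ways of using MDL)."""
    nu_xa = mdl_estimator(models, weights, x + [a])
    nu_x = mdl_estimator(models, weights, x)
    return nu_xa.prob(x + [a]) / nu_x.prob(x)

if __name__ == "__main__":
    models = [Bernoulli(0.2), Bernoulli(0.5), Bernoulli(0.8)]
    weights = [1 / 3] * 3          # uniform prior weights (assumption)
    x = [1, 1, 0, 1, 1, 1]         # observed binary prefix
    print(static_prediction(models, weights, x, 1))
    print(dynamic_prediction(models, weights, x, 1))
```

For the prefix above the two-part criterion selects Bernoulli(0.8), so the static prediction of the next symbol being 1 is 0.8. The two variants diverge precisely when appending a symbol changes which model wins the code-length comparison, which is why stabilization of the estimator matters for convergence.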

Link to Article: https://arxiv.org/abs/0404057v1
arXiv ID: 0404057v1