Learning with the Minimum Description Length Principle

Yamanishi, Kenji.

Learning with the Minimum Description Length Principle [electronic resource] / by Kenji Yamanishi. - 1st ed. 2023. - XX, 339 p. 51 illus., 48 illus. in color. online resource.

Information and Coding -- Parameter Estimation -- Model Selection -- Latent Variable Model Selection -- Sequential Prediction -- MDL Change Detection -- Continuous Model Selection -- Extension of Stochastic Complexity -- Mathematical Preliminaries.

This book introduces readers to the minimum description length (MDL) principle and its applications in learning. The MDL is a fundamental principle of inductive inference, used in many applications including statistical modeling, pattern recognition, and machine learning. At its core, the MDL is based on the premise that “the shortest code length leads to the best strategy for learning anything from data.” The MDL provides a broad and unifying view of statistical inference tasks such as estimation, prediction, and testing, as well as machine learning. The content covers the theoretical foundations of the MDL and broad practical areas such as detecting changes and anomalies, problems involving latent variable models, and high-dimensional statistical inference, among others. The book offers an easy-to-follow guide to the MDL principle, comparing it with other information criteria and explaining the differences between their standpoints. Written in a systematic, concise, and comprehensive style, this book is suitable for researchers and graduate students of machine learning, statistics, information theory, and computer science.
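To make the premise concrete, the following is a minimal, illustrative sketch (not taken from the book) of two-part MDL model selection in Python: the polynomial degree whose total description length, a roughly (k/2) log2 n bit cost for the k fitted parameters plus the code length of the residuals under a Gaussian model, is shortest gets selected. The function name, the synthetic data, and the (k/2) log2 n parameter cost are assumptions for illustration only, not the refined stochastic complexity developed in the book.

import numpy as np

def two_part_mdl_bits(y, y_hat, k):
    """Illustrative two-part code length in bits (an assumption, not the book's criterion):
    L(M): roughly (k/2) * log2(n) bits for k real-valued parameters,
    L(D|M): negative log2-likelihood of the residuals under a fitted Gaussian."""
    n = len(y)
    resid = y - y_hat
    sigma2 = max(resid.var(), 1e-12)
    data_bits = 0.5 * n * np.log2(2.0 * np.pi * np.e * sigma2)
    model_bits = 0.5 * k * np.log2(n)
    return model_bits + data_bits

# Pick a polynomial degree by minimizing the total description length.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.1, size=x.size)

scores = {}
for degree in range(8):
    coeffs = np.polyfit(x, y, degree)
    scores[degree] = two_part_mdl_bits(y, np.polyval(coeffs, x), k=degree + 1)

best = min(scores, key=scores.get)
print("degree with the shortest total code length:", best)  # typically 2 for this synthetic data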

9789819917907

10.1007/978-981-99-1790-7 doi


Data structures (Computer science).
Information theory.
Machine learning.
Data Structures and Information Theory.
Machine Learning.

QA76.9.D35 Q350-390

005.73 003.54