000 03241nam a22005535i 4500
001 978-981-99-1790-7
003 DE-He213
005 20240423130059.0
007 cr nn 008mamaa
008 230914s2023 si | s |||| 0|eng d
020 _a9789819917907
_9978-981-99-1790-7
024 7 _a10.1007/978-981-99-1790-7
_2doi
050 4 _aQA76.9.D35
050 4 _aQ350-390
072 7 _aUMB
_2bicssc
072 7 _aGPF
_2bicssc
072 7 _aCOM021000
_2bisacsh
072 7 _aUMB
_2thema
072 7 _aGPF
_2thema
082 0 4 _a005.73
_223
082 0 4 _a003.54
_223
100 1 _aYamanishi, Kenji.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
245 1 0 _aLearning with the Minimum Description Length Principle
_h[electronic resource] /
_cby Kenji Yamanishi.
250 _a1st ed. 2023.
264 1 _aSingapore :
_bSpringer Nature Singapore :
_bImprint: Springer,
_c2023.
300 _aXX, 339 p. 51 illus., 48 illus. in color.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
505 0 _aInformation and Coding -- Parameter Estimation -- Model Selection -- Latent Variable Model Selection -- Sequential Prediction -- MDL Change Detection -- Continuous Model Selection -- Extension of Stochastic Complexity -- Mathematical Preliminaries.
520 _aThis book introduces readers to the minimum description length (MDL) principle and its applications in learning. MDL is a fundamental principle of inductive inference, used in many applications including statistical modeling, pattern recognition, and machine learning. At its core, MDL rests on the premise that “the shortest code length leads to the best strategy for learning anything from data.” The MDL principle provides a broad and unifying view of statistical inference tasks such as estimation, prediction, and testing, as well as machine learning. The content covers the theoretical foundations of MDL and broad practical areas such as detecting changes and anomalies, problems involving latent variable models, and high-dimensional statistical inference, among others. The book offers an easy-to-follow guide to the MDL principle, together with other information criteria, explaining the differences between their standpoints. Written in a systematic, concise, and comprehensive style, this book is suitable for researchers and graduate students of machine learning, statistics, information theory, and computer science.
650 0 _aData structures (Computer science).
650 0 _aInformation theory.
650 0 _aMachine learning.
650 1 4 _aData Structures and Information Theory.
650 2 4 _aMachine Learning.
710 2 _aSpringerLink (Online service)
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9789819917891
776 0 8 _iPrinted edition:
_z9789819917914
776 0 8 _iPrinted edition:
_z9789819917921
856 4 0 _uhttps://doi.org/10.1007/978-981-99-1790-7
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
942 _cSPRINGER
999 _c184831
_d184831