000 03664nam a22005295i 4500
001 978-3-030-27656-0
003 DE-He213
005 20240423125221.0
007 cr nn 008mamaa
008 200910s2020 sz | s |||| 0|eng d
020 _a9783030276560
_9978-3-030-27656-0
024 7 _a10.1007/978-3-030-27656-0
_2doi
050 4 _aQ337.5
050 4 _aTK7882.P3
072 7 _aUYQP
_2bicssc
072 7 _aCOM016000
_2bisacsh
072 7 _aUYQP
_2thema
082 0 4 _a006.4
_223
100 1 _aBraga-Neto, Ulisses.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
245 1 0 _aFundamentals of Pattern Recognition and Machine Learning
_h[electronic resource] /
_cby Ulisses Braga-Neto.
250 _a1st ed. 2020.
264 1 _aCham :
_bSpringer International Publishing :
_bImprint: Springer,
_c2020.
300 _aXVIII, 357 p. 84 illus., 73 illus. in color.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
505 0 _a1. Introduction -- 2. Optimal Classification -- 3. Sample-Based Classification -- 4. Parametric Classification -- 5. Nonparametric Classification -- 6. Function-Approximation Classification -- 7. Error Estimation for Classification -- 8. Model Selection for Classification -- 9. Dimensionality Reduction -- 10. Clustering -- 11. Regression -- Appendix.
520 _aFundamentals of Pattern Recognition and Machine Learning is designed for a one- or two-semester introductory course in Pattern Recognition or Machine Learning at the graduate or advanced undergraduate level. The book combines theory and practice and is suitable for classroom use and self-study. It is intended to be concise but thorough: it does not attempt an encyclopedic approach, but covers in significant detail the tools commonly used in pattern recognition and machine learning, including classification, dimensionality reduction, regression, and clustering, as well as recent popular topics such as Gaussian process regression and convolutional neural networks. In addition, the selection of topics has a few features that are unique among comparable texts: it contains an extensive chapter on classifier error estimation, as well as sections on Bayesian classification, Bayesian error estimation, separate sampling, and rank-based classification. The book is mathematically rigorous and covers the classical theorems in the area; nevertheless, an effort is made to strike a balance between theory and practice. In particular, examples with datasets from applications in bioinformatics and materials informatics are used throughout to illustrate the theory. These datasets are available from the book website for use in end-of-chapter coding assignments based on Python and scikit-learn. All plots in the text were generated using Python scripts, which are also available on the book website.
650 0 _aPattern recognition systems.
650 0 _aComputer vision.
650 0 _aProbabilities.
650 1 4 _aAutomated Pattern Recognition.
650 2 4 _aComputer Vision.
650 2 4 _aProbability Theory.
710 2 _aSpringerLink (Online service)
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9783030276553
776 0 8 _iPrinted edition:
_z9783030276577
776 0 8 _iPrinted edition:
_z9783030276584
856 4 0 _uhttps://doi.org/10.1007/978-3-030-27656-0
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
942 _cSPRINGER
999 _c175526
_d175526