000 04128nam a22005655i 4500
001 978-3-030-93158-2
003 DE-He213
005 20240423125530.0
007 cr nn 008mamaa
008 220218s2022 sz | s |||| 0|eng d
020 _a9783030931582
_9978-3-030-93158-2
024 7 _a10.1007/978-3-030-93158-2
_2doi
050 4 _aQ334-342
050 4 _aTA347.A78
072 7 _aUYQ
_2bicssc
072 7 _aCOM004000
_2bisacsh
072 7 _aUYQ
_2thema
082 0 4 _a006.3
_223
100 1 _aTomczak, Jakub M.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
245 1 0 _aDeep Generative Modeling
_h[electronic resource] /
_cby Jakub M. Tomczak.
250 _a1st ed. 2022.
264 1 _aCham :
_bSpringer International Publishing :
_bImprint: Springer,
_c2022.
300 _aXVIII, 197 p. 127 illus., 122 illus. in color.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
505 0 _aWhy Deep Generative Modeling? -- Autoregressive Models -- Flow-based Models -- Latent Variable Models -- Hybrid Modeling -- Energy-based Models -- Generative Adversarial Networks -- Deep Generative Modeling for Neural Compression -- Useful Facts from Algebra and Calculus -- Useful Facts from Probability Theory and Statistics -- Index.
520 _aThis textbook tackles the problem of formulating AI systems by combining probabilistic modeling and deep learning. Moreover, it goes beyond typical predictive modeling and brings together supervised learning and unsupervised learning. The resulting paradigm, called deep generative modeling, utilizes the generative perspective on perceiving the surrounding world. It assumes that each phenomenon is driven by an underlying generative process that defines a joint distribution over random variables and their stochastic interactions, i.e., how events occur and in what order. The adjective "deep" comes from the fact that the distribution is parameterized using deep neural networks. There are two distinct traits of deep generative modeling. First, the application of deep neural networks allows rich and flexible parameterization of distributions. Second, the principled manner of modeling stochastic dependencies using probability theory ensures rigorous formulation and prevents potential flaws in reasoning. Moreover, probability theory provides a unified framework in which the likelihood function plays a crucial role in quantifying uncertainty and defining objective functions. Deep Generative Modeling is designed to appeal to curious students, engineers, and researchers with a modest mathematical background in undergraduate calculus, linear algebra, probability theory, and the basics of machine learning, deep learning, and programming in Python and PyTorch (or other deep learning libraries). It will appeal to students and researchers from a variety of backgrounds, including computer science, engineering, data science, physics, and bioinformatics, who wish to become familiar with deep generative modeling. To engage the reader, the book introduces fundamental concepts with specific examples and code snippets. The full code accompanying the book is available on GitHub.
650 0 _aArtificial intelligence.
650 0 _aMachine learning.
650 0 _aComputer science
_xMathematics.
650 0 _aMathematical statistics.
650 0 _aComputer simulation.
650 1 4 _aArtificial Intelligence.
650 2 4 _aMachine Learning.
650 2 4 _aProbability and Statistics in Computer Science.
650 2 4 _aComputer Modelling.
710 2 _aSpringerLink (Online service)
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9783030931575
776 0 8 _iPrinted edition:
_z9783030931599
776 0 8 _iPrinted edition:
_z9783030931605
856 4 0 _uhttps://doi.org/10.1007/978-3-030-93158-2
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
942 _cSPRINGER
999 _c178972
_d178972