000 09576nam a22005415i 4500
001 978-3-030-82184-5
003 DE-He213
005 20240423125454.0
007 cr nn 008mamaa
008 211111s2021 sz | s |||| 0|eng d
020 _a9783030821845
_9978-3-030-82184-5
024 7 _a10.1007/978-3-030-82184-5
_2doi
050 4 _aR858-859.7
072 7 _aMBG
_2bicssc
072 7 _aUB
_2bicssc
072 7 _aMED117000
_2bisacsh
072 7 _aUXT
_2thema
082 0 4 _a610.285
_223
100 1 _aXiao, Cao.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
245 1 0 _aIntroduction to Deep Learning for Healthcare
_h[electronic resource] /
_cby Cao Xiao, Jimeng Sun.
250 _a1st ed. 2021.
264 1 _aCham :
_bSpringer International Publishing :
_bImprint: Springer,
_c2021.
300 _aXI, 232 p. 1 illus.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
505 0 _aI Introduction -- I.1 Who should read this book? -- I.2 Book organization -- II Health Data -- II.1 The growth of EHR Adoption -- II.2 Health Data -- II.2.1 Life cycle of health data -- II.2.2 Structured Health Data -- II.2.3 Unstructured clinical notes -- II.2.4 Continuous signals -- II.2.5 Medical Imaging Data -- II.2.6 Biomedical data for in silico drug Discovery -- II.3 Health Data Standards -- III Machine Learning Basics -- III.1 Supervised Learning -- III.1.1 Logistic Regression -- III.1.2 Softmax Regression -- III.1.3 Gradient Descent -- III.1.4 Stochastic and Minibatch Gradient Descent -- III.2 Unsupervised Learning -- III.2.1 Principal component analysis -- III.2.2 t-distributed stochastic neighbor embedding (t-SNE) -- III.2.3 Clustering -- III.3 Assessing Model Performance -- III.3.1 Evaluation Metrics for Regression Tasks -- III.3.2 Evaluation Metrics for Classification Tasks -- III.3.3 Evaluation Metrics for Clustering Tasks -- III.3.4 Evaluation Strategy -- III.4 Modeling Exercise -- III.5 Hands-On Practice -- IV Deep Neural Networks (DNN) -- IV.1 A Single neuron -- IV.1.1 Activation function -- IV.1.2 Loss Function -- IV.1.3 Train a single neuron -- IV.2 Multilayer Neural Network -- IV.2.1 Network Representation -- IV.2.2 Train a Multilayer Neural Network -- IV.2.3 Summary of the Backpropagation Algorithm -- IV.2.4 Parameters and Hyper-parameters -- IV.3 Readmission Prediction from EHR Data with DNN -- IV.4 DNN for Drug Property Prediction -- V Embedding -- V.1 Overview -- V.2 Word2Vec -- V.2.1 Idea and Formulation of Word2Vec -- V.2.2 Healthcare application of Word2Vec -- V.3 Med2Vec: two-level embedding for EHR -- V.3.1 Med2Vec Method -- V.4 MiME: Embed Internal Structure -- V.4.1 Notations of MiME -- V.4.2 Description of MiME -- V.4.3 Experiment results of MiME -- VI Convolutional Neural Networks (CNN) -- VI.1 CNN intuition -- VI.2 Architecture of CNN -- VI.2.1 Convolution layer - 1D -- VI.2.2 Convolution layer - 2D -- VI.2.3 Pooling Layer -- VI.2.4 Fully Connected Layer -- VI.3 Backpropagation Algorithm in CNN* -- VI.3.1 Forward and Backward Computation for 1-D Data -- VI.3.2 Forward Computation and Backpropagation for 2-D Convolution Layer
-- VI.3.3 Special CNN Architecture -- VI.4 Healthcare Applications -- VI.5 Automated surveillance of cranial images for acute neurologic events -- VI.6 Detection of Lymph Node Metastases from Pathology Images -- VI.7 Cardiologist-level arrhythmia detection and classification in ambulatory ECG -- VII Recurrent Neural Networks (RNN) -- VII.1 Basic Concepts and Notations -- VII.2 Backpropagation Through Time (BPTT) algorithm -- VII.2.1 Forward Pass -- VII.2.2 Backward Pass -- VII.3 RNN Variants -- VII.3.1 Long Short-Term Memory (LSTM) -- VII.3.2 Gated Recurrent Unit (GRU) -- VII.3.3 Bidirectional RNN -- VII.3.4 Encoder-Decoder Sequence-to-Sequence Models -- VII.4 Case Study: Early detection of heart failure -- VII.5 Case Study: Sequential clinical event prediction -- VII.6 Case Study: De-identification of Clinical Notes -- VII.7 Case Study: Automatic Detection of Heart Disease from electrocardiography (ECG) Data -- VIII Autoencoders (AE) -- VIII.1 Overview -- VIII.2 Autoencoders -- VIII.3 Sparse Autoencoders -- VIII.4 Stacked Autoencoders -- VIII.5 Denoising Autoencoders -- VIII.6 Case Study: “Deep Patient” via stacked denoising autoencoders -- VIII.7 Case Study: Learning from Noisy, Sparse, and Irregular Clinical Data -- IX Attention Models -- IX.1 Overview -- IX.2 Attention Mechanism -- IX.2.1 Attention based on Encoder-Decoder RNN Models -- IX.2.2 Case Study: Attention Model over Longitudinal EHR -- IX.2.3 Case Study: Attention model over a Medical Ontology -- IX.2.4 Case Study: ICD Classification from Clinical Notes -- X Memory Networks -- X.1 Original Memory Networks -- X.2 End-to-end Memory Networks -- X.3 Case Study: Medication Recommendation -- X.4 EEG-RelNet: Memory Derived from Data -- X.5 Incorporate Memory from Unstructured Knowledge Base -- XI Graph Neural Networks -- XI.1 Overview -- XI.2 Graph Convolutional Networks -- XI.2.1 Basic Setting of GCN -- XI.2.2 Spatial Convolution on Graphs -- XI.2.3 Spectral Convolution on Graphs -- XI.2.4 Approximate Graph Convolution -- XI.2.5 Neighborhood Aggregation -- XI.3 Neural Fingerprinting: Drug Molecule Embedding with GCN -- XI.4 Decagon: Modeling Polypharmacy Side Effects with GCN -- XI.5 Case Study: Multiview Drug-drug Interaction Prediction -- XII Generative Models -- XII.1 Generative adversarial networks (GAN) -- XII.1.1 The GAN Framework -- XII.1.2 The Cost Function of Discriminator -- XII.1.3 The Cost Function of Generator -- XII.2 Variational Autoencoders (VAE) -- XII.2.1 Latent Variable Models -- XII.2.2 Objective Formulation -- XII.2.3 Objective Approximation -- XII.2.4 Reparameterization Trick -- XII.3 Case Study: Generating Patient Records -- XII.4 Case Study: Small Molecule Generation for Drug Discovery -- XIII Conclusion -- XIII.1 Model Setup -- XIII.2 Model Training -- XIII.3 Testing and Performance Evaluation -- XIII.4 Result Visualization -- XIII.5 Case Studies -- XIV Appendix -- XIV.1 Regularization* -- XIV.1.1 Vanishing or Exploding Gradient Problem -- XIV.1.2 Dropout -- XIV.1.3 Batch normalization -- XIV.2 Stochastic Gradient Descent and Minibatch gradient descent* -- XIV.3 Advanced optimization* -- XIV.3.1 Momentum -- XIV.3.2 Adagrad, Adadelta, and RMSprop -- XIV.3.3 Adam.
520 _aThis textbook presents deep learning models and their healthcare applications. It focuses on rich health data and the deep learning models that can effectively model such data. Healthcare data: Among all healthcare technologies, electronic health records (EHRs) have seen wide adoption and had a significant impact on healthcare delivery in recent years. One crucial benefit of EHRs is that they capture patient encounters with rich multi-modality data. Healthcare data include both structured and unstructured information. Structured data include various medical codes for diagnoses and procedures, lab results, and medication information. Unstructured data contain 1) clinical notes as text, 2) medical imaging data such as X-rays, echocardiograms, and magnetic resonance imaging (MRI), and 3) time-series data such as the electrocardiogram (ECG) and electroencephalogram (EEG). Beyond the data collected during clinical visits, patient self-generated/reported data have begun to grow thanks to the increasing use of wearable sensors. The authors present deep learning case studies on all the data types described. Deep learning models: Neural network models are a class of machine learning methods with a long history. Deep learning models are neural networks of many layers, which can extract multiple levels of features from raw data. Deep learning applied to healthcare is a natural and promising direction with many initial successes. The authors cover deep neural networks, convolutional neural networks, recurrent neural networks, embedding methods, autoencoders, attention models, graph neural networks, memory networks, and generative models. These models are presented alongside concrete healthcare case studies such as clinical predictive modeling, readmission prediction, phenotyping, x-ray classification, ECG diagnosis, sleep monitoring, automatic diagnosis coding from clinical notes, automatic deidentification, medication recommendation, drug discovery (drug property prediction and molecule generation), and clinical trial matching. This textbook targets graduate-level students focused on deep learning methods and their healthcare applications. It can also serve as an introduction to the concepts of deep learning and their applications. Researchers working in this field will likewise find the book valuable for their research.
650 0 _aMedical informatics.
650 0 _aMachine learning.
650 0 _aArtificial intelligence.
650 1 4 _aHealth Informatics.
650 2 4 _aMachine Learning.
650 2 4 _aArtificial Intelligence.
700 1 _aSun, Jimeng.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
710 2 _aSpringerLink (Online service)
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9783030821838
776 0 8 _iPrinted edition:
_z9783030821852
776 0 8 _iPrinted edition:
_z9783030821869
856 4 0 _uhttps://doi.org/10.1007/978-3-030-82184-5
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
942 _cSPRINGER
999 _c178296
_d178296