000 01392nam a2200217 4500
003 IIITD
005 20240427150308.0
008 240425s2022 |||||||| |||| 00| 0 eng d
020 _a9783031010378
040 _aIIITD
082 _a006.3
_bGOL-N
100 _aGoldberg, Yoav
245 _aNeural network methods for natural language processing
_cby Yoav Goldberg
260 _aNew York :
_bSpringer,
_c©2022
300 _a287 p. :
_bcol. ill. ;
_c23 cm.
490 _aSynthesis Lectures on Human Language Technologies
505 _aLearning Basics and Linear Models -- From Linear Models to Multi-layer Perceptrons -- Feed-forward Neural Networks -- Neural Network Training -- Features for Textual Data -- Case Studies of NLP Features -- From Textual Features to Inputs -- Language Modeling -- Pre-trained Word Representations -- Using Word Embeddings -- Case Study: A Feed-forward Architecture for Sentence Meaning Inference -- Ngram Detectors: Convolutional Neural Networks -- Recurrent Neural Networks: Modeling Sequences and Stacks -- Concrete Recurrent Neural Network Architectures -- Modeling with Recurrent Networks -- Conditioned Generation -- Modeling Trees with Recursive Neural Networks -- Structured Output Prediction -- Cascaded, Multi-task and Semi-supervised Learning.
650 _aArtificial intelligence
650 _aComputational linguistics
942 _2ddc
_cBK
999 _c172591
_d172591