000 04434nam a22006015i 4500
001 978-981-97-0747-8
003 DE-He213
005 20240423130339.0
007 cr nn 008mamaa
008 240408s2024 si | s |||| 0|eng d
020 _a9789819707478
_9978-981-97-0747-8
024 7 _a10.1007/978-981-97-0747-8
_2doi
050 4 _aQA76.9.N38
072 7 _aUYQL
_2bicssc
072 7 _aCOM073000
_2bisacsh
072 7 _aUYQL
_2thema
082 0 4 _a006.35
_223
100 1 _aJiang, Meng.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
245 1 0 _aKnowledge-augmented Methods for Natural Language Processing
_h[electronic resource] /
_cby Meng Jiang, Bill Yuchen Lin, Shuohang Wang, Yichong Xu, Wenhao Yu, Chenguang Zhu.
250 _a1st ed. 2024.
264 1 _aSingapore :
_bSpringer Nature Singapore :
_bImprint: Springer,
_c2024.
300 _aIX, 95 p. 18 illus., 15 illus. in color.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
490 1 _aSpringerBriefs in Computer Science,
_x2191-5776
505 0 _aChapter 1. Introduction to Knowledge-augmented NLP -- Chapter 2. Knowledge Sources -- Chapter 3. Knowledge-augmented Methods for Natural Language Understanding -- Chapter 4. Knowledge-augmented Methods for Natural Language Generation -- Chapter 5. Augmenting NLP Models with Commonsense Knowledge -- Chapter 6. Summary and Future Directions.
520 _aOver the last few years, natural language processing has seen remarkable progress due to the emergence of larger-scale models, better training techniques, and greater availability of data. Examples of these advancements include GPT-4, ChatGPT, and other pre-trained language models. These models can characterize linguistic patterns and generate context-aware representations, resulting in high-quality output. However, because they rely solely on input-output pairs during training, they struggle to incorporate external world knowledge, such as named entities and their relations, common sense, and domain-specific content. Incorporating knowledge into the training and inference of language models is critical to their ability to represent language accurately, and such knowledge is essential for achieving levels of intelligence that cannot be attained through statistical learning of input text patterns alone. In this book, we review recent developments in natural language processing, focusing on the role of knowledge in language representation. We examine how pre-trained language models like GPT-4 and ChatGPT are limited in their ability to capture external world knowledge, and we explore various approaches to incorporating knowledge into language models. Overall, this survey aims to provide insights into the importance of knowledge in natural language processing and to highlight recent advances in the field.
650 0 _aNatural language processing (Computer science).
650 0 _aComputational linguistics.
650 0 _aData mining.
650 1 4 _aNatural Language Processing (NLP).
650 2 4 _aComputational Linguistics.
650 2 4 _aData Mining and Knowledge Discovery.
700 1 _aLin, Bill Yuchen.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
700 1 _aWang, Shuohang.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
700 1 _aXu, Yichong.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
700 1 _aYu, Wenhao.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
700 1 _aZhu, Chenguang.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
710 2 _aSpringerLink (Online service)
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9789819707461
776 0 8 _iPrinted edition:
_z9789819707485
776 0 8 _iPrinted edition:
_z9789819707492
830 0 _aSpringerBriefs in Computer Science,
_x2191-5776
856 4 0 _uhttps://doi.org/10.1007/978-981-97-0747-8
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
942 _cSPRINGER
999 _c187614
_d187614