Representation Learning for Natural Language Processing, 2nd Edition
Book Details
| Authors | Zhiyuan Liu, Yankai Lin, Maosong Sun | 
| Publisher | Springer | 
| Published | 2023 | 
| Edition | 2nd | 
| Paperback | 521 pages | 
| Language | English | 
| ISBN-13 | 9789819915996, 9789819916023, 9789819916009 | 
| ISBN-10 | 9819915996, 981991602X, 9819916003 | 
| License | Creative Commons Attribution | 
Book Description
This open access book provides an overview of recent advances in representation learning theory, algorithms, and applications for natural language processing (NLP), ranging from word embeddings to pre-trained language models. It is divided into four parts. Part I presents representation learning techniques for multiple language entries, including words, sentences, and documents, as well as pre-training techniques. Part II introduces representation techniques closely related to NLP, covering graphs, cross-modal entries, and robustness. Part III presents representation techniques for knowledge closely related to NLP, including entity-based world knowledge, sememe-based linguistic knowledge, legal domain knowledge, and biomedical domain knowledge. Lastly, Part IV discusses the remaining challenges and future research directions.
The theories and algorithms of representation learning presented here can also benefit other related domains such as machine learning, social network analysis, the Semantic Web, information retrieval, data mining, and computational biology. This book is intended for advanced undergraduate and graduate students, postdoctoral fellows, researchers, lecturers, and industrial engineers, as well as anyone interested in representation learning and natural language processing.
This book is available under a Creative Commons Attribution license (CC BY), which means that you are free to copy, distribute, and modify it, as long as you give appropriate credit to the original authors.
If you enjoyed the book and would like to support the authors, you can purchase a printed copy (hardcover or paperback) from official retailers.
Similar Books
Natural Language Processing with Transformers
Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book, now revised in full color, shows you how to train and scale these large models using Hugging Face Transformers.
Deep Learning for Coders with Fastai and PyTorch
Deep learning is often viewed as the exclusive domain of math PhDs and big tech companies. But as this hands-on guide demonstrates, programmers comfortable with Python can achieve impressive results in deep learning with little math background, small amounts of data, and minimal code. How? With fastai, the first library to provide a consistent interface to the most frequently used deep learning applications.
Deep Learning with JavaScript
Deep learning has transformed the fields of computer vision, image processing, and natural language applications. Thanks to TensorFlow.js, JavaScript developers can now build deep learning apps without relying on Python or R. Written by the main authors of the TensorFlow library, Deep Learning with JavaScript shows developers how they can bring DL technology to the web.
R Notes for Professionals
The R Notes for Professionals book is compiled from Stack Overflow Documentation; the content is written by the beautiful people at Stack Overflow.
Clinical Text Mining
This open access book describes the results of natural language processing and machine learning methods applied to clinical text from electronic patient records. It is divided into twelve chapters. Chapters 1-4 discuss the history and background of the original paper-based patient records, their purpose, and how they are written and structured.
Efficient Learning Machines
Machine learning techniques provide cost-effective alternatives to traditional methods for extracting underlying relationships between information and data and for predicting future events by processing existing information to train models. Efficient Learning Machines explores the major topics of machine learning, including knowledge discovery and classification.