Embeddings in natural language processing : theory and advances in vector representations of meaning
Record type:
Bibliographic - language material, printed : monograph
Subtitle:
theory and advances in vector representations of meaning
Author:
Pilehvar, Mohammad Taher
Other author:
Camacho-Collados, Jose
Place of publication:
[San Rafael, California]
Publisher:
Morgan & Claypool Publishers
Publication year:
c2021
Physical description:
xvii, 157 p. : ill. ; 24 cm.
Series:
Synthesis lectures on human language technologies ; lecture #47
Subject:
Natural language processing (Computer science)
Subject:
Artificial intelligence
Subject:
Programming languages (Electronic computers) -- Semantics
Summary note:
"Embeddings have undoubtedly been one of the most influential research areas in Natural Language Processing (NLP). Encoding information into a low-dimensional vector representation, which is easily integrable in modern machine learning models, has played a central role in the development of NLP. Embedding techniques initially focused on words, but the attention soon started to shift to other forms: from graph structures, such as knowledge bases, to other types of textual content, such as sentences and documents. This book provides a high-level synthesis of the main embedding techniques in NLP, in the broad sense. The book starts by explaining conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe) and then moves to other types of embeddings, such as word sense, sentence and document, and graph embeddings. The book also provides an overview of recent developments in contextualized representations (e.g., ELMo and BERT) and explains their potential in NLP. Throughout the book, the reader can find both essential information for understanding a certain topic from scratch and a broad overview of the most successful techniques developed in the literature."--Back cover.
ISBN:
9781636390215
Contents note:
1. Introduction -- 1.1. Semantic representation -- 1.2. Vector space models -- 1.3. The evolution path of representations -- 2. Background -- 2.1. Natural language processing fundamentals -- 2.2. Deep learning for NLP -- 2.3. Knowledge resources -- 3. Word embeddings -- 3.1. Count-based models -- 3.2. Predictive models -- 3.3. Character embeddings -- 3.4. Knowledge-enhanced word embeddings -- 3.5. Cross-lingual word embeddings -- 3.6. Evaluation -- 4. Graph embeddings -- 4.1. Node embedding -- 4.2. Knowledge-based relation embeddings -- 4.3. Unsupervised relation embeddings -- 4.4. Applications and evaluation -- 5. Sense embeddings -- 5.1. Unsupervised sense embeddings -- 5.2. Knowledge-based sense embeddings -- 5.3. Evaluation and application -- 6. Contextualized embeddings -- 6.1. The need for contextualization -- 6.2. Background : transformer model -- 6.3. Contextualized word embeddings -- 6.4. Transformer-based models : BERT -- 6.5. Extensions -- 6.6. Feature extraction and finetuning -- 6.7. Analysis and evaluation -- 7. Sentence and document embeddings -- 7.1. Unsupervised sentence embeddings -- 7.2. Supervised sentence embeddings -- 7.3. Document embeddings -- 7.4. Application and evaluation -- 8. Ethics and bias -- 8.1. Bias in word embeddings -- 8.2. Debiasing word embeddings -- 9. Conclusions.
Pilehvar, Mohammad Taher.
Embeddings in natural language processing : theory and advances in vector representations of meaning / Mohammad Taher Pilehvar, Jose Camacho-Collados. - [San Rafael, California] : Morgan & Claypool Publishers, c2021. - xvii, 157 p. : ill. ; 24 cm. - (Synthesis lectures on human language technologies ; lecture #47).
Includes bibliographical references (p. 111-155).
ISBN 9781636390215 (pbk.). ISBN 1636390218 (pbk.). ISBN 9781636390239 (hbk.). ISBN 1636390234 (hbk.).
Natural language processing (Computer science)
Artificial intelligence.
Programming languages (Electronic computers) -- Semantics.
Camacho-Collados, Jose.
LDR    03510cam a2200277 450
001    393621
010 1  $a 9781636390215 $b pbk. $d NT1808
010 1  $a 1636390218 $b pbk.
010 1  $a 9781636390239 $b hbk.
010 1  $a 1636390234 $b hbk.
010 1  $z 9781636390222 $b ebk.
100    $a 20210910d2021 k y0engy50 b
101 0  $a eng
102    $a us
105    $a a a 000yy
200 1  $a Embeddings in natural language processing $e theory and advances in vector representations of meaning $f Mohammad Taher Pilehvar, Jose Camacho-Collados.
210    $a [San Rafael, California] $c Morgan & Claypool Publishers $d c2021.
215 1  $a xvii, 157 p. $c ill. $d 24 cm.
225 2  $a Synthesis lectures on human language technologies $v lecture #47 $x 1947-4040
320    $a Includes bibliographical references (p. 111-155)
327 1  $a 1. Introduction $a 1.1. Semantic representation $a 1.2. Vector space models $a 1.3. The evolution path of representations $a 2. Background $a 2.1. Natural language processing fundamentals $a 2.2. Deep learning for NLP $a 2.3. Knowledge resources $a 3. Word embeddings $a 3.1. Count-based models $a 3.2. Predictive models $a 3.3. Character embeddings $a 3.4. Knowledge-enhanced word embeddings $a 3.5. Cross-lingual word embeddings $a 3.6. Evaluation $a 4. Graph embeddings $a 4.1. Node embedding $a 4.2. Knowledge-based relation embeddings $a 4.3. Unsupervised relation embeddings $a 4.4. Applications and evaluation $a 5. Sense embeddings $a 5.1. Unsupervised sense embeddings $a 5.2. Knowledge-based sense embeddings $a 5.3. Evaluation and application $a 6. Contextualized embeddings $a 6.1. The need for contextualization $a 6.2. Background : transformer model $a 6.3. Contextualized word embeddings $a 6.4. Transformer-based models : BERT $a 6.5. Extensions $a 6.6. Feature extraction and finetuning $a 6.7. Analysis and evaluation $a 7. Sentence and document embeddings $a 7.1. Unsupervised sentence embeddings $a 7.2. Supervised sentence embeddings $a 7.3. Document embeddings $a 7.4. Application and evaluation $a 8. Ethics and bias $a 8.1. Bias in word embeddings $a 8.2. Debiasing word embeddings $a 9. Conclusions.
330    $a "Embeddings have undoubtedly been one of the most influential research areas in Natural Language Processing (NLP). Encoding information into a low-dimensional vector representation, which is easily integrable in modern machine learning models, has played a central role in the development of NLP. Embedding techniques initially focused on words, but the attention soon started to shift to other forms: from graph structures, such as knowledge bases, to other types of textual content, such as sentences and documents. This book provides a high-level synthesis of the main embedding techniques in NLP, in the broad sense. The book starts by explaining conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe) and then moves to other types of embeddings, such as word sense, sentence and document, and graph embeddings. The book also provides an overview of recent developments in contextualized representations (e.g., ELMo and BERT) and explains their potential in NLP. Throughout the book, the reader can find both essential information for understanding a certain topic from scratch and a broad overview of the most successful techniques developed in the literature."--Back cover.
410 0  $1 2001 $a Synthesis lectures on human language technologies $v 47.
410 0  $1 2001 $a Synthesis lectures on human language technologies $v lecture #47 $x 1947-4040
606    $a Natural language processing (Computer science) $2 lc $3 66899
606    $a Artificial intelligence. $2 lc $3 27475
606    $a Programming languages (Electronic computers) $x Semantics. $2 lc $3 385626
676    $a 006.3/5 $v 23
680    $a QA76.9.N38 $b P554 2021
700 1  $a Pilehvar $b Mohammad Taher. $3 385624
702 1  $a Camacho-Collados $b Jose. $3 385625
801 0  $a cw $b CTU $c 20210910 $g AACR2
Holdings:
Barcode:
400527
Location:
Western-language stacks, 6th floor
Circulation category:
Book circulation (BOOK_CIR)
Material type:
BOOK
Call number:
006.35/P637
Use type:
General use (Normal)
Loan status:
On shelf
Holds:
0