Title: Deep Learning Approaches to Text Production
Author: Shashi Narayan
Publisher: Ingram
Genre: Programs
Series: Synthesis Lectures on Human Language Technologies
ISBN: 9781681738215
5.3.3 Graph Convolutional Networks as Graph Encoders
6 Modelling Task-Specific Communication Goals
6.1 Task-Specific Knowledge for Content Selection
6.1.1 Selective Encoding to Capture Salient Information
6.1.2 Bottom-Up Copy Attention for Content Selection
6.1.3 Graph-Based Attention for Salient Sentence Detection
6.1.4 Multi-Instance and Multi-Task Learning for Content Selection
6.2 Optimising Task-Specific Evaluation Metric with Reinforcement Learning
6.2.1 The Pitfalls of Cross-Entropy Loss
6.2.2 Text Production as a Reinforcement Learning Problem
6.2.3 Reinforcement Learning Applications
6.3 User Modelling in Neural Conversational Model
PART III Data Sets and Conclusion
7.1 Data Sets for Data-to-Text Generation
7.1.1 Generating Biographies from Structured Data
7.1.2 Generating Entity Descriptions from Sets of RDF Triples
7.1.3 Generating Summaries of Sports Games from Box-Score Data
7.2 Data Sets for Meaning Representations to Text Generation
7.2.1 Generating from Abstract Meaning Representations
7.2.2 Generating Sentences from Dependency Trees
7.2.3 Generating from Dialogue Moves
7.3 Data Sets for Text-to-Text Generation
8.2 Overview of Covered Neural Generators
8.3 Two Key Issues with Neural NLG
8.5 Recent Trends in Neural NLG
List of Figures
1.1 Input contents and communicative goals for text production
1.2 Shallow dependency tree from generation challenge surface realisation task
1.3 Example input from the SemEval AMR-to-Text Generation Task
1.5 Data-to-Text example input and output pair
2.1 A Robocup input and output pair example
2.2 Data to text: A pipeline architecture
2.4 A Sentence/Compression pair
2.5 Abstractive vs. extractive summarisation
2.6 A document/summary pair from the CNN/DailyMail data set
2.7 Key modules in pre-neural approaches to text production
3.1 Deep learning for text generation