Deep Learning Approaches to Text Production. Shashi Narayan


       5.3.2 Graph-Based Triple Encoder for RDF Generation

       5.3.3 Graph Convolutional Networks as Graph Encoders

       5.4 Summary

       6 Modelling Task-Specific Communication Goals

       6.1 Task-Specific Knowledge for Content Selection

       6.1.1 Selective Encoding to Capture Salient Information

       6.1.2 Bottom-Up Copy Attention for Content Selection

       6.1.3 Graph-Based Attention for Salient Sentence Detection

       6.1.4 Multi-Instance and Multi-Task Learning for Content Selection

       6.2 Optimising Task-Specific Evaluation Metrics with Reinforcement Learning

       6.2.1 The Pitfalls of Cross-Entropy Loss

       6.2.2 Text Production as a Reinforcement Learning Problem

       6.2.3 Reinforcement Learning Applications

       6.3 User Modelling in Neural Conversational Models

       6.4 Summary

       PART III Data Sets and Conclusion

       7 Data Sets and Challenges

       7.1 Data Sets for Data-to-Text Generation

       7.1.1 Generating Biographies from Structured Data

       7.1.2 Generating Entity Descriptions from Sets of RDF Triples

       7.1.3 Generating Summaries of Sports Games from Box-Score Data

       7.2 Data Sets for Meaning Representations to Text Generation

       7.2.1 Generating from Abstract Meaning Representations

       7.2.2 Generating Sentences from Dependency Trees

       7.2.3 Generating from Dialogue Moves

       7.3 Data Sets for Text-to-Text Generation

       7.3.1 Summarisation

       7.3.2 Simplification

       7.3.3 Compression

       7.3.4 Paraphrasing

       8 Conclusion

       8.1 Summarising

       8.2 Overview of Covered Neural Generators

       8.3 Two Key Issues with Neural NLG

       8.4 Challenges

       8.5 Recent Trends in Neural NLG

       Bibliography

       Authors’ Biographies

       List of Figures

       1.1 Input contents and communicative goals for text production

       1.2 Shallow dependency tree from the Generation Challenge surface realisation task

       1.3 Example input from the SemEval AMR-to-Text Generation Task

       1.4 E2E dialogue move and text

       1.5 Data-to-Text example input and output pair

       2.1 A RoboCup input and output pair example

       2.2 Data-to-text: A pipeline architecture

       2.3 Simplifying a sentence

       2.4 A sentence/compression pair

       2.5 Abstractive vs. extractive summarisation

       2.6 A document/summary pair from the CNN/DailyMail data set

       2.7 Key modules in pre-neural approaches to text production

       3.1 Deep learning for text generation

       3.2 Feed-forward neural network or multi-layer perceptron
