Story Ending Generation is the task of generating a coherent and sensible ending for a given story. The key challenges of this task are i) how to obtain a good understanding of the context, ii) how to capture hidden information between the lines, and iii) how to model causal progression. However, recent machine learning models can only partially address these challenges because they lack causal entailment and consistency. The key novelty of our proposed approach is to capture the hidden story by generating transitional commonsense sentences between adjacent context sentences, which substantially enriches causal and consistent story flow. Specifically, we adopt a soft causal relation based on people’s everyday commonsense knowledge to mimic the cognitive understanding process of readers. We then enrich the story with causal reasoning and utilize dependency parsing to capture long-range text relations. Finally, we apply multi-level Graph Convolutional Networks to deliver enriched contextual information across different layers. Both automatic and human evaluation results show that our proposed model significantly improves the quality of generated story endings.
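The abstract does not include code, so the sketch below is only an illustration of the last step it describes: stacking graph-convolution layers over a dependency-parse graph so that contextual information propagates across multiple levels. All names (`GCNLayer`, `MultiLevelGCN`), dimensions, the residual connection, and the choice of PyTorch are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph-convolution step: H' = ReLU(A_hat @ H @ W).

    `adj` is assumed to be a normalized adjacency matrix built from
    dependency-parse edges between tokens (hypothetical preprocessing).
    """

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   token states, shape (batch, n_tokens, in_dim)
        # adj: adjacency,    shape (batch, n_tokens, n_tokens)
        return F.relu(self.linear(torch.bmm(adj, h)))


class MultiLevelGCN(nn.Module):
    """Stack of GCN layers; each level passes enriched context to the next."""

    def __init__(self, dim: int, num_levels: int = 3):
        super().__init__()
        self.layers = nn.ModuleList(GCNLayer(dim, dim) for _ in range(num_levels))

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            # Residual connection (an assumption) keeps lower-level context
            # available to higher levels.
            h = layer(h, adj) + h
        return h


# Usage sketch with random inputs: 4 stories, 20 tokens each, 256-dim states.
if __name__ == "__main__":
    h = torch.randn(4, 20, 256)
    adj = torch.softmax(torch.randn(4, 20, 20), dim=-1)  # stand-in for a parse graph
    enriched = MultiLevelGCN(dim=256, num_levels=3)(h, adj)
    print(enriched.shape)  # torch.Size([4, 20, 256])
```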