Role Play Dialogue Aware Language Models Based on Conditional Hierarchical Recurrent Encoder-Decoder
Ryo Masumura, Tomohiro Tanaka, Atsushi Ando, Hirokazu Masataki and Yushi Aono
Abstract:
We propose role play dialogue-aware language models (RPDA-LMs) that can leverage interactive contexts in multi-turn role play dialogues for estimating the generative probability of words. Our motivation is to improve automatic speech recognition (ASR) performance in role play dialogues such as contact center and service center dialogues. Although long short-term memory recurrent neural network based language models (LSTM-RNN-LMs) can capture long-range contexts within an utterance, they cannot utilize the sequential interactive information exchanged between speakers in multi-turn dialogues. Our idea is to explicitly leverage the speaker roles of individual utterances, which are often available in role play dialogues, for neural language modeling. RPDA-LMs are represented as a generative model conditioned on the role sequence of a target role play dialogue. We construct RPDA-LMs by extending hierarchical recurrent encoder-decoder modeling to handle the role information. Our ASR evaluation on contact center dialogues demonstrates that RPDA-LMs outperform LSTM-RNN-LMs and document-context LMs in terms of both perplexity and word error rate. In addition, we verify the effectiveness of explicitly taking interactive contexts into consideration.
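To make the architecture concrete, below is a minimal sketch of what a role-conditioned hierarchical recurrent LM could look like in PyTorch. The module layout, dimensions, and the exact conditioning scheme (concatenating role embeddings at both the dialogue-level encoder and the word-level decoder) are assumptions inferred from the abstract, not the authors' implementation.

# Minimal sketch of a role-conditioned hierarchical recurrent LM.
# All names, dimensions, and the conditioning scheme are illustrative
# assumptions based on the abstract, not the paper's actual code.
import torch
import torch.nn as nn

class RolePlayDialogueLM(nn.Module):
    def __init__(self, vocab_size, num_roles, emb_dim=256, hid_dim=512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.role_emb = nn.Embedding(num_roles, emb_dim)
        # Utterance-level encoder: summarizes each utterance into a vector.
        self.utt_encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Dialogue-level encoder: tracks inter-utterance context,
        # conditioned on each utterance's speaker role.
        self.dlg_encoder = nn.LSTM(hid_dim + emb_dim, hid_dim, batch_first=True)
        # Word-level decoder: predicts the next word given the dialogue
        # context so far and the current utterance's role.
        self.decoder = nn.LSTM(emb_dim + hid_dim + emb_dim, hid_dim,
                               batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, utterances, roles):
        """utterances: list of LongTensors (1, T_i); roles: LongTensor (1, N).
        Returns per-utterance logits; logits at step t score word t+1
        (teacher forcing)."""
        role_vecs = self.role_emb(roles)            # (1, N, emb_dim)
        dlg_state = None
        logits = []
        for i, utt in enumerate(utterances):
            # Dialogue context so far (zeros before the first utterance).
            ctx = (dlg_state[0][-1] if dlg_state is not None
                   else torch.zeros(1, self.out.in_features))
            r = role_vecs[:, i, :]                  # current speaker's role
            w = self.word_emb(utt)                  # (1, T_i, emb_dim)
            T = w.size(1)
            # Condition every decoding step on context and role.
            dec_in = torch.cat(
                [w, ctx.unsqueeze(1).expand(-1, T, -1),
                 r.unsqueeze(1).expand(-1, T, -1)], dim=-1)
            dec_out, _ = self.decoder(dec_in)
            logits.append(self.out(dec_out))
            # Encode the finished utterance and update the dialogue state.
            _, (h, _) = self.utt_encoder(w)
            dlg_in = torch.cat([h[-1], r], dim=-1).unsqueeze(1)
            _, dlg_state = self.dlg_encoder(dlg_in, dlg_state)
        return logits

Under this reading, the role sequence conditions both how the dialogue history is accumulated and how each next word is scored, which is what lets the model exploit the speaker-alternation structure that a flat LSTM-RNN-LM cannot see.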
Cite as: Masumura, R., Tanaka, T., Ando, A., Masataki, H., Aono, Y. (2018) Role Play Dialogue Aware Language Models Based on Conditional Hierarchical Recurrent Encoder-Decoder. Proc. Interspeech 2018, 1259-1263, DOI: 10.21437/Interspeech.2018-2185.
BiBTeX Entry:
@inproceedings{Masumura2018,
  author={Ryo Masumura and Tomohiro Tanaka and Atsushi Ando and Hirokazu Masataki and Yushi Aono},
  title={Role Play Dialogue Aware Language Models Based on Conditional Hierarchical Recurrent Encoder-Decoder},
  year={2018},
  booktitle={Proc. Interspeech 2018},
  pages={1259--1263},
  doi={10.21437/Interspeech.2018-2185},
  url={http://dx.doi.org/10.21437/Interspeech.2018-2185}
}