IEEE Annual Computer Software and Applications Conference

Long Short-Term Memory Neural Networks for Artificial Dialogue Generation



Abstract

This paper investigates both user and system modeling to extend an existing corpus of human-machine dialogue data with simulated/artificial dialogues. To simulate and generate such artificial dialogues, a long short-term memory (LSTM) neural network system is proposed. The LSTM neural network is an Encoder-Decoder built on a bidirectional multilayer architecture, where the input sequence to the encoder is a list of user dialogue acts and the decoder output sequence is a list of system dialogue acts. All dialogue acts are defined at the intent level and are extracted from the TownInfo corpus of tourist information provided by the FP7 CLASSiC Project funded by the European Union. The proposed LSTM configuration is compared to a fully connected Hidden Markov Model (HMM) based architecture in which the states are the user dialogue acts and the observations are the system dialogue acts. Across the experiments carried out on the TownInfo corpus, the LSTM-based system outperforms the HMM-based system.
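Below is a minimal sketch, assuming a PyTorch implementation, of the kind of bidirectional multilayer LSTM Encoder-Decoder the abstract describes: the encoder reads a sequence of user dialogue-act IDs and the decoder emits a sequence of system dialogue-act IDs. The class name, vocabulary sizes, hidden size, and layer count are illustrative assumptions, not the authors' actual configuration.

```python
# Sketch only: a bidirectional multilayer LSTM encoder-decoder mapping
# user dialogue-act sequences to system dialogue-act sequences.
import torch
import torch.nn as nn

class DialogueActSeq2Seq(nn.Module):
    def __init__(self, n_user_acts, n_system_acts, emb_dim=64, hidden=128, layers=2):
        super().__init__()
        self.user_emb = nn.Embedding(n_user_acts, emb_dim)
        self.sys_emb = nn.Embedding(n_system_acts, emb_dim)
        # Bidirectional multilayer encoder over the user dialogue acts.
        self.encoder = nn.LSTM(emb_dim, hidden, num_layers=layers,
                               bidirectional=True, batch_first=True)
        # Decoder over system dialogue acts; its hidden size matches the
        # concatenated forward/backward encoder states.
        self.decoder = nn.LSTM(emb_dim, 2 * hidden, num_layers=layers,
                               batch_first=True)
        self.out = nn.Linear(2 * hidden, n_system_acts)

    def forward(self, user_acts, system_acts_in):
        _, (h, c) = self.encoder(self.user_emb(user_acts))

        def merge(state):
            # Concatenate the forward/backward directions of each encoder
            # layer into one decoder initial state per layer.
            return torch.cat([state[0::2], state[1::2]], dim=-1)

        dec_out, _ = self.decoder(self.sys_emb(system_acts_in), (merge(h), merge(c)))
        return self.out(dec_out)  # per-step logits over system dialogue acts

# Toy usage: a batch of 4 dialogues, 10 user acts in, 10 system acts out.
model = DialogueActSeq2Seq(n_user_acts=30, n_system_acts=25)
user = torch.randint(0, 30, (4, 10))
sys_in = torch.randint(0, 25, (4, 10))
print(model(user, sys_in).shape)  # torch.Size([4, 10, 25])
```

Training such a sketch would minimize cross-entropy between predicted and reference system acts; the HMM baseline mentioned in the abstract would instead treat the user dialogue acts as hidden states emitting the system dialogue acts as observations.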
