Pattern Recognition Letters

Learning off-line vs. on-line models of interactive multimodal behaviors with recurrent neural networks


Abstract

Human interactions are driven by multi-level perception-action loops. Interactive behavioral models are typically built using rule-based methods or statistical approaches such as Hidden Markov Models (HMM), Dynamic Bayesian Networks (DBN), etc. In this paper, we present the multimodal interactive data and our behavioral model based on recurrent neural networks, namely Long Short-Term Memory (LSTM) and Bidirectional LSTM (BiLSTM) models. Speech, gaze and gestures of two subjects involved in a collaborative task are jointly modeled here. The results show that the proposed deep neural networks are more effective than the conventional statistical methods in generating appropriate overt actions for both on-line and off-line prediction tasks. (C) 2017 Elsevier B.V. All rights reserved.
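
The abstract gives no implementation details, so the following is only a minimal sketch of the contrast it draws: a causal LSTM that can run on-line (past context only) versus a BiLSTM that needs the full sequence and is therefore restricted to off-line prediction. The feature dimension, number of action classes and layer sizes are illustrative assumptions, not values from the paper (PyTorch).

import torch
import torch.nn as nn

class MultimodalBehaviorModel(nn.Module):
    # Per-frame prediction of overt actions from joint speech/gaze/gesture
    # features of the two interacting subjects (hypothetical feature layout).
    # bidirectional=False -> causal LSTM, usable for on-line prediction
    # bidirectional=True  -> BiLSTM, needs the whole sequence (off-line only)
    def __init__(self, feat_dim=48, hidden_dim=64, n_actions=10, bidirectional=False):
        super().__init__()
        self.rnn = nn.LSTM(feat_dim, hidden_dim, batch_first=True,
                           bidirectional=bidirectional)
        out_dim = hidden_dim * (2 if bidirectional else 1)
        self.head = nn.Linear(out_dim, n_actions)

    def forward(self, x):          # x: (batch, time, feat_dim)
        h, _ = self.rnn(x)         # h: (batch, time, out_dim)
        return self.head(h)        # per-frame action logits

# Hypothetical usage: 200 frames of concatenated multimodal features.
online_model = MultimodalBehaviorModel(bidirectional=False)
offline_model = MultimodalBehaviorModel(bidirectional=True)
logits = offline_model(torch.randn(1, 200, 48))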
