International Conference on Disability, Virtual Reality and Associated Technologies

Detection and computational analysis of psychological signals using a virtual human interviewing agent



Abstract

It has long been recognized that facial expressions, body posture/gestures and vocal parameters play an important role in human communication and the implicit signalling of emotion. Recent advances in low-cost computer vision and behavioral sensing technologies can now be applied to making meaningful inferences about user state when a person interacts with a computational device. Effective use of this additional information could promote human interaction with virtual human (VH) agents in ways that enhance diagnostic assessment. The same technology could also be leveraged to improve engagement in teletherapy between remote patients and care providers. This paper focuses on our current research in these areas within the DARPA-funded "Detection and Computational Analysis of Psychological Signals" project, with specific attention to the SimSensei application use case. SimSensei is a virtual human interaction platform that is able to sense and interpret real-time audiovisual behavioral signals from users interacting with the system. It is specifically designed for health care support and leverages years of virtual human research and development at USC-ICT. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts to the state and inferred intent of the user through analysis of behavioral signals gleaned from facial expressions, body gestures and vocal parameters. Akin to how non-verbal behavioral signals shape human-to-human interaction and communication, SimSensei aims to capture and infer from user non-verbal communication to improve engagement between a VH and a user. The system can also quantify and interpret sensed behavioral signals longitudinally, which can be used to inform diagnostic assessment within a clinical context.
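The abstract describes a loop in which per-channel behavioral cues (facial expressions, body gestures, vocal parameters) are fused into an inferred user state that drives the virtual human's reaction. The following is a minimal sketch of that idea only; the feature names, weights, and reaction labels are illustrative assumptions, not the actual SimSensei model.

```python
from dataclasses import dataclass

# Hypothetical per-frame feature container; the real SimSensei feature
# set is not specified in the abstract.
@dataclass
class BehavioralFrame:
    smile_intensity: float   # facial-expression cue, 0..1
    gaze_aversion: float     # fraction of time looking away, 0..1
    gesture_activity: float  # body-movement energy, 0..1
    vocal_energy: float      # normalized loudness, 0..1

def infer_engagement(frame: BehavioralFrame) -> float:
    """Fuse the channel cues into one engagement score in [0, 1].

    A weighted average stands in for whatever learned model the real
    system uses; the weights below are purely illustrative.
    """
    score = (0.3 * frame.smile_intensity
             + 0.2 * frame.gesture_activity
             + 0.3 * frame.vocal_energy)
    # Looking away dampens the inferred engagement.
    score *= 1.0 - 0.5 * frame.gaze_aversion
    return max(0.0, min(1.0, score))

def choose_reaction(engagement: float) -> str:
    """Map inferred state to a (hypothetical) virtual-human behavior."""
    if engagement < 0.3:
        return "lean_in_and_prompt"   # try to re-engage a withdrawn user
    if engagement < 0.7:
        return "nod_and_listen"
    return "mirror_positive_affect"

if __name__ == "__main__":
    frame = BehavioralFrame(smile_intensity=0.6, gaze_aversion=0.2,
                            gesture_activity=0.4, vocal_energy=0.5)
    print(choose_reaction(infer_engagement(frame)))
```

In a real deployment this loop would run continuously over sensed audiovisual streams, and the longitudinal scores mentioned in the abstract would be the time series of such inferred states.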
