IEEE International Conference on Rehabilitation Robotics

An Egocentric Vision based Assistive Co-robot



Abstract

We present the prototype of an egocentric vision based assistive co-robot system. In this co-robot system, the user wears a pair of glasses with a forward-looking camera and is actively engaged in the robot's control loop during navigational tasks. The egocentric vision glasses serve two purposes. First, they provide the visual input used to request that the robot find a certain object in the environment. Second, the motion patterns computed from the egocentric video for a specific set of head movements are exploited to guide the robot toward the object. This is especially helpful for quadriplegic individuals who lack the hand function needed to interact with and control the robot through other modalities (e.g., a joystick). In our co-robot system, when the robot fails to complete the object-finding task within a pre-specified time window, it actively solicits guidance from the user. The user can then use the egocentric vision based gesture interface to orient the robot toward the object. The robot then navigates autonomously in that direction until it finds the object. Our experiments validate the efficacy of this closed-loop design in engaging the human in the loop.
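To make the described closed-loop behaviour concrete, the sketch below is one possible way to structure the timeout-based user solicitation and head-gesture re-orientation; it is not the authors' code, and all helper names (find_object_in_frame, classify_head_gesture, Robot, Glasses) are hypothetical placeholders standing in for the real perception and drive stack.

```python
import random
import time

SEARCH_WINDOW_S = 10.0  # pre-specified time window before soliciting the user


# --- Placeholder components (stand-ins for the real perception/robot stack) ---

def find_object_in_frame(frame, target_label):
    """Hypothetical object detector: returns a detection dict or None."""
    return {"label": target_label} if random.random() < 0.02 else None


def classify_head_gesture(motion_pattern):
    """Hypothetical classifier mapping egocentric motion patterns to commands."""
    return random.choice(["turn_left", "turn_right"])


class Robot:
    """Toy robot interface; a real system would wrap actual drive/vision hardware."""
    def capture_frame(self): return object()
    def announce(self, msg): print("[robot]", msg)
    def rotate(self, deg): print(f"[robot] rotating {deg:+d} deg")
    def navigate_to(self, det): print("[robot] navigating to", det["label"])


class Glasses:
    """Toy egocentric-camera interface worn by the user."""
    def read_motion_pattern(self): return []


# --- Closed-loop search with user-in-the-loop fallback -----------------------

def co_robot_find(robot, glasses, target_label):
    while True:
        deadline = time.monotonic() + SEARCH_WINDOW_S

        # Autonomous search until the pre-specified window expires.
        while time.monotonic() < deadline:
            det = find_object_in_frame(robot.capture_frame(), target_label)
            if det is not None:
                robot.navigate_to(det)   # drive toward the object until reached
                return det
            time.sleep(0.1)

        # Timeout: actively solicit a head-gesture command from the user,
        # re-orient, then resume autonomous search.
        robot.announce("Please indicate a direction with a head gesture.")
        gesture = classify_head_gesture(glasses.read_motion_pattern())
        robot.rotate(+30 if gesture == "turn_left" else -30)


if __name__ == "__main__":
    co_robot_find(Robot(), Glasses(), "cup")
```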

