Venue: International Joint Conference on Natural Language Processing; Conference on Empirical Methods in Natural Language Processing

Rotate King to get Queen: Word Relationships as Orthogonal Transformations in Embedding Space



Abstract

A notable property of word embeddings is that word relationships can exist as linear substructures in the embedding space. For example, gender corresponds to woman − man and queen − king. This, in turn, allows word analogies to be solved arithmetically: king − man + woman ≈ queen. This property is notable because it suggests that models trained on word embeddings can easily learn such relationships as geometric translations. However, there is no evidence that models exclusively represent relationships in this manner. We document an alternative way in which downstream models might learn these relationships: orthogonal and linear transformations. For example, given a translation vector for gender, we can find an orthogonal matrix R, representing a rotation and reflection, such that R(king) ≈ queen and R(man) ≈ woman. Analogical reasoning using orthogonal transformations is almost as accurate as using vector arithmetic; using linear transformations is more accurate than both. Our findings suggest that these transformations can be as good a representation of word relationships as translation vectors.
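The orthogonal matrix R described in the abstract can be found with the classical orthogonal Procrustes solution (an SVD), which the paper's setup corresponds to. The sketch below uses toy random vectors in place of real embeddings (the data and dimensions are hypothetical): it builds target vectors by a hidden rotation/reflection Q, then recovers an orthogonal R such that the sources map onto the targets, mirroring R(king) ≈ queen and R(man) ≈ woman.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 50  # embedding dimension (hypothetical)

# Source vectors: rows play the roles of "king" and "man".
A = rng.normal(size=(2, d))

# Construct targets via a hidden orthogonal map Q (rotation/reflection),
# so an exact orthogonal solution exists. Rows play "queen" and "woman".
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
B = A @ Q

def procrustes(A, B):
    """Orthogonal R minimizing ||A @ R - B||_F (orthogonal Procrustes)."""
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

R = procrustes(A, B)
print(np.allclose(A @ R, B))  # → True: R sends king→queen, man→woman
```

Because R is constrained to be orthogonal, it preserves vector norms and angles, which is what distinguishes this representation of a relationship from an arbitrary linear map or a translation vector.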
