Workshop on Biomedical Natural Language Processing 2015

Shallow Training is cheap but is it good enough? Experiments with Medical Fact Coding



Abstract

A typical NLP system for medical fact coding uses multiple layers of supervision involving fact attributes, relations, and coding. Training such a system requires an expensive and laborious annotation process spanning all layers of the pipeline. In this work, we investigate the feasibility of a shallow medical coding model that trains only on fact annotations, disregarding fact attributes and relations, potentially saving considerable annotation time and cost. Our results show that the shallow system, despite using less supervision, is only 1.4 F1 points behind the multi-layered system on Disorder facts and, contrary to expectation, improves over it by about 2.4 F1 points on Procedure facts. Further, our experiments show that training the shallow system using only sentence-level fact labels with no span information has no negative effect on performance, indicating further cost savings through weak supervision.


