Journal of Medical Imaging and Radiation Sciences

Intraobserver Variability: Should We Worry?



Abstract

Many papers have identified concerns about intraobserver variability of repeat outlining by the same clinician. These variations in individual performance in turn make it challenging to determine values for interobserver variability, since these depend largely on the assumption that each observer's outline is accurate. Aside from the concerns about inaccuracy, variability is a potential component of the planning target volume margin, and thus minimizing it has the potential to reduce normal tissue dose and morbidity. One accepted measure of intraobserver agreement since 1960 [1] has been the kappa (κ) correlation coefficient, which varies from 0 (agreement by chance) to 1 (full agreement). The accepted subdivisions of kappa [2] are "excellent" (0.81-1.00), "good" (0.61-0.80), "moderate" (0.41-0.60), "fair" (0.21-0.40), and "poor" (0-0.20). It is clear from the evidence base that kappa is common to many aspects of medical practice. Despite the kappa assumptions concerning observer independence [3], it has been used extensively to report both intraobserver and interobserver variability in the interpretation of CT imaging data. Table 1 summarizes the results of these studies from the last 10 years.
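As a minimal sketch of the statistic described above (not taken from the paper), the snippet below computes Cohen's kappa as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement between two sets of ratings and p_e is the agreement expected by chance from the marginal label frequencies, and then maps the result onto the interpretation bands quoted in the abstract. The function names and the example ratings are illustrative assumptions, not material from the article.

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e) for two equal-length rating lists."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)

    # Observed agreement: fraction of items given the same label both times.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement expected from the marginal label frequencies.
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(ratings_a) | set(ratings_b))

    return (p_o - p_e) / (1 - p_e) if p_e != 1 else 1.0

def interpret(kappa):
    """Map kappa onto the bands quoted in the abstract."""
    bands = [(0.81, "excellent"), (0.61, "good"), (0.41, "moderate"),
             (0.21, "fair"), (0.0, "poor")]
    return next((label for lower, label in bands if kappa >= lower),
                "poor (below chance)")

# Hypothetical repeat readings by the same observer (e.g. CT interpretations).
first_read  = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg"]
second_read = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "pos"]

k = cohen_kappa(first_read, second_read)
print(f"kappa = {k:.2f} ({interpret(k)})")  # kappa = 0.50 (moderate)
```

With these illustrative data, 6 of 8 readings agree (p_o = 0.75) while chance agreement is 0.5, giving κ = 0.50, i.e. "moderate" agreement under the subdivisions above.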
