Australian & New Zealand Journal of Statistics

Fast and approximate exhaustive variable selection for generalised linear models with APES



Abstract

We present APproximated Exhaustive Search (APES), which enables fast and approximate exhaustive variable selection in generalised linear models (GLMs). While exhaustive variable selection remains the gold standard in many model selection contexts, traditional exhaustive variable selection suffers from computational feasibility issues. More precisely, there is often a high cost associated with computing maximum likelihood estimates (MLEs) for all subsets of a GLM. Efficient algorithms for exhaustive searches exist for linear models, most notably the leaps-and-bounds algorithm and, more recently, the mixed integer optimisation (MIO) algorithm. The APES method learns from observational weights in a generalised linear regression super-model and reformulates the GLM problem as a linear regression problem. In this way, APES can approximate a true exhaustive search in the original GLM space. Where a full exhaustive search is not computationally feasible, we propose a best-subset search that also closely approximates a true exhaustive search. APES is made available both as a standalone R package and as part of the existing mplot package.
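
The short R sketch below illustrates, under stated assumptions, the kind of approximation the abstract describes: the full GLM super-model is fitted once, its IRLS working weights and working responses define a weighted linear regression, and an exhaustive best-subset search is then run on that linear surrogate. The simulated data, variable names, and the use of leaps::regsubsets are illustrative assumptions for this sketch and do not reproduce the apes/mplot package interface.

## A minimal sketch of the approximation idea, not the apes package API.
library(leaps)

set.seed(1)
n <- 200; p <- 8
X <- matrix(rnorm(n * p), n, p, dimnames = list(NULL, paste0("x", 1:p)))
y <- rbinom(n, 1, plogis(X[, 1] - 0.5 * X[, 3]))

full <- glm(y ~ X, family = binomial)      # the GLM super-model
w <- full$weights                          # IRLS working weights
z <- full$linear.predictors +
     residuals(full, type = "working")     # IRLS working response

## Exhaustive subset search on the weighted linear surrogate of the GLM
search <- regsubsets(x = X, y = z, weights = w,
                     nvmax = p, method = "exhaustive")
summary(search)$which                      # best subset of each size

Running the exhaustive search on the linear surrogate rather than refitting every candidate GLM is what makes the search fast; the weighted least-squares fits stand in for the GLM maximum likelihood fits.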

