
Convex Two-Layer Modeling


Abstract

Latent variable prediction models, such as multi-layer networks, impose auxiliary latent variables between inputs and outputs to allow automatic inference of implicit features useful for prediction. Unfortunately, such models are difficult to train because inference over latent variables must be performed concurrently with parameter optimization, creating a highly non-convex problem. Instead of proposing another local training method, we develop a convex relaxation of hidden-layer conditional models that admits global training. Our approach extends current convex modeling approaches to handle two nested nonlinearities separated by a non-trivial adaptive latent layer. The resulting methods are able to acquire two-layer models that cannot be represented by any single-layer model over the same features, while improving training quality over local heuristics.
机译:潜在的可变预测模型,例如多层网络,在输入和输出之间施加辅助潜变量,以允许自动推断用于预测的隐式功能。不幸的是,这种模型难以训练,因为必须与参数优化 - 创建高度非凸面问题同时进行潜在变量的推断。我们不是提出另一种本地训练方法,我们开发了凸出的隐藏层条件模型,承认全球培训。我们的方法扩展了电流凸面建模方法,以处理由非普通自适应潜在潜在的两个嵌套非线性分开。由此产生的方法能够在相同的功能上获取不能由任何单层模型表示的两层模型,同时提高了当地启发式的培训质量。

