Journal of Beijing Jiaotong University (《北京交通大学学报》)

Face Recognition Based on Deep Feature Distillation


Abstract

Deep learning is widely used in face recognition systems owing to its powerful feature-representation ability. However, the high computational complexity of model inference and the high dimensionality of the feature representation reduce the efficiency of feature extraction and feature retrieval respectively, which hinders the practical deployment of face recognition systems. To address these issues, this paper proposes a face recognition method based on deep feature distillation, which uniformly compresses the deep network parameters and the feature dimension by distilling knowledge from a large teacher network and domain-related data via multi-task deep learning. Combining feature regression with face classification, the method uses a pre-trained large deep network as a teacher to guide the training of a small network, transferring its knowledge to a lightweight student network and thereby achieving efficient feature extraction. Experimental results on the LFW benchmark show that, with a recognition-accuracy drop of 3.7% relative to the teacher model, the student model is compressed to about 2×10^7 parameters and a 128-dimensional feature, achieving a 7.1× reduction in model parameters, a 32× reduction in feature dimension, and a 95.1% reduction in inference complexity. These results demonstrate the effectiveness and efficiency of the proposed method.
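The multi-task objective described above (feature regression toward the teacher's embedding plus face classification) can be sketched as a combined loss. This is a minimal NumPy illustration, not the paper's implementation: the weighting `lam` is an assumed hyperparameter, and the teacher embedding is assumed to be projected to the student's 128-dimensional space before regression.

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_feat, teacher_feat, logits, labels, lam=1.0):
    """Multi-task loss: feature regression to the teacher + face classification.

    student_feat: (N, 128) student embedding (compressed dimension)
    teacher_feat: (N, 128) teacher embedding, assumed projected to 128-d
    logits:       (N, C)   identity-classification logits from the student
    labels:       (N,)     ground-truth identity labels
    lam:          assumed weight balancing the two tasks
    """
    # Feature regression: mean squared error to the teacher's embedding.
    l_reg = np.mean(np.sum((student_feat - teacher_feat) ** 2, axis=1))
    # Face classification: cross-entropy over identity labels.
    p = softmax(logits)
    l_cls = -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))
    return l_cls + lam * l_reg
```

Minimizing the regression term pushes the 128-dimensional student embedding to mimic the teacher's feature space, while the classification term keeps the embedding discriminative for identities, which is how the compressed student retains most of the teacher's recognition accuracy.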
