Self-Knowledge Distillation Compression Algorithm Based on Filter Attenuation

DOI:

Authors: Xiong Liyan, Huang Jiawen, Huang Xiaohui, Chen Qingsen

Affiliation:

School of Information and Software Engineering, East China Jiaotong University, Nanchang 330013, China

Author biography:

Corresponding author:

Xiong Liyan (1968—), female, professor and master's supervisor; research interest: traffic big data. E-mail: 445935939@qq.com.

CLC number:

TP391.4

Fund projects:

National Natural Science Foundation of China (62067002, 61967006, 62062033); Science and Technology Project of the Jiangxi Provincial Department of Transportation (2022X0040)

Abstract:

[Objective] To address the severe performance loss after model pruning, a compression algorithm based on filter attenuation and self-knowledge distillation is proposed. [Method] The method uses a filter attenuation mechanism to preserve the information of redundant filters, thereby narrowing the difference between the model before and after pruning and reducing the performance loss caused by pruning. Meanwhile, an annealing attenuation function is introduced during the pruning process so that the attenuation of the filters changes dynamically, enabling a fast and efficient search for the optimal substructure of the model and improving its convergence speed. Additionally, self-knowledge distillation is employed to transfer knowledge between the pre-trained model and the compressed model. [Result] The results show that the compression algorithm improves model accuracy by 0.12 percentage points while reducing the FLOPs of the VGG-16 model by 37.3%. [Conclusion] The method provides a more stable and efficient model compression approach for convolutional neural networks.
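The abstract only outlines the pruning mechanism, so the following is a minimal PyTorch sketch of the core idea: the least important filters are attenuated under an annealing schedule rather than deleted outright. The L1-norm importance criterion, the exponential form of `attenuation_factor`, and the constant `k` are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


def attenuation_factor(step: int, total_steps: int, k: float = 5.0) -> float:
    """Assumed exponential annealing schedule: close to 1 early in training
    (filters kept almost intact) and decaying toward 0 as pruning finishes."""
    return float(torch.exp(torch.tensor(-k * step / total_steps)))


@torch.no_grad()
def attenuate_filters(conv: nn.Conv2d, prune_ratio: float, factor: float) -> None:
    """Scale the least important filters of a Conv2d layer by `factor` instead
    of removing them, so their information decays gradually and the gap between
    the pruned and unpruned model stays small.

    Importance is taken as the L1 norm of each filter; the paper's exact
    criterion is not stated in the abstract.
    """
    weight = conv.weight                             # (out_ch, in_ch, kH, kW)
    importance = weight.abs().sum(dim=(1, 2, 3))     # L1 norm per output filter
    n_prune = int(prune_ratio * weight.size(0))
    if n_prune == 0:
        return
    prune_idx = torch.argsort(importance)[:n_prune]  # least important filters
    weight[prune_idx] *= factor                      # attenuate, do not delete
    if conv.bias is not None:
        conv.bias[prune_idx] *= factor
```

In such a scheme the factor would be recomputed from `attenuation_factor` at each step or epoch of fine-tuning, and filters whose weights have effectively decayed to zero are removed at the end, which is where a FLOPs reduction such as the reported 37.3% on VGG-16 would come from.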

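For the self-knowledge distillation step, the abstract states only that knowledge is transferred from the pre-trained model to the compressed model. A standard soft-target (Hinton-style) distillation loss is one common way to implement this; the temperature `T` and weight `alpha` below are assumed hyper-parameters, not values reported in the paper.

```python
import torch
import torch.nn.functional as F


def self_distillation_loss(student_logits: torch.Tensor,
                           teacher_logits: torch.Tensor,
                           targets: torch.Tensor,
                           T: float = 4.0,
                           alpha: float = 0.5) -> torch.Tensor:
    """Soft-target distillation loss used to transfer knowledge from the
    pre-trained (teacher) model to its pruned/attenuated (student) copy."""
    # Hard-label cross-entropy on the compressed model's predictions.
    ce = F.cross_entropy(student_logits, targets)
    # Soft-label KL divergence against the frozen pre-trained model.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * kd + (1.0 - alpha) * ce
```

Because the teacher is simply the frozen pre-trained copy of the same network, no separate teacher architecture is needed, which is what makes the distillation "self-" knowledge distillation.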
Cite this article:

Xiong Liyan, Huang Jiawen, Huang Xiaohui, Chen Qingsen. Self-Knowledge Distillation Compression Algorithm Based on Filter Attenuation[J]. Journal of East China Jiaotong University, 2024, 41(6): 112-120.

History
  • Received: 2024-03-15
  • Revised:
  • Accepted:
  • Published online: 2025-02-10
  • Publication date: