Progressive Cross-Learning for Single-Image Super-Resolution
DOI:
Author:
Affiliation:

1. School of Information and Software Engineering, East China Jiaotong University; 2. School of Science, East China Jiaotong University

Author biography:

Corresponding author:

CLC number:

Fund project:

National Natural Science Foundation of China (Nos. 62501237, 62462033, 62163016); Natural Science Foundation of Jiangxi Province (Nos. 20252BAC200016, 20242BAB25092); Jiangxi Province Special Project for Cultivating Early-Career Young Science and Technology Talents (No. 20244BCE52163)




    Abstract:

    Single-image super-resolution is a challenging ill-posed problem. Current methods based on convolutional neural networks face performance bottlenecks, while Transformer models, although they improve performance through global modeling, are limited by high computational complexity and struggle to balance performance with efficiency. We therefore propose a progressive Cross-Learning Network (CLNet) that integrates an ultra-dense dilated residual block (UD2B) with an enhanced Transformer block (ETB) in a synergistic progressive architecture. UD2B aggregates high- and low-frequency features through multi-scale dilated convolutions to enhance local representations, while ETB establishes long-range dependencies via cross-channel self-attention to capture global context. We also introduce a cross-feature and cross-level attention fusion block (C2AFB) that fuses multi-level features effectively through adaptive learning. Experiments on multiple benchmark datasets show that CLNet outperforms existing state-of-the-art methods in both objective metrics and visual perceptual quality, achieving a favorable balance between performance and efficiency.
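The abstract names the two generic mechanisms behind UD2B and ETB — multi-scale dilated convolution with a residual connection, and cross-channel self-attention — but not their implementation. Purely as an illustration of those generic mechanisms (the function names, shapes, and the shared Q/K/V projection below are our assumptions, not the paper's code), a minimal NumPy sketch:

```python
import numpy as np

def dilated_conv2d(x, kernel, dilation):
    """Single-channel 2D convolution with a dilation factor;
    zero padding keeps the spatial size unchanged."""
    k = kernel.shape[0]
    eff = dilation * (k - 1) + 1           # effective receptive field
    pad = eff // 2
    xp = np.pad(x, pad, mode="constant")
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            patch = xp[i:i + eff:dilation, j:j + eff:dilation]
            out[i, j] = np.sum(patch * kernel)
    return out

def multi_scale_dilated_block(x, kernel, dilations=(1, 2, 3)):
    """Aggregate responses at several dilation rates and add a
    residual connection -- loosely mirroring the UD2B idea of
    combining receptive fields of different scales."""
    feats = [dilated_conv2d(x, kernel, d) for d in dilations]
    return x + sum(feats) / len(feats)     # residual aggregation

def channel_self_attention(feats):
    """Self-attention over a (C, H*W) feature matrix, loosely
    mirroring the ETB idea: the C x C attention map couples every
    channel with every other, so the cost grows with the channel
    count rather than with the number of pixels."""
    q = k = v = feats                       # shared projection for brevity
    scores = q @ k.T / np.sqrt(q.shape[1])  # (C, C) channel affinities
    scores -= scores.max(axis=1, keepdims=True)
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True) # row-wise softmax
    return attn @ v                         # re-weighted channels
```

Because each softmax row is a convex combination, the attention output stays within the range of the input features; the dilated block, by contrast, widens the receptive field without adding parameters, which is why such blocks are a common low-cost way to mix local high- and low-frequency content.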

History
  • Received: 2025-11-26
  • Revised: 2026-03-04
  • Accepted: 2026-03-13
  • Published online: 2026-03-20
  • Publication date: