
Improving Super Resolution Methods Via Incremental Residual Learning



Abstract:

Recently, Convolutional Neural Networks (CNNs) have shown promising performance in super-resolution (SR). However, these methods operate primarily on Low Resolution (LR) inputs for memory efficiency, which, as we demonstrate, limits their ability to (i) model high-frequency information and (ii) translate smoothly from LR to High Resolution (HR) space. To address these issues, we propose a novel Incremental Residual Learning (IRL) framework. In IRL, we first select a typical pre-trained SR network as a master branch. We then sequentially train and add residual branches to the master branch, where each residual branch is trained to model the accumulated residual of all previous branches. We plug state-of-the-art methods into the IRL framework and demonstrate consistent performance improvement on public benchmark datasets, setting a new state of the art for SR at only a ≈20% increase in training time.
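
The abstract only sketches the architecture, so the following is a minimal, hypothetical PyTorch sketch of the incremental residual idea: a frozen pre-trained master SR network whose output is refined by residual branches that are added and trained one at a time. The branch design, freezing policy, and names (ResidualBranch, IRLModel, add_branch) are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of Incremental Residual Learning (IRL) as described in the
# abstract. Only the high-level structure (a pre-trained master SR network plus
# sequentially trained residual branches) comes from the text; everything else
# (branch layers, freezing policy) is an assumption for illustration.
import torch
import torch.nn as nn


class ResidualBranch(nn.Module):
    """Hypothetical lightweight branch that predicts a residual correction in HR space."""

    def __init__(self, channels: int = 3, features: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, features, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)


class IRLModel(nn.Module):
    """Master SR network plus residual branches added and trained one at a time."""

    def __init__(self, master: nn.Module):
        super().__init__()
        self.master = master                # pre-trained SR network, kept frozen
        self.branches = nn.ModuleList()
        for p in self.master.parameters():
            p.requires_grad = False

    def add_branch(self, branch: nn.Module) -> None:
        # Freeze previously trained branches so only the newest one is optimized.
        for b in self.branches:
            for p in b.parameters():
                p.requires_grad = False
        self.branches.append(branch)

    def forward(self, lr: torch.Tensor) -> torch.Tensor:
        out = self.master(lr)               # master prediction in HR space
        for branch in self.branches:
            # Each branch refines the prediction accumulated so far.
            out = out + branch(out)
        return out
```

Under this sketch, training the k-th branch optimizes only that branch's parameters against the remaining HR error, which mirrors the abstract's description of branches being trained sequentially on the accumulated residuals of all previous branches.
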
Date of Conference: 22-25 September 2019
Date Added to IEEE Xplore: 26 August 2019
Conference Location: Taipei, Taiwan
