
Image Super-Resolution via Attention Based Back Projection Networks



Abstract:

Deep learning based image Super-Resolution (SR) has developed rapidly due to its capacity to digest big data. Generally, deeper and wider networks can extract richer feature maps and generate SR images with remarkable quality. However, the more complex the network, the more computation time is required, which matters for practical applications. It is important to have a simplified network for efficient image SR. In this paper, we propose an Attention based Back Projection Network (ABPN) for image super-resolution. Similar to some recent works, we believe that the back projection mechanism can be further developed for SR. Enhanced back projection blocks are suggested to iteratively update low- and high-resolution feature residues. Inspired by recent studies on attention models, we propose a Spatial Attention Block (SAB) to learn the cross-correlation across features at different layers. Based on the assumption that a good SR image should be close to the original LR image after down-sampling, we propose a Refined Back Projection Block (RBPB) for final reconstruction. Extensive experiments on public datasets and the AIM2019 Image Super-Resolution Challenge datasets show that the proposed ABPN can provide state-of-the-art or even better performance in both quantitative and qualitative measurements.
Date of Conference: 27-28 October 2019
Date Added to IEEE Xplore: 05 March 2020
Conference Location: Seoul, Korea (South)

1. Introduction

As a fundamental low-level vision problem, image super-resolution (SR) has attracted much attention in the past few years. The objective of image SR is to up-sample a low-resolution (LR) image to the dimension of the corresponding high-resolution (HR) image with pleasing visual quality. For α× image SR, we need to approximate α × α times the number of pixels during up-sampling. Thanks to architectural innovations and advances in computation, it is possible to utilize larger datasets and more complex models for image SR. Various deep learning based approaches with different network architectures have achieved image SR with good quality. Most SR works are based on the residual mapping modified from ResNet [12]. In order to deliver good super-resolution quality, we need to build a very deep network whose receptive field covers as much of the image as possible, so that it can learn different levels of feature abstraction. The advent of 4K/8K UHD (Ultra High Definition) displays demands more accurate image SR with less computation at different up-sampling factors. It is therefore essential to have a deep neural network that can capture long-term dependencies and efficiently learn the reconstruction mapping for SR.

Attention, or non-local modeling, is one way to globally capture the feature response across the whole image. Many related works [31], [7], [26], [27], [15], [5] have applied it successfully in computer vision. There are several advantages of using attention operations: 1) they directly compute the correlation between patterns across the image regardless of their distance; 2) they can reduce the number of kernels and the depth of the network while achieving comparable or even better performance; and 3) they are easy to embed into any network structure. As shown in Figure 1, we tested state-of-the-art SR approaches on 16× enlargement by applying 4× SR twice using pre-trained models.
ESRGAN [28] and RCAN [31] tend to generate fake edges which do not exist in the HR images, while the proposed ABPN can still predict the correct patterns.
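Advantage (1) above can be sketched in a few lines of NumPy: every output position aggregates features from all spatial positions with softmax weights, so correlations are captured regardless of distance. This is a generic non-local self-attention sketch for illustration, not the exact Spatial Attention Block proposed in the paper.

```python
import numpy as np

def spatial_attention(x, eps=1e-8):
    """Non-local response over spatial positions of a (C, H, W) feature map.

    Each output position is a softmax-weighted sum over ALL positions,
    so the correlation between two patterns is computed directly,
    independent of how far apart they are in the image.
    """
    c, h, w = x.shape
    f = x.reshape(c, h * w)                  # (C, N): one column per position
    sim = f.T @ f                            # (N, N) pairwise similarity
    sim -= sim.max(axis=1, keepdims=True)    # numerical stability for softmax
    attn = np.exp(sim)
    attn /= attn.sum(axis=1, keepdims=True) + eps
    out = f @ attn.T                         # aggregate features globally
    return out.reshape(c, h, w)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
y = spatial_attention(x)
print(y.shape)  # (4, 8, 8)
```

Note the quadratic (N, N) attention matrix: this global reach is what lets a shallower, thinner network match the receptive field of a much deeper convolutional one.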

Figure 1. SR results on image hinagikiukenzan with an SR factor of 16 (4× SR applied twice).


References

References are not available for this document.