
Road Extraction by Deep Residual U-Net



Abstract:

Road extraction from aerial images has been a hot research topic in the field of remote sensing image analysis. In this letter, a semantic segmentation neural network, which combines the strengths of residual learning and U-Net, is proposed for road area extraction. The network is built with residual units and has an architecture similar to that of U-Net. The benefits of this model are twofold: first, residual units ease the training of deep networks; second, the rich skip connections within the network facilitate information propagation, allowing us to design networks with fewer parameters but better performance. We test our network on a public road data set and compare it with U-Net and two other state-of-the-art deep-learning-based road extraction methods. The proposed approach outperforms all the compared methods, demonstrating its superiority over recently developed state-of-the-art approaches.
Published in: IEEE Geoscience and Remote Sensing Letters (Volume: 15, Issue: 5, May 2018)
Page(s): 749 - 753
Date of Publication: 08 March 2018



I. Introduction

Road extraction is one of the fundamental tasks in the field of remote sensing. It has a wide range of applications such as automatic road navigation, unmanned vehicles, urban planning, and geographic information updating. Although it has received considerable attention in the past decade, road extraction from high-resolution remote sensing images is still a challenging task because of the noise, occlusions, and complexity of the background in raw remote sensing imagery.
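
The network described in the abstract is built by replacing the plain convolutional blocks of U-Net with residual units. The letter itself contains no code, so the following PyTorch sketch is only an illustration, assuming the pre-activation (BN-ReLU-conv) layout commonly used for residual units; the class name ResidualUnit and all hyperparameters are hypothetical.

import torch
import torch.nn as nn

class ResidualUnit(nn.Module):
    # Two BN-ReLU-conv layers plus an identity shortcut; when the channel
    # count or the stride changes, a 1x1 convolution projects the input so
    # it can be added to the residual branch.
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(in_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=stride, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, stride=1, padding=1),
        )
        self.shortcut = (
            nn.Identity()
            if in_ch == out_ch and stride == 1
            else nn.Conv2d(in_ch, out_ch, kernel_size=1, stride=stride)
        )

    def forward(self, x):
        return self.body(x) + self.shortcut(x)

# Example: one downsampling unit applied to a 224x224 RGB tile.
unit = ResidualUnit(3, 64, stride=2)
print(unit(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 64, 112, 112])

In a U-Net-style arrangement, encoder units of this kind reduce the spatial resolution, while the decoder upsamples and merges the matching encoder feature maps through the skip connections that the abstract credits for the improved information propagation.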

