MSRIP-Net: Addressing Interpretability and Accuracy Challenges in Aircraft Fine-Grained Recognition of Remote Sensing Images

IEEE Journals & Magazine | IEEE Xplore


Abstract:

The task of fine-grained aircraft recognition is crucial in the field of remote sensing. Although traditional deep learning methods have made some progress on this task, they are often perceived as a “black box,” lacking transparent explanations for model decisions. Current interpretable methods based on attention mechanisms provide some interpretability but do not align with the logic of human reasoning. Therefore, we propose a multiscale rotation-invariant prototype network (MSRIP-Net). Our approach simulates the intuitive reasoning process by which humans identify an object by decomposing it into multiple components. Importantly, MSRIP-Net can automatically recognize the rigid components of aircraft targets using only image-level class labels, without relying on additional part annotations. In addition, our approach effectively addresses the noise, deformations, and multiscale variations present in remote sensing targets, and it has been comprehensively evaluated on the FAIR1M1.0 and Rareplane datasets. The results demonstrate that MSRIP-Net achieves higher accuracy than existing fine-grained recognition methods. Furthermore, we provide insights into the model’s decision-making process to illustrate the interpretability of our approach.
Article Sequence Number: 5640217
Date of Publication: 11 September 2024


I. Introduction

With the development of remote sensing technology, fine-grained aircraft recognition has become a crucial task in the interpretation of remote sensing images, playing a vital role in military and civilian fields such as mission planning, security, surveillance, and military decision making [1], [2], [3], [4], [5]. Therefore, in practical application scenarios, achieving high-precision classification results alone is no longer sufficient; highly interpretable methods are imperative.
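The part-prototype reasoning the abstract describes (“this component looks like that learned prototype”) follows the general pattern of prototype networks such as ProtoPNet [25]: patch features are compared against learned prototype vectors, the best match per prototype is pooled over spatial locations, and a linear layer turns part evidence into class logits. The sketch below illustrates only that generic scoring step, not the MSRIP-Net architecture itself; all shapes, names, and random values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a 7x7 grid of 64-D backbone features,
# 10 part prototypes, 4 aircraft classes.
H, W, D, P, C = 7, 7, 64, 10, 4
features = rng.standard_normal((H, W, D))    # conv feature map for one image
prototypes = rng.standard_normal((P, D))     # learned part prototypes
class_weights = rng.standard_normal((C, P))  # prototype-to-class evidence

# Squared L2 distance from every spatial patch to every prototype.
flat = features.reshape(-1, D)                                   # (H*W, D)
d2 = ((flat[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # (H*W, P)

# ProtoPNet-style similarity: large when a patch lies close to a prototype.
eps = 1e-4
sim = np.log((d2 + 1.0) / (d2 + eps))

# Max-pool over locations: "how strongly does this part appear anywhere?"
part_scores = sim.max(axis=0)  # (P,)

# Class logits are a linear combination of part evidence, so each
# prediction can be traced back to the prototypes that supported it.
logits = class_weights @ part_scores  # (C,)
pred = int(np.argmax(logits))
```

Because the classifier is linear in the per-prototype scores, each logit decomposes into named part contributions, which is what makes this family of models interpretable by construction.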

References

[1] Q. Liu, X. Xiang, Y. Wang, Z. Luo and F. Fang, "Aircraft detection in remote sensing image based on corner clustering and deep learning," Eng. Appl. Artif. Intell., vol. 87, Jan. 2020.
[2] T. Shi et al., "Complex optical remote-sensing aircraft detection dataset and benchmark," IEEE Trans. Geosci. Remote Sens., vol. 61, 2023.
[3] A. Zhao et al., "Aircraft recognition based on landmark detection in remote sensing images," IEEE Geosci. Remote Sens. Lett., vol. 14, no. 8, pp. 1413-1417, Aug. 2017.
[4] G. Liu, X. Sun, K. Fu and H. Wang, "Aircraft recognition in high-resolution satellite images using Coarse-to-Fine shape prior," IEEE Geosci. Remote Sens. Lett., vol. 10, no. 3, pp. 573-577, May 2013.
[5] F. Zhang, B. Du, L. Zhang and M. Xu, "Weakly supervised learning based on coupled convolutional neural networks for aircraft detection," IEEE Trans. Geosci. Remote Sens., vol. 54, no. 9, pp. 5553-5563, Sep. 2016.
[6] W. Diao, X. Sun, F. Dou, M. Yan, H. Wang and K. Fu, "Object recognition in remote sensing images using sparse deep belief networks," Remote Sens. Lett., vol. 6, no. 10, pp. 745-754, 2015.
[7] X. Li, B. Jiang, S. Wang, L. Shen and Y. Fu, "A human–computer fusion framework for aircraft recognition in remote sensing images," IEEE Geosci. Remote Sens. Lett., vol. 17, no. 2, pp. 297-301, Feb. 2020.
[8] H. Wang, Y. Gong, Y. Wang, L. Wang and C. Pan, "DeepPlane: A unified deep model for aircraft detection and recognition in remote sensing images," J. Appl. Remote Sens., vol. 11, no. 4, 2017.
[9] Y. Nie, C. Bian and L. Li, "Adap-EMD: Adaptive EMD for aircraft fine-grained classification in remote sensing," IEEE Geosci. Remote Sens. Lett., vol. 19, 2022.
[10] A. Krizhevsky, I. Sutskever and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," Commun. ACM, vol. 60, no. 6, pp. 84-90, May 2017.
[11] K. He, G. Gkioxari, P. Dollár and R. Girshick, "Mask R-CNN," Proc. IEEE Int. Conf. Comput. Vis. (ICCV), pp. 2980-2988, Oct. 2017.
[12] R. Girshick, "Fast R-CNN," Proc. IEEE Int. Conf. Comput. Vis. (ICCV), pp. 1440-1448, Dec. 2015.
[13] P. Wang and N. Vasconcelos, "A generalized explanation framework for visualization of deep learning model predictions," IEEE Trans. Pattern Anal. Mach. Intell., vol. 45, no. 8, pp. 9265-9283, Aug. 2023.
[14] D. Chang et al., "Making a bird AI expert work for you and me," IEEE Trans. Pattern Anal. Mach. Intell., vol. 45, no. 10, pp. 12068-12084, Oct. 2023.
[15] B. Zhou, A. Khosla, A. Lapedriza, A. Oliva and A. Torralba, "Learning deep features for discriminative localization," Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 2921-2929, Jun. 2016.
[16] R. R. Selvaraju, M. Cogswell, A. Das, R. Vedantam, D. Parikh and D. Batra, "Grad-CAM: Visual explanations from deep networks via gradient-based localization," Proc. IEEE Int. Conf. Comput. Vis. (ICCV), pp. 618-626, Oct. 2017.
[17] Z. Huang and Y. Li, "Interpretable and accurate fine-grained recognition via region grouping," Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 8662-8672, Jun. 2020.
[18] J. Wagner, J. M. Köhler, T. Gindele, L. Hetzel, J. T. Wiedemer and S. Behnke, "Interpretable and fine-grained visual explanations for convolutional neural networks," Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 9097-9107, Jun. 2019.
[19] P. Shaw, J. Uszkoreit and A. Vaswani, "Self-attention with relative position representations," arXiv:1803.02155, 2018.
[20] W. Xiong, Z. Xiong and Y. Cui, "An explainable attention network for fine-grained ship classification using remote-sensing images," IEEE Trans. Geosci. Remote Sens., vol. 60, 2022.
[21] Y. Yi, Y. You, C. Li and W. Zhou, "EFM-net: An essential feature mining network for target fine-grained classification in optical remote sensing images," IEEE Trans. Geosci. Remote Sens., vol. 61, 2023.
[22] Y. Han, X. Yang, T. Pu and Z. Peng, "Fine-grained recognition for oriented ship against complex scenes in optical remote sensing images," IEEE Trans. Geosci. Remote Sens., vol. 60, 2022.
[23] Y. Fu, Z. Liu and Z. Zhang, "Progressive learning vision transformer for open set recognition of fine-grained objects in remote sensing images," IEEE Trans. Geosci. Remote Sens., vol. 61, 2023.
[24] C. Pan, R. Li, Q. Hu, C. Niu, W. Liu and W. Lu, "Contrastive learning network based on causal attention for fine-grained ship classification in remote sensing scenarios," Remote Sens., vol. 15, no. 13, p. 3393, Jul. 2023.
[25] C. Chen, O. Li, D. Tao, A. Barnett, C. Rudin and J. K. Su, "This looks like that: Deep learning for interpretable image recognition," Proc. Adv. Neural Inf. Process. Syst., vol. 32, pp. 1-12, 2019.
[26] H. Bilen and A. Vedaldi, "Weakly supervised deep detection networks," Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 2846-2854, Jun. 2016.
[27] K. Yang, D. Li and Y. Dou, "Towards precise end-to-end weakly supervised object detection network," Proc. IEEE/CVF Int. Conf. Comput. Vis. (ICCV), pp. 8372-8381, Oct. 2019.
[28] P. Tang, X. Wang, X. Bai and W. Liu, "Multiple instance detection network with online instance classifier refinement," Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 2843-2851, Jul. 2017.
[29] D. Li, J.-B. Huang, Y. Li, S. Wang and M.-H. Yang, "Progressive representation adaptation for weakly supervised object localization," IEEE Trans. Pattern Anal. Mach. Intell., vol. 42, no. 6, pp. 1424-1438, Jun. 2020.
[30] S. Ren, K. He, R. Girshick and J. Sun, "Faster R-CNN: Towards real-time object detection with region proposal networks," IEEE Trans. Pattern Anal. Mach. Intell., vol. 39, no. 6, pp. 1137-1149, Jun. 2017.