
Social Neighborhood Graph and Multigraph Fusion Ranking for Multifeature Image Retrieval



Abstract:

A single feature can hardly describe the content of an image from an overall perspective, which limits the retrieval performance of single-feature-based methods in image retrieval tasks. To describe the properties of images more fully and improve retrieval performance, multifeature fusion ranking-based methods have been proposed. However, the effectiveness of multifeature fusion in image retrieval has not been theoretically explained. This article gives a theoretical proof to illustrate the role of independent features in improving the retrieval results. The proof shows that the original ranking list generated with a single feature greatly influences the performance of multifeature fusion ranking. Inspired by the principle of three degrees of influence in social networks, this article proposes a reranking method named k-nearest neighbors’ neighbors’ neighbors’ graph (N3G) to improve the original ranking list produced by a single feature. Furthermore, a multigraph fusion ranking (MFR) method for multifeature ranking, motivated by group relation theory in social networks, is also proposed; it considers the correlations of all images across multiple neighborhood graphs. Evaluation experiments conducted on several representative data sets (e.g., UK-bench, Holiday, Corel-10K, and Cifar-10) validate that N3G and MFR outperform other state-of-the-art methods.
Page(s): 1389 - 1399
Date of Publication: 17 April 2020


PubMed ID: 32310795


I. Introduction

Image ranking has achieved a number of significant results in image retrieval tasks, and ranking methods have attracted increasing attention in image retrieval. In most cases, the L1-norm is used to measure the similarity between statistical histogram-based image features in the ranking stage [1], [27], [39]. The results of this direct similarity-metric ranking can be regarded as the k-nearest neighbors (KNN) of a query [known in reranking methods as a candidate KNN set (CKNNS)]. However, the KNN of a query are independent of each other; that is, there are no connections between the images in the retrieval results. In general, we assume that the KNN of a query (including the query itself) are similar images and should therefore be related to one another. Exploiting this relationship helps eliminate outliers in the CKNNS, which in turn enhances retrieval performance. Image reranking methods are developed based on the CKNNS of the query.
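To make the ranking step concrete, the following is a minimal sketch (Python/NumPy, not taken from the paper) of L1-norm ranking over histogram features to produce a candidate KNN set. The feature dimensionality, the value of k, and the random data are illustrative assumptions.

```python
import numpy as np

def l1_rank(query_hist, database_hists, k=10):
    """Rank database images by L1 (Manhattan) distance to the query histogram
    and return the indices of the k nearest neighbors (the candidate KNN set)."""
    # L1 distance between the query histogram and every database histogram
    distances = np.abs(database_hists - query_hist).sum(axis=1)
    # Direct similarity-metric ranking: smaller distance means more similar
    ranking = np.argsort(distances)
    # The top-k results form the CKNNS that reranking methods start from
    return ranking[:k]

# Toy usage with hypothetical 64-bin histogram features
rng = np.random.default_rng(0)
database = rng.random((100, 64))
query = rng.random(64)
cknns = l1_rank(query, database, k=10)
print(cknns)
```

Reranking methods such as N3G then refine this initial list by examining the relationships among the images in the CKNNS rather than treating them as independent results.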

