
Social Neighborhood Graph and Multigraph Fusion Ranking for Multifeature Image Retrieval



Abstract:

A single feature can hardly describe the content of images from an overall perspective, which limits the retrieval performance of single-feature-based methods in image retrieval tasks. To describe the properties of images more fully and improve retrieval performance, multifeature fusion ranking-based methods have been proposed. However, the effectiveness of multifeature fusion in image retrieval has not been theoretically explained. This article gives a theoretical proof that illustrates the role of independent features in improving retrieval results. Based on this proof, the original ranking list generated with a single feature greatly influences the performance of multifeature fusion ranking. Inspired by the principle of three degrees of influence in social networks, this article proposes a reranking method named the k-nearest neighbors’ neighbors’ neighbors’ graph (N3G) to improve the original ranking list produced by a single feature. Furthermore, a multigraph fusion ranking (MFR) method for multifeature ranking, motivated by group relation theory in social networks, is also proposed; it considers the correlations of all images in multiple neighborhood graphs. Evaluation experiments conducted on several representative data sets (e.g., UK-bench, Holiday, Corel-10K, and Cifar-10) validate that N3G and MFR outperform other state-of-the-art methods.
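The N3G idea in the abstract rests on three-degrees-of-influence neighborhood expansion. The paper's exact construction is not reproduced here; the following is only a minimal sketch of the underlying notion, namely building a kNN graph over image features and collecting every node reachable within three hops of a query. The function names (`knn_graph`, `three_degree_neighbors`) and the toy 1-D features are hypothetical, not the authors' code.

```python
import numpy as np

def knn_graph(feats, k):
    """Directed kNN graph: each node points to its k nearest
    neighbors under L1 distance, excluding itself."""
    d = np.abs(feats[:, None, :] - feats[None, :, :]).sum(-1)
    np.fill_diagonal(d, np.inf)          # a node is not its own neighbor
    return np.argsort(d, axis=1, kind="stable")[:, :k]

def three_degree_neighbors(graph, query):
    """Collect nodes reachable within three hops of the query:
    neighbors, neighbors' neighbors, and their neighbors in turn."""
    frontier, reached = {query}, {query}
    for _ in range(3):
        frontier = {n for u in frontier for n in graph[u]} - reached
        reached |= frontier
    return reached - {query}

# Toy chain of 1-D "features"; with k=1 each point links to its
# nearest neighbor, so three hops walk three steps along the chain.
feats = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [10.0]])
g = knn_graph(feats, k=1)
print(sorted(three_degree_neighbors(g, query=3)))  # -> [0, 1, 2]
```

In a reranking setting, such a three-hop set could replace the query's plain candidate KNN set, so that images connected only indirectly to the query still contribute to its ranking.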
Page(s): 1389 - 1399
Date of Publication: 17 April 2020

PubMed ID: 32310795


I. Introduction

Image ranking has achieved a number of significant results in image retrieval tasks, and ranking methods have attracted increasing attention in image retrieval. In most cases, the L1-norm is used to measure the similarity between statistical histogram-based image features in the ranking stage [1], [27], [39]. The results ranked by this direct similarity metric can be regarded as the k-nearest neighbors (KNN) of a query [known in reranking methods as a candidate KNN set (CKNNS)]. However, the KNN of a query are independent of each other; that is, there is no connection between the images in the retrieval results. In general, we assume that the KNN of a query (including the query itself) are similar images and should therefore be related in image retrieval. This relationship helps eliminate outliers from the CKNNS, which in turn enhances retrieval performance. Image reranking methods are developed based on the CKNNS of the query.
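The direct-similarity baseline described above can be sketched as follows: rank gallery images by L1 distance between histogram features and take the top-k as the CKNNS. The feature vectors and function name here are illustrative assumptions, not the paper's code.

```python
import numpy as np

def l1_rank(query_hist, gallery_hists, k=5):
    """Rank gallery images by L1 (Manhattan) distance to the query
    histogram; the top-k results form a candidate KNN set (CKNNS)."""
    dists = np.abs(gallery_hists - query_hist).sum(axis=1)
    order = np.argsort(dists, kind="stable")
    return order[:k], dists[order[:k]]

# Toy example: 4-bin color histograms for a gallery of 6 images.
rng = np.random.default_rng(0)
gallery = rng.random((6, 4))
gallery /= gallery.sum(axis=1, keepdims=True)   # normalize histograms
query = gallery[2] + rng.normal(0, 0.01, 4)     # near-duplicate of image 2

cknns, dists = l1_rank(query, gallery, k=3)
print(cknns)  # image 2 should rank first
```

Note that the images in `cknns` are scored against the query only, never against one another; this mutual independence is exactly the limitation that the reranking methods discussed next aim to remove.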

References
[1] D. Androutsos, K. N. Plataniotis and A. N. Venetsanopoulos, "Distance measures for color image retrieval", Proc. Int. Conf. Image Process. (ICIP), vol. 2, pp. 770-774, 1998.
[2] S. Bai and X. Bai, "Sparse contextual activation for efficient visual re-ranking", IEEE Trans. Image Process., vol. 25, no. 3, pp. 1056-1069, Mar. 2016.
[3] S. Bai, S. Sun, X. Bai, Z. Zhang and Q. Tian, "Smooth neighborhood structure mining on multiple affinity graphs with applications to context-sensitive similarity", Proc. Eur. Conf. Comput. Vis., pp. 592-608, 2016.
[4] S. Bai, P. Tang, P. H. S. Torr and L. J. Latecki, "Re-ranking via metric fusion for object retrieval and person re-identification", Proc. IEEE/CVF Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 740-749, Jun. 2019.
[5] S. Bai, Z. Zhou, J. Wang, X. Bai, L. J. Latecki and Q. Tian, "Ensemble diffusion for retrieval", Proc. IEEE Int. Conf. Comput. Vis. (ICCV), pp. 774-783, Oct. 2017.
[6] X. Bai, B. Wang, C. Yao, W. Liu and Z. Tu, "Co-transduction for shape retrieval", IEEE Trans. Image Process., vol. 21, no. 5, pp. 2747-2757, May 2012.
[7] J. Chen, T. Nakashika, T. Takiguchi and Y. Ariki, "Content-based image retrieval using rotation-invariant histograms of oriented gradients", Proc. 5th ACM Int. Conf. Multimedia Retr. (ICMR), pp. 443-446, 2015.
[8] S. K. Walker, "Connected: The surprising power of our social networks and how they shape our lives", J. Family Theory Rev., vol. 3, no. 3, pp. 220-224, Sep. 2011.
[9] C. Deng, R. Ji, W. Liu, D. Tao and X. Gao, "Visual reranking through weakly supervised multi-graph learning", Proc. IEEE Int. Conf. Comput. Vis., pp. 2600-2607, Dec. 2013.
[10] I. Dimitrovski, D. Kocev, S. Loskovska and S. Džeroski, "Improving bag-of-visual-words image retrieval with predictive clustering trees", Inf. Sci., vol. 329, pp. 851-865, Feb. 2016.
[11] J.-M. Guo, H. Prasetyo and H.-S. Su, "Image indexing using the color and bit pattern feature fusion", J. Vis. Commun. Image Represent., vol. 24, no. 8, pp. 1360-1379, Nov. 2013.
[12] J. He, M. Li, H.-J. Zhang, H. Tong and C. Zhang, "Manifold-ranking based image retrieval", Proc. 12th Annu. ACM Int. Conf. Multimedia, pp. 9-16, 2004.
[13] J. He, M. Li, H.-J. Zhang, H. Tong and C. Zhang, "Generalized manifold-ranking-based image retrieval", IEEE Trans. Image Process., vol. 15, no. 10, pp. 3170-3177, Oct. 2006.
[14] Y. Huang, Q. Liu, S. Zhang and D. N. Metaxas, "Image retrieval via probabilistic hypergraph ranking", Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit., pp. 3376-3383, Jun. 2010.
[15] J. Jiang et al., "Understanding latent interactions in online social networks", ACM Trans. Web, vol. 7, no. 4, pp. 369-382, Oct. 2013.
[16] J. Krapac, M. Allan, J. Verbeek and F. Jurie, "Improving Web image search results using query-relative classifiers", Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit., pp. 1094-1101, Jun. 2010.
[17] A. Krizhevsky, I. Sutskever and G. E. Hinton, "ImageNet classification with deep convolutional neural networks", Commun. ACM, vol. 60, no. 6, pp. 84-90, May 2017.
[18] G. Gilbert, "Distance between sets", Nature, vol. 239, no. 5368, pp. 174, Sep. 1972.
[19] D. Li and Y. Du, Artificial Intelligence With Uncertainty, Boca Raton, FL, USA: CRC Press, 2007.
[20] C.-H. Lin, R.-T. Chen and Y.-K. Chan, "A smart content-based image retrieval system based on color and texture feature", Image Vis. Comput., vol. 27, no. 6, pp. 658-665, May 2009.
[21] H. Ling, X. Yang and L. J. Latecki, "Balancing deformability and discriminability for shape matching", Eur. Conf. Comput. Vis., pp. 411-424, 2010.
[22] G.-H. Liu, J.-Y. Yang and Z. Li, "Content-based image retrieval using computational visual attention model", Pattern Recognit., vol. 48, no. 8, pp. 2554-2566, Aug. 2015.
[23] H. Liu, J. Feng, M. Qi, J. Jiang and S. Yan, "End-to-end comparative attention networks for person re-identification", IEEE Trans. Image Process., vol. 26, no. 7, pp. 3492-3506, Jul. 2017.
[24] S. Liu, L. Feng, Y. Liu, J. Wu, M. X. Sun and W. Wang, "Robust discriminative extreme learning machine for relevance feedback in image retrieval", Multidimensional Syst. Signal Process., vol. 28, pp. 1071-1089, Jul. 2016.
[25] S. Liu et al., "Perceptual uniform descriptor and ranking on manifold for image retrieval", Inf. Sci., vol. 424, pp. 235-249, Jan. 2018.
[26] Z. Liu, S. Wang, L. Zheng and Q. Tian, "Robust ImageGraph: Rank-level feature fusion for image search", IEEE Trans. Image Process., vol. 26, no. 7, pp. 3128-3141, Jul. 2017.
[27] W. Lu, A. L. Varna, A. Swaminathan and M. Wu, "Secure image retrieval through feature protection", Proc. IEEE Int. Conf. Acoust. Speech Signal Process., pp. 1533-1536, Apr. 2009.
[28] X. Lu, J. Wang, Y. Hou, M. Yang, Q. Wang and X. Zhang, "Hierarchical image retrieval by multi-feature fusion".
[29] L. Luo, C. Shen, C. Zhang and A. van den Hengel, "Shape similarity analysis by self-tuning locally constrained mixed-diffusion", IEEE Trans. Multimedia, vol. 15, no. 5, pp. 1174-1183, Aug. 2013.
[30] G. Park, Y. Baek and H.-K. Lee, "Re-ranking algorithm using post-retrieval clustering for content-based image retrieval", Inf. Process. Manage., vol. 41, no. 2, pp. 177-194, Mar. 2005.