
Hierarchical Similarity Transformations Between Gaussian Mixtures



Abstract:

In this paper, we propose a method to estimate the density of a data space represented by a geometric transformation of an initial Gaussian mixture model. The geometric transformation is hierarchical and is decomposed into two steps. First, the initial model is assumed to undergo a global similarity transformation, modeled by translation, rotation, and scaling of the model components. Then, to increase the degrees of freedom of the model and allow it to capture fine data structures, each individual mixture component may be transformed by another, local similarity transformation whose parameters are distinct for each component of the mixture. In addition, to constrain the order of magnitude of the local transformation (LT) with respect to the global transformation (GT), zero-mean Gaussian priors are imposed on the local parameters. The estimation of both GT and LT parameters is obtained through the expectation-maximization (EM) framework. Experiments on artificial data are conducted to evaluate the proposed model with varying data dimensionality, number of model components, and transformation parameters. In addition, the method is evaluated using real data from a speech recognition task. The obtained results show high model accuracy and demonstrate the potential application of the proposed method to similar classification problems.
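To make the global step concrete: under a similarity transformation x' = sRx + t, each Gaussian component's mean maps to sRμ + t and its covariance to s²RΣRᵀ, while the mixing weights are unchanged. The following is a minimal NumPy sketch of this mapping in the 2-D case; it is an illustration of the transformation model described in the abstract, not the authors' implementation, and the function name is hypothetical.

import numpy as np

def similarity_transform_gmm(means, covs, angle, scale, translation):
    """Apply a global 2-D similarity transform (rotation, scaling,
    translation) to the components of a Gaussian mixture.

    Under x' = s R x + t, each mean maps to s*R@mu + t and each
    covariance to s^2 * R @ Sigma @ R.T; mixing weights are unchanged.
    Illustrative sketch only, not the paper's implementation.
    """
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])          # 2-D rotation matrix
    new_means = scale * means @ R.T + translation
    new_covs = scale**2 * R @ covs @ R.T     # matmul broadcasts over the K components
    return new_means, new_covs

# Example: a two-component mixture rotated by 30 degrees,
# scaled by 1.5, and shifted by (1, -2).
means = np.array([[0.0, 0.0], [3.0, 1.0]])
covs = np.stack([np.eye(2), np.diag([2.0, 0.5])])
m2, c2 = similarity_transform_gmm(means, covs, np.pi / 6, 1.5, np.array([1.0, -2.0]))

The local step described in the abstract would apply a transformation of the same form, but with per-component parameters regularized by zero-mean Gaussian priors.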
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume 24, Issue 11, November 2013)
Page(s): 1824-1835
Date of Publication: 28 June 2013
PubMed ID: 24808615

I. Introduction

Gaussian mixture models (GMMs) have been extensively studied, with applications in many domains such as density estimation [1], clustering [2], [3], classification [4], image registration [5], [6], and regression [7], [8]. There are two main issues in the application of mixture models. The first is the estimation of the model parameters, which is generally based on the maximum likelihood (ML) or maximum a posteriori (MAP) expectation-maximization (EM) algorithm [9]–[11] or its variational extensions [12], [13]. The second is the choice of the number of mixture components. There are cases where this number is known a priori (e.g., in some classification problems); in the majority of applications, however, it is unknown [14]–[16], [18].
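As background for the ML estimation mentioned above, the sketch below is a self-contained NumPy implementation of the classical EM iterations for a K-component GMM with full covariances. It is an illustrative assumption-laden example, not code from the paper or its references; the function name, initialization, and regularization constant are choices made here for clarity.

import numpy as np

def em_gmm(X, K, n_iter=100, seed=0):
    """Maximum-likelihood fit of a K-component Gaussian mixture via EM.
    Minimal sketch: full covariances, random initialization from the data,
    no convergence check, small ridge on covariances for stability."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    weights = np.full(K, 1.0 / K)
    means = X[rng.choice(n, K, replace=False)]
    covs = np.stack([np.cov(X.T) + 1e-6 * np.eye(d)] * K)

    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        resp = np.empty((n, K))
        for k in range(K):
            diff = X - means[k]
            inv = np.linalg.inv(covs[k])
            det = np.linalg.det(covs[k])
            norm = 1.0 / np.sqrt((2 * np.pi) ** d * det)
            resp[:, k] = weights[k] * norm * np.exp(
                -0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff))
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: re-estimate weights, means, and covariances
        Nk = resp.sum(axis=0)
        weights = Nk / n
        means = (resp.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - means[k]
            covs[k] = (resp[:, k, None] * diff).T @ diff / Nk[k] \
                      + 1e-6 * np.eye(d)
    return weights, means, covs

# Example usage on synthetic two-cluster data:
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(5, 1, (200, 2))])
w, mu, S = em_gmm(X, K=2)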

Cites in Papers - IEEE (3)

1. Yucheng Shu, Zhenlong Liao, Bin Xiao, Weisheng Li, Xinbo Gao, "Registration-Is-Evaluation: Robust Point Set Matching With Multigranular Prior Assessment", IEEE Transactions on Geoscience and Remote Sensing, vol. 60, pp. 1-14, 2022.
2. Xun Shen, Yahui Zhang, Kota Sata, Tielong Shen, "Gaussian Mixture Model Clustering-Based Knock Threshold Learning in Automotive Engines", IEEE/ASME Transactions on Mechatronics, vol. 25, no. 6, pp. 2981-2991, 2020.
3. Cormac O’Meadhra, Wennie Tabib, Nathan Michael, "Variable Resolution Occupancy Mapping Using Gaussian Mixture Models", IEEE Robotics and Automation Letters, vol. 4, no. 2, pp. 2015-2022, 2019.

Cites in Papers - Other Publishers (1)

1. Christos Bellos, George Rigas, Ioannis F. Spiridon, Athanasios Bibas, Dimitra Iliopoulou, Frank Bohnke, Dimitrios Koutsouris, Dimitrios I. Fotiadis, "Reconstruction of Cochlea Based on Micro-CT and Histological Images of the Human Inner Ear", BioMed Research International, vol. 2014, pp. 1, 2014.
