Abstract
At present, manifold learning is mainly applied to dimensionality reduction. From the viewpoint of dimensionality reduction, however, manifold learning algorithms preserve only local features of the data: Locally Linear Embedding preserves local linearity, Local Tangent Space Alignment preserves local homeomorphisms, and Laplacian Eigenmaps preserves local similarity. The dimensionality reduction community is therefore pursuing algorithms that preserve both local and global features of the data. In this paper, a new dimensionality reduction algorithm, called Hilbert-Schmidt Independence Criterion Regularized Manifold Learning (HSIC-ML for short), is proposed, in which the HSIC between the high-dimensional data and the dimension-reduced data is added as a regularization term to the objective functions of manifold learning. This HSIC regularization term enables HSIC-ML to preserve both local and global features during dimensionality reduction. HSIC is a criterion that measures the statistical dependence between two data sets and has been widely applied in machine learning in recent years. However, since HSIC was first proposed around 2005, it does not appear to have been applied directly to dimensionality reduction, nor used as a regularization term for it. The proposed HSIC-ML may be the first attempt in this respect. The experimental results presented in this paper show that manifold learning with HSIC regularization outperforms manifold learning without it.
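To make the regularization term concrete, the following is a minimal sketch of the standard biased empirical HSIC estimator, HSIC(X, Y) = tr(KHLH)/(n-1)^2, on which such a term would build. The function names, the RBF kernel choice for both data sets, and the combined objective at the end are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    where K and L are kernel matrices on X and Y and H centers them."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Illustrative regularized objective (manifold_loss and lam are hypothetical):
# minimize the usual manifold-learning loss on the embedding Y while
# maximizing the statistical dependence between X and Y, i.e.
#   total_loss = manifold_loss(Y) - lam * hsic(X, Y)
```

A larger HSIC value indicates stronger statistical dependence between the high-dimensional data X and its embedding Y, which is why subtracting it (or adding it to an objective that is maximized) encourages the embedding to retain global structure of the original data.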
