Learning a Kernel Matrix for Nonlinear Dimensionality Reduction
Kilian Weinberger - University of Pennsylvania
Fei Sha - University of Pennsylvania
Lawrence Saul - University of Pennsylvania
We investigate how to learn a kernel matrix for high dimensional data that lies on or near a low dimensional manifold. Noting that the kernel matrix implicitly maps the data into a nonlinear feature space, we show how to discover a mapping that "unfolds" the underlying manifold from which the data was sampled. The kernel matrix is constructed by maximizing the variance in feature space subject to local constraints that preserve the angles and distances between nearest neighbors. The main optimization involves an instance of semidefinite programming---a fundamentally different computation than previous algorithms for manifold learning, such as Isomap and locally linear embedding. The optimized kernels perform better than polynomial and Gaussian kernels for problems in manifold learning, but worse for problems in large margin classification. We explain these results in terms of the geometric properties of different kernels and comment on various interpretations of other manifold learning algorithms as kernel methods.
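The optimization sketched in the abstract can be written as a semidefinite program over the kernel matrix. The following is a sketch under assumed notation (the symbols $K$, $x_i$, and the neighbor set $\mathcal{N}$ are introduced here for illustration, not taken from the abstract): with the data centered in feature space, the variance is the trace of $K$, and distances between nearest neighbors are fixed to their input-space values.

```latex
\begin{align*}
\max_{K} \quad & \operatorname{tr}(K) && \text{(maximize variance in feature space)} \\
\text{s.t.} \quad & K \succeq 0 && \text{($K$ is a valid kernel matrix)} \\
& \textstyle\sum_{ij} K_{ij} = 0 && \text{(center the data in feature space)} \\
& K_{ii} - 2K_{ij} + K_{jj} = \|x_i - x_j\|^2 && \text{for neighboring pairs } (i,j) \in \mathcal{N}
\end{align*}
```

The last constraint equates feature-space and input-space distances for nearest neighbors; a low dimensional embedding can then be read off from the top eigenvectors of the optimized $K$, as in kernel PCA.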