Optimization equivalence of divergences improves neighbor embedding

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Details

Original language: English
Title of host publication: 31st International Conference on Machine Learning, ICML 2014
Publisher: International Machine Learning Society (IMLS)
Pages: 1808-1839
Number of pages: 32
Volume: 2
ISBN (Electronic): 9781634393973
Publication status: Published - 2014
Publication type: A4 Article in a conference publication
Event: 31st International Conference on Machine Learning, ICML 2014 - Beijing, China
Duration: 21 Jun 2014 - 26 Jun 2014

Conference

Conference: 31st International Conference on Machine Learning, ICML 2014
Country: China
City: Beijing
Period: 21/06/14 - 26/06/14

Abstract

Visualization methods that arrange data objects in 2D or 3D layouts have followed two main schools: methods oriented for graph layout and methods oriented for vectorial embedding. We show that the two previously separate approaches are tied by an optimization equivalence, making it possible to relate methods from the two approaches and to build new methods that take the best of both worlds. In detail, we prove a theorem of optimization equivalences between β- and γ-divergences, as well as between α- and Rényi-divergences, through a connection scalar. Through the equivalences we represent several nonlinear dimensionality reduction and graph drawing methods in a generalized stochastic neighbor embedding setting, where information divergences are minimized between similarities in input and output spaces, and the optimal connection scalar provides a natural choice for the tradeoff between attractive and repulsive forces. We give two examples of developing new visualization methods through the equivalences: 1) we develop weighted symmetric stochastic neighbor embedding (ws-SNE) from Elastic Embedding and analyze its benefits, notably good performance for both vectorial and network data; in experiments ws-SNE performs well across data sets of different types, whereas comparison methods fail for some of the data sets; 2) we develop a γ-divergence version of a PolyLog layout method; the new method is scale-invariant in the output space and makes it possible to efficiently use large-scale smoothed neighborhoods.
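The abstract does not reproduce the theorem, but the β–γ case of the equivalence can be illustrated with standard divergence definitions. The sketch below uses the common parametrizations (a Cichocki-style β-divergence and the Fujisawa–Eguchi γ-divergence) over nonnegative similarity vectors p and q, with ρ playing the role of the connection scalar; the paper's exact statement and parametrization may differ, so this is a hedged reconstruction rather than the authors' formulation:

\[
D_\beta(p \,\|\, q) = \sum_i \left[ \frac{p_i\left(p_i^{\beta} - q_i^{\beta}\right)}{\beta} - \frac{p_i^{\beta+1} - q_i^{\beta+1}}{\beta+1} \right],
\qquad
D_\gamma(p \,\|\, q) = \frac{1}{\gamma(\gamma+1)} \log \frac{\left(\sum_i p_i^{\gamma+1}\right)\left(\sum_i q_i^{\gamma+1}\right)^{\gamma}}{\left(\sum_i p_i q_i^{\gamma}\right)^{\gamma+1}}.
\]

Setting the derivative of \(D_\beta(p \,\|\, \rho q)\) with respect to \(\rho\) to zero gives the optimal connection scalar in closed form, and substituting it back yields (for \(\gamma = \beta\) with \(\beta(\beta+1) > 0\)):

\[
\rho^\ast = \frac{\sum_i p_i q_i^{\beta}}{\sum_i q_i^{\beta+1}},
\qquad
\min_{\rho > 0} D_\beta(p \,\|\, \rho q)
= \frac{\sum_i p_i^{\beta+1}}{\beta(\beta+1)} \left( 1 - e^{-\beta(\beta+1)\, D_\gamma(p \,\|\, q)} \right).
\]

For fixed p the right-hand side is strictly increasing in \(D_\gamma(p \,\|\, q)\), so the β-objective at the optimal scalar and the scale-invariant γ-divergence share minimizers; this is the sense in which the two optimizations are equivalent.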
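A minimal numerical check of the reconstructed identity above, assuming the same standard parametrizations; the function names (beta_div, gamma_div) and the random test vectors are ad hoc illustrations, not code from the paper:

import numpy as np

def beta_div(p, q, beta):
    # Beta-divergence in a Cichocki-style parametrization (beta != 0, -1).
    return np.sum(p * (p**beta - q**beta) / beta
                  - (p**(beta + 1) - q**(beta + 1)) / (beta + 1))

def gamma_div(p, q, gamma):
    # Fujisawa-Eguchi gamma-divergence; invariant to rescaling of q.
    return (np.log(np.sum(p**(gamma + 1)))
            + gamma * np.log(np.sum(q**(gamma + 1)))
            - (gamma + 1) * np.log(np.sum(p * q**gamma))) / (gamma * (gamma + 1))

rng = np.random.default_rng(0)
p, q = rng.random(100) + 0.1, rng.random(100) + 0.1  # positive similarities
b = 0.7  # any beta with beta * (beta + 1) > 0

# Closed-form optimal connection scalar from d/d(rho) D_beta(p || rho*q) = 0.
rho_star = np.sum(p * q**b) / np.sum(q**(b + 1))

# Brute-force the minimum over rho and compare against the closed form.
rhos = np.linspace(0.01, 5.0, 100001)
vals = np.array([beta_div(p, r * q, b) for r in rhos])
assert abs(rhos[vals.argmin()] - rho_star) < 1e-3

# The minimum is a monotone transform of the gamma-divergence (gamma = beta).
A = np.sum(p**(b + 1))
predicted = A * (1.0 - np.exp(-b * (b + 1) * gamma_div(p, q, b))) / (b * (b + 1))
assert abs(vals.min() - predicted) < 1e-6

print("rho* =", rho_star, " min beta-divergence =", vals.min())

Because the γ-divergence is invariant to rescaling of q, the check also illustrates why a γ-divergence layout objective is scale-invariant in the output space, as claimed for the PolyLog variant in the abstract.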