Differentiating through the Fréchet mean
Research output: Contribution to journal › Conference article › Research
Standard
Differentiating through the Fréchet mean. / Lou, Aaron; Katsman, Isay; Jiang, Qingxuan; Belongie, Serge; Lim, Ser Nam; Sa, Christopher De.
In: 37th International Conference on Machine Learning, ICML 2020, 2020, p. 6349-6359.
RIS
TY - GEN
T1 - Differentiating through the Fréchet mean
AU - Lou, Aaron
AU - Katsman, Isay
AU - Jiang, Qingxuan
AU - Belongie, Serge
AU - Lim, Ser Nam
AU - Sa, Christopher De
N1 - Publisher Copyright: © 2020 37th International Conference on Machine Learning, ICML 2020. All rights reserved.
PY - 2020
Y1 - 2020
N2 - Recent advances in deep representation learning on Riemannian manifolds extend classical deep learning operations to better capture the geometry of the manifold. One possible extension is the Fréchet mean, the generalization of the Euclidean mean; however, it has been difficult to apply because it lacks a closed form with an easily computable derivative. In this paper, we show how to differentiate through the Fréchet mean for arbitrary Riemannian manifolds. Then, focusing on hyperbolic space, we derive explicit gradient expressions and a fast, accurate, and hyperparameter-free Fréchet mean solver. This fully integrates the Fréchet mean into the hyperbolic neural network pipeline. To demonstrate this integration, we present two case studies. First, we apply our Fréchet mean to the existing Hyperbolic Graph Convolutional Network, replacing its projected aggregation to obtain state-of-the-art results on datasets with high hyperbolicity. Second, to demonstrate the Fréchet mean's capacity to generalize Euclidean neural network operations, we develop a hyperbolic batch normalization method that gives an improvement parallel to the one observed in the Euclidean setting.
AB - Recent advances in deep representation learning on Riemannian manifolds extend classical deep learning operations to better capture the geometry of the manifold. One possible extension is the Fréchet mean, the generalization of the Euclidean mean; however, it has been difficult to apply because it lacks a closed form with an easily computable derivative. In this paper, we show how to differentiate through the Fréchet mean for arbitrary Riemannian manifolds. Then, focusing on hyperbolic space, we derive explicit gradient expressions and a fast, accurate, and hyperparameter-free Fréchet mean solver. This fully integrates the Fréchet mean into the hyperbolic neural network pipeline. To demonstrate this integration, we present two case studies. First, we apply our Fréchet mean to the existing Hyperbolic Graph Convolutional Network, replacing its projected aggregation to obtain state-of-the-art results on datasets with high hyperbolicity. Second, to demonstrate the Fréchet mean's capacity to generalize Euclidean neural network operations, we develop a hyperbolic batch normalization method that gives an improvement parallel to the one observed in the Euclidean setting.
UR - http://www.scopus.com/inward/record.url?scp=85105139516&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85105139516
SP - 6349
EP - 6359
JO - 37th International Conference on Machine Learning, ICML 2020
JF - 37th International Conference on Machine Learning, ICML 2020
T2 - 37th International Conference on Machine Learning, ICML 2020
Y2 - 13 July 2020 through 18 July 2020
ER -
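The abstract defines the Fréchet mean as the point minimizing the sum of squared geodesic distances to a set of points on a manifold. The sketch below illustrates that definition on the Poincaré ball model of hyperbolic space using naive finite-difference gradient descent; the helper names (`poincare_dist`, `frechet_mean`) and the optimizer are illustrative assumptions, not the paper's exact solver, which derives closed-form gradients and a faster, hyperparameter-free iteration.

```python
import numpy as np

def poincare_dist(x, y):
    """Geodesic distance on the Poincaré ball model of hyperbolic space."""
    sq = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + 2.0 * sq / denom)

def frechet_mean(points, lr=0.05, steps=500, eps=1e-6):
    """Fréchet mean: argmin_m sum_i d(m, x_i)^2.

    Minimized here by finite-difference gradient descent in ambient
    coordinates (illustration only), projecting back inside the unit ball.
    """
    m = np.mean(points, axis=0)  # Euclidean mean as a warm start

    def objective(p):
        return sum(poincare_dist(p, x) ** 2 for x in points)

    dim = m.shape[0]
    for _ in range(steps):
        grad = np.zeros(dim)
        for i in range(dim):
            e = np.zeros(dim)
            e[i] = eps
            grad[i] = (objective(m + e) - objective(m - e)) / (2.0 * eps)
        m = m - lr * grad
        norm = np.linalg.norm(m)
        if norm >= 1.0:  # stay strictly inside the open unit ball
            m = m / norm * (1.0 - 1e-5)
    return m
```

For two points placed symmetrically about the origin, e.g. `[0.5, 0]` and `[-0.5, 0]`, the minimizer is the origin, matching the Euclidean case; with the Euclidean metric in place of `poincare_dist`, the Fréchet mean reduces to the ordinary arithmetic mean, which is the generalization the abstract refers to.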
ID: 301817763