# A Differentiable Gaussian-like Distribution on Hyperbolic Space for Gradient-Based Learning

```bibtex
@article{Nagano2019ADG,
  title   = {A Differentiable Gaussian-like Distribution on Hyperbolic Space for Gradient-Based Learning},
  author  = {Yoshihiro Nagano and Shoichiro Yamaguchi and Yasuhiro Fujita and Masanori Koyama},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1902.02992}
}
```

Hyperbolic space is a geometry that is known to be well-suited for representation learning of data with an underlying hierarchical structure. [...] Also, we can sample from this hyperbolic probability distribution without resorting to auxiliary means like rejection sampling. As applications of our distribution, we develop a hyperbolic analog of the variational autoencoder and a method for probabilistic word embedding in hyperbolic space. We demonstrate the efficacy of our distribution on various datasets.
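The sampling scheme summarized above can be made concrete: draw a Gaussian sample in the tangent space at the hyperboloid origin, parallel-transport it to the tangent space at the target point, and map it onto the manifold with the exponential map. A minimal NumPy sketch of this construction (function names and numerical tolerances are our own, not the paper's reference implementation):

```python
import numpy as np

def lorentz_inner(x, y):
    # Minkowski inner product: <x, y>_L = -x_0 y_0 + sum_i x_i y_i
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def exp_map(mu, u):
    # Exponential map on the hyperboloid at mu, for u tangent at mu
    norm_u = np.sqrt(max(lorentz_inner(u, u), 0.0))
    if norm_u < 1e-12:
        return mu
    return np.cosh(norm_u) * mu + np.sinh(norm_u) * (u / norm_u)

def parallel_transport(v, mu0, mu):
    # Transport tangent vector v from the tangent space at mu0
    # to the tangent space at mu along the connecting geodesic
    alpha = -lorentz_inner(mu0, mu)
    return v + lorentz_inner(mu - alpha * mu0, v) / (alpha + 1.0) * (mu0 + mu)

def sample_wrapped_normal(mu, cov, rng):
    # 1) Gaussian sample in the tangent space at the origin (1, 0, ..., 0)
    d = len(mu) - 1
    v_tilde = rng.multivariate_normal(np.zeros(d), cov)
    mu0 = np.zeros(d + 1)
    mu0[0] = 1.0
    v = np.concatenate([[0.0], v_tilde])  # embed into the tangent space at mu0
    # 2) parallel-transport to mu, 3) push onto the manifold via the exp map
    return exp_map(mu, parallel_transport(v, mu0, mu))
```

Every sample produced this way lies on the hyperboloid (its Minkowski norm is -1), which is what makes the density tractable without rejection sampling.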


#### 10 Citations

Hierarchical Representations with Poincaré Variational Auto-Encoders

- Mathematics, Computer Science
- ArXiv
- 2019

This work endows VAEs with a Poincaré ball model of hyperbolic geometry and derives the necessary methods to work with two main Gaussian generalisations on that space.

Mixed-curvature Variational Autoencoders

- Computer Science, Mathematics
- ICLR
- 2020

A Mixed-curvature Variational Autoencoder is developed, an efficient way to train a VAE whose latent space is a product of constant curvature Riemannian manifolds, where the per-component curvature is fixed or learnable.

Continuous Hierarchical Representations with Poincaré Variational Auto-Encoders

- Computer Science, Mathematics
- NeurIPS
- 2019

This work endows VAEs with a Poincaré ball model of hyperbolic geometry as a latent space and rigorously derives the necessary methods to work with two main Gaussian generalisations on that space.

Increasing Expressivity of a Hyperspherical VAE

- Computer Science, Mathematics
- ArXiv
- 2019

This work proposes to extend the usability of hyperspherical parameterizations to higher dimensions using a product-space instead, showing improved results on a selection of image datasets.

Poincaré Wasserstein Autoencoder

- Computer Science
- ArXiv
- 2019

This work presents a reformulation of the recently proposed Wasserstein autoencoder framework on a non-Euclidean manifold, the Poincaré ball model of hyperbolic space, and uses its intrinsic hierarchy to impose structure on the learned latent space representations.

Understanding in Artificial Intelligence

- Computer Science
- ArXiv
- 2021

This work surveys progress in benchmark development for measuring the understanding capabilities of AI methods and reviews how current methods develop such capabilities.

MOOC-Based Mixed Teaching Research on Microcomputer Principle Courses in Colleges and Universities

- Computer Science
- eLEOT
- 2019

This paper compares the advantages and disadvantages of the MOOC teaching mode with the traditional teaching mode, and constructs a MOOC-based teaching platform that addresses the problems of traditional teaching.

Riemannian Continuous Normalizing Flows

- Mathematics, Computer Science
- NeurIPS
- 2020

Riemannian continuous normalizing flows are introduced: a model that admits the parametrization of flexible probability measures on smooth manifolds by defining flows as solutions to ordinary differential equations.

Variational Autoencoders with Riemannian Brownian Motion Priors

- Computer Science, Mathematics
- ICML
- 2020

This work assumes a Riemannian structure over the latent space, which constitutes a more principled geometric view of the latent codes, replaces the standard Gaussian prior with a Riemannian Brownian motion prior, and demonstrates that this prior significantly increases model capacity using only one additional scalar parameter.

#### References

Showing 1–10 of 25 references

A Wrapped Normal Distribution on Hyperbolic Space for Gradient-Based Learning

- Mathematics, Computer Science
- ICML
- 2019

A novel hyperbolic distribution called the pseudo-hyperbolic Gaussian, a Gaussian-like distribution on hyperbolic space whose density can be evaluated analytically and differentiated with respect to the parameters, enables gradient-based learning of probabilistic models on hyperbolic space that could never have been considered before.

Poincaré Embeddings for Learning Hierarchical Representations

- Computer Science, Mathematics
- NIPS
- 2017

This work introduces a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space, or more precisely into an n-dimensional Poincaré ball, and introduces an efficient algorithm to learn the embeddings based on Riemannian optimization.
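Embeddings in the Poincaré ball, as above, are trained using the closed-form hyperbolic distance between points of the open unit ball. A minimal NumPy sketch of that distance (the function name is illustrative):

```python
import numpy as np

def poincare_distance(u, v):
    # d(u, v) = arcosh(1 + 2 ||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
    # for u, v strictly inside the unit ball
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v))
    return np.arccosh(1.0 + 2.0 * sq_dist / denom)
```

Because the denominator shrinks as points approach the boundary, distances grow without bound near the edge of the ball, which is exactly the property that gives hierarchies room to branch.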

Representation Tradeoffs for Hyperbolic Embeddings

- Computer Science, Mathematics
- ICML
- 2018

A hyperbolic generalization of multidimensional scaling (h-MDS), which offers consistently low distortion even with few dimensions across several datasets, is proposed, and a PyTorch-based implementation is designed that can handle incomplete information and is scalable.

Learning Continuous Hierarchies in the Lorentz Model of Hyperbolic Geometry

- Computer Science, Mathematics
- ICML
- 2018

It is shown that an embedding in hyperbolic space can reveal important aspects of a company's organizational structure as well as reveal historical relationships between language families.

Hyperbolic Attention Networks

- Computer Science
- ICLR
- 2019

This work introduces hyperbolic attention networks to endow neural networks with enough capacity to match the complexity of data with hierarchical and power-law structure, re-expressing the ubiquitous mechanism of soft attention in terms of operations defined on the hyperboloid and Klein models.

Hyperbolic Entailment Cones for Learning Hierarchical Embeddings

- Computer Science, Mathematics
- ICML
- 2018

This work presents a novel method to embed directed acyclic graphs through hierarchical relations as partial orders defined using a family of nested geodesically convex cones, and proves that these entailment cones admit an optimal shape with a closed-form expression in both Euclidean and hyperbolic spaces.

Adversarial Autoencoders with Constant-Curvature Latent Manifolds

- Mathematics, Computer Science
- Appl. Soft Comput.
- 2019

This work introduces the constant-curvature manifold adversarial autoencoder (CCM-AAE), a probabilistic generative model trained to represent a data distribution on a constant-curvature manifold, and is the first unified framework to seamlessly deal with such manifolds of different curvatures.

Auto-Encoding Variational Bayes

- Mathematics, Computer Science
- ICLR
- 2014

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.

Discrete Variational Autoencoders

- Mathematics, Computer Science
- ICLR
- 2017

A novel method to train a class of probabilistic models with discrete latent variables using the variational autoencoder framework, including backpropagation through the discrete hidden variables, which outperforms state-of-the-art methods on the permutation-invariant MNIST, Omniglot, and Caltech-101 Silhouettes datasets.

Categorical Reparameterization with Gumbel-Softmax

- Mathematics, Computer Science
- ICLR
- 2017

It is shown that the Gumbel-Softmax estimator outperforms state-of-the-art gradient estimators on structured output prediction and unsupervised generative modeling tasks with categorical latent variables, and enables large speedups on semi-supervised classification.
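The Gumbel-Softmax trick perturbs category logits with Gumbel noise and relaxes the resulting argmax into a temperature-controlled softmax, making the sample differentiable in the logits. A minimal sketch, assuming a plain NumPy setting rather than any particular deep-learning framework:

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    # Gumbel(0, 1) noise via inverse transform; clip away exact 0/1 draws
    u = np.clip(rng.uniform(size=logits.shape), 1e-12, 1.0 - 1e-12)
    g = -np.log(-np.log(u))
    # Temperature-controlled softmax over the perturbed logits:
    # tau -> 0 approaches a one-hot sample, large tau approaches uniform
    y = (logits + g) / tau
    y = y - y.max()  # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()
```

The output is a point on the probability simplex that concentrates on the argmax category as the temperature is annealed toward zero.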