KCL, Strand
room: Safra Lecture Theatre
abstract: Geometric learning on probability distributions with the Fisher-Rao metric
Information geometry is a differential geometric approach to probability theory and statistics. In this approach, probability distributions are seen as points of a differentiable manifold, and the Fisher information is used to define a Riemannian metric. The induced geometric tools, such as geodesics, geodesic distances and intrinsic means, have proven useful for interpolating between, comparing, averaging and segmenting objects modeled by probability densities.
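For concreteness, on a parametric family with densities $p(x;\theta)$, the Fisher information metric referred to above is the standard one,
$$ g_{ij}(\theta) \;=\; \mathbb{E}_{x\sim p_\theta}\!\left[\frac{\partial \log p(x;\theta)}{\partial\theta_i}\,\frac{\partial \log p(x;\theta)}{\partial\theta_j}\right], $$
and the Fisher-Rao distance is the geodesic (Riemannian) distance this metric induces on the parameter manifold.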
In this talk, we will give an introduction to geometric learning and information geometry. In particular, we will investigate the Fisher-Rao geometry of the beta and Dirichlet distributions, showing that it is negatively curved and geodesically complete. These properties, also shared by other parametric families such as Gaussian distributions, guarantee the existence and uniqueness of geodesics and means. This makes the Fisher-Rao metric a suitable choice, within these parametric families, for learning procedures such as K-means clustering.
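As a rough illustration of how such a distance can feed a clustering procedure (a sketch, not part of the talk material): for univariate Gaussian distributions the Fisher-Rao distance admits a well-known closed form, which can be plugged into any distance-based clustering routine. The function name and the sample parameters below are illustrative choices only; the beta and Dirichlet families discussed in the talk do not, in general, admit such a closed form and require numerical geodesic computations.

```python
import numpy as np

def fisher_rao_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form Fisher-Rao distance between two univariate Gaussians.

    The Fisher metric ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2 identifies the
    Gaussian family (up to scaling) with the hyperbolic upper half-plane,
    which yields an explicit geodesic distance.
    """
    num = (mu1 - mu2) ** 2 / 2.0 + (sigma1 - sigma2) ** 2
    return np.sqrt(2.0) * np.arccosh(1.0 + num / (2.0 * sigma1 * sigma2))

# Pairwise Fisher-Rao distances for a few fitted Gaussians, ready to feed
# a distance-based clustering routine (e.g. K-medoids); parameters are
# purely illustrative.
params = np.array([[0.0, 1.0], [0.5, 1.2], [3.0, 0.5]])  # rows of (mu, sigma)
D = np.array([[fisher_rao_gaussian(m1, s1, m2, s2)
               for (m2, s2) in params] for (m1, s1) in params])
print(np.round(D, 3))
```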