# nLab information metric


## Idea

In information geometry, a (Fisher-)information metric is a Riemannian metric on a manifold of probability distributions over some probability space $X$ (the latter often assumed to be finite).

## Definition

On a finite probability space $X \in$ Set, a positive measure is a function $\rho : X \to \mathbb{R}_+$, and a probability distribution is such a $\rho$ with $\sum_{x \in X} \rho(x) = 1$.
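As an illustrative sketch (not part of the article): on a finite set $X$, a positive measure can be represented as a plain mapping from points to positive reals, and normalizing it by its total mass yields a probability distribution. The set `X = {a, b, c}` and the values chosen here are hypothetical.

```python
# A positive measure on the finite set X = {"a", "b", "c"},
# represented as a dict of positive weights (values are arbitrary).
rho = {"a": 2.0, "b": 1.0, "c": 1.0}

# Normalize by the total mass to obtain a probability distribution,
# i.e. a positive measure with sum_{x in X} p(x) = 1.
total = sum(rho.values())
p = {x: v / total for x, v in rho.items()}

assert abs(sum(p.values()) - 1.0) < 1e-12
```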

The space of probability distributions on $X$ is a submanifold of $\mathbb{R}_{\geq 0}^{|X|}$. For $\left\{\frac{\partial}{\partial x^i}\right\}$ the canonical basis of tangent vectors on this wedge of Cartesian space, the information metric $g$ is given by

$$g\left(\frac{\partial}{\partial x^i}, \frac{\partial}{\partial x^j}\right)(\rho) = \frac{1}{\rho(x^i)}\, \delta_{i j} \,.$$
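This diagonal form can be cross-checked numerically against the Fisher-information expression $g_{i j} = \sum_k \rho(x^k)\, \partial_i \log\rho(x^k)\, \partial_j \log\rho(x^k)$, using the $\rho(x^i)$ themselves as coordinates, so that $\partial_i \log\rho(x^k) = \delta_{i k}/\rho(x^k)$. The following sketch (with an arbitrarily chosen $\rho$ on a 3-point space) is illustrative, not from the article:

```python
import numpy as np

# An arbitrary probability distribution on a 3-point space X.
rho = np.array([0.5, 0.3, 0.2])

# The information metric of the article: g_ij = delta_ij / rho(x^i).
g = np.diag(1.0 / rho)

# Fisher-information cross-check:
# g_ij = sum_k rho_k * (d log rho_k / d rho_i) * (d log rho_k / d rho_j),
# where d log rho_k / d rho_i = delta_ik / rho_k.
dlog = np.diag(1.0 / rho)  # dlog[i, k] = delta_ik / rho_k
fisher = np.einsum('k,ik,jk->ij', rho, dlog, dlog)

assert np.allclose(g, fisher)
```

The check works because $\sum_k \rho_k (\delta_{i k}/\rho_k)(\delta_{j k}/\rho_k) = \delta_{i j}/\rho_i$, which is exactly the diagonal metric above.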

## References

• L. L. Campbell, *An extended Čencov characterization of the information metric*, Proc. Amer. Math. Soc. 98 (1986), 135-141.

Created on June 17, 2011 17:48:10 by Urs Schreiber (89.204.137.105)