# Information geometry

In mathematics, **information geometry** is the study of probability and information by way of differential geometry. It reached maturity through the work of Shun'ichi Amari in the 1980s, whose book *Differential-geometrical methods in statistics* is currently the canonical reference.

Information geometry is based primarily on the Fisher information metric:

$$g_{jk}(\theta) = \int \frac{\partial \ln p(x;\theta)}{\partial \theta^j}\, \frac{\partial \ln p(x;\theta)}{\partial \theta^k}\, p(x;\theta)\, dx$$

Substituting the information content *i* = −ln *p* from information theory, the formula becomes:

$$g_{jk}(\theta) = \int \frac{\partial i}{\partial \theta^j}\, \frac{\partial i}{\partial \theta^k}\, p\, dx$$
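For a concrete case, the Fisher metric of a one-parameter family can be computed directly from the definition above and checked against its known closed form. The sketch below does this for the Bernoulli family, whose Fisher information is 1/(θ(1 − θ)); the function name and finite-difference scheme are illustrative choices, not part of any standard API.

```python
import math

def fisher_information_bernoulli(theta, eps=1e-5):
    """Estimate the Fisher information of Bernoulli(theta) from the definition
    g = E[(d/dtheta ln p(x; theta))^2], summing over the two outcomes x = 0, 1."""
    def log_p(x, t):
        # Log-likelihood of a single Bernoulli observation.
        return x * math.log(t) + (1 - x) * math.log(1 - t)

    g = 0.0
    for x in (0, 1):
        # Central finite difference approximates the score d/dtheta ln p.
        score = (log_p(x, theta + eps) - log_p(x, theta - eps)) / (2 * eps)
        p = theta if x == 1 else 1 - theta
        g += p * score ** 2
    return g

# Compare with the analytic value 1 / (theta * (1 - theta)).
```

At θ = 0.5 the analytic value is 4, and the numerical estimate agrees to several decimal places.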

Thus, if a point in information space represents the state of a system, then the trajectory of that point will, on average, be a random walk through information space, i.e. it will diffuse according to Brownian motion.

With this in mind, the information space can be thought of as a fitness landscape, a trajectory through this space being an "evolution". The Brownian motion of evolution trajectories thus represents the no-free-lunch phenomenon discussed by Stuart Kauffman.

### Natural Gradient

An important concept in information geometry is the **natural gradient**. It adjusts the gradient update of a learning rule to take into account the curvature of the (prior) statistical differential manifold, by way of the Fisher information metric. This concept has important applications in blind signal separation, neural networks, artificial intelligence, and other engineering problems that deal with information, and experimental results have shown that applying it can lead to substantial performance gains.
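The idea can be sketched on the Bernoulli family from above: preconditioning the ordinary log-likelihood gradient by the inverse Fisher information F(θ)⁻¹ = θ(1 − θ) turns the update into a step directly toward the sample mean. This is a minimal illustration with hypothetical names, not a production natural-gradient implementation.

```python
def natural_gradient_fit(data, theta=0.5, lr=0.5, steps=50):
    """Fit the Bernoulli parameter theta by natural-gradient ascent
    on the average log-likelihood of the observed 0/1 data."""
    mean = sum(data) / len(data)
    for _ in range(steps):
        # Ordinary gradient of the average log-likelihood.
        grad = (mean - theta) / (theta * (1 - theta))
        # Precondition by the inverse Fisher information F(theta)^-1 = theta*(1-theta);
        # the natural-gradient step simplifies to (mean - theta).
        theta += lr * theta * (1 - theta) * grad
    return theta

theta_hat = natural_gradient_fit([1, 1, 0, 1, 0, 1, 1, 0])  # sample mean is 0.625
```

Because the Fisher preconditioning cancels the curvature of the parameterization, the iterate converges geometrically to the maximum-likelihood estimate (the sample mean), regardless of how close θ is to the boundary.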

## References

- Shun'ichi Amari, *Differential-geometrical methods in statistics*, Lecture Notes in Statistics, Springer-Verlag, Berlin, 1985
- Shun'ichi Amari, Hiroshi Nagaoka, *Methods of information geometry*, Translations of Mathematical Monographs, v. 191, American Mathematical Society, 2000
