The Information geometry reference article from the English Wikipedia on 24-Jul-2004
(provided by Fixed Reference: snapshots of Wikipedia)

Information geometry

In mathematics, information geometry is the study of probability and information by way of differential geometry. It reached maturity through the work of Shun'ichi Amari in the 1980s, with what is currently the canonical reference book: Differential-geometrical methods in statistics.

Information geometry is based primarily on the Fisher information metric:

    g_{jk}(\theta) = \int \frac{\partial \ln p(x,\theta)}{\partial \theta^j} \, \frac{\partial \ln p(x,\theta)}{\partial \theta^k} \, p(x,\theta) \, dx

Substituting i(x,\theta) = -\ln p(x,\theta) from information theory, the formula becomes:

    g_{jk}(\theta) = \int \frac{\partial i(x,\theta)}{\partial \theta^j} \, \frac{\partial i(x,\theta)}{\partial \theta^k} \, p(x,\theta) \, dx = E\left[ \frac{\partial i}{\partial \theta^j} \, \frac{\partial i}{\partial \theta^k} \right]

This can be thought of intuitively as: "The distance between two points on a statistical differential manifold is the amount of information between them, i.e. the informational difference between them."
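As a small illustration of the expectation formula above (a sketch not taken from the article; the function name is hypothetical), the Fisher information of a one-parameter Bernoulli family can be computed directly as the expectation of the squared score, and it matches the well-known closed form 1 / (p (1 - p)):

```python
import math

def fisher_information_bernoulli(p):
    """Fisher information g(p) = E[(d/dp ln p(x; p))^2] for Bernoulli(p).

    The expectation over x in {0, 1} is computed exactly:
    ln p(1; p) = ln p and ln p(0; p) = ln(1 - p).
    """
    score_x1 = 1.0 / p            # d/dp ln p       (score at x = 1)
    score_x0 = -1.0 / (1.0 - p)   # d/dp ln(1 - p)  (score at x = 0)
    return p * score_x1 ** 2 + (1.0 - p) * score_x0 ** 2

# Agrees with the closed form 1 / (p (1 - p)):
p = 0.3
assert math.isclose(fisher_information_bernoulli(p), 1.0 / (p * (1.0 - p)))
```

In one dimension the metric is a single number; for a family with several parameters the same expectation yields the full matrix g_{jk}.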

Thus, if a point in information space represents the state of a system, then the trajectory of that point will, on average, be a random walk through information space, i.e. will diffuse according to Brownian motion.

With this in mind, the information space can be thought of as a fitness landscape, a trajectory through this space being an "evolution". The Brownian motion of evolution trajectories thus represents the no-free-lunch phenomenon discussed by Stuart Kauffman.

Natural Gradient

An important concept in information geometry is the natural gradient. The natural gradient adjusts the ordinary gradient of a learning rule's energy function to take into account the curvature of the (prior) statistical differential manifold, by way of the Fisher information metric: the gradient is premultiplied by the inverse of the metric before the update is applied.
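A minimal sketch of this idea (not from the article; the function name and learning rate are illustrative assumptions): fitting a Bernoulli parameter p to data with empirical mean m by gradient descent on the negative log-likelihood. The natural-gradient update rescales the ordinary gradient by the inverse Fisher information G(p)^-1 = p (1 - p), which here reduces the step to a simple pull of p toward m regardless of where p sits on the manifold:

```python
def natural_gradient_step(p, m, lr=0.5):
    """One natural-gradient step on the Bernoulli negative log-likelihood.

    m is the empirical mean of the observed 0/1 data.  The ordinary
    gradient is divided by the Fisher information, i.e. multiplied by
    G(p)^-1 = p (1 - p); algebraically the update becomes
    p <- p - lr * (p - m).
    """
    grad = -m / p + (1.0 - m) / (1.0 - p)    # d/dp of the negative log-likelihood
    fisher = 1.0 / (p * (1.0 - p))           # Fisher information metric g(p)
    return p - lr * grad / fisher            # natural-gradient update

p = 0.9
for _ in range(50):
    p = natural_gradient_step(p, m=0.25)
# p converges toward the empirical mean 0.25
```

With the plain gradient, the effective step size would depend strongly on p (the gradient blows up near the boundary of the parameter space); the metric correction removes that dependence, which is the practical appeal of the natural gradient in the applications listed below.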

This concept has many important applications in blind signal separation, neural networks, artificial intelligence, and other engineering problems that deal with information. Experimental results have shown that application of the concept leads to substantial performance gains.

