Computational Information Geometry by Frank Nielsen.
From the homepage:
Computational information geometry deals with the study and design of efficient algorithms in information spaces using the language of geometry (such as invariance, distance, projection, ball, etc.). Historically, the field was pioneered by C.R. Rao in 1945, who proposed to use the Fisher information metric as the Riemannian metric. This seminal work gave birth to the geometrization of statistics (e.g., statistical curvature and second-order efficiency). In statistics, invariance (under non-singular one-to-one reparametrization and sufficient statistics) yields the class of f-divergences, including the celebrated Kullback-Leibler divergence. The differential geometry of f-divergences can be analyzed using dual alpha-connections. Common algorithms in machine learning (such as clustering, expectation-maximization, statistical estimation, regression, independent component analysis, boosting, etc.) can be revisited and further explored using those concepts. Nowadays, the framework of computational information geometry opens up novel horizons in music, multimedia, radar, and finance/economics.
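To make the f-divergence remark concrete, here is a minimal Python sketch (function names are illustrative, not from the site) showing that the f-divergence with generator f(t) = t log t reduces to the Kullback-Leibler divergence for discrete distributions:

```python
import math

# An f-divergence between discrete distributions p and q is
#   D_f(p || q) = sum_i q_i * f(p_i / q_i)
# for a convex generator f with f(1) = 0.  Choosing f(t) = t * log(t)
# recovers the Kullback-Leibler divergence KL(p || q).

def f_divergence(p, q, f):
    """Generic f-divergence; assumes p and q are strictly positive."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Kullback-Leibler divergence computed directly from its definition."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
via_f = f_divergence(p, q, lambda t: t * math.log(t))
direct = kl_divergence(p, q)
print(abs(via_f - direct) < 1e-12)  # the two definitions agree
```

The identity follows because q * (p/q) * log(p/q) simplifies term by term to p * log(p/q).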
The site offers numerous resources, including publications, links to conference proceedings (some with videos), software, and other materials, among them a trilingual dictionary (Japanese, English, French) of terms in information geometry.