Archive for the ‘Information Field Theory’ Category

Information field theory

Sunday, January 27th, 2013


From the webpage:

Information field theory (IFT) is information theory, the logic of reasoning under uncertainty, applied to fields. A field can be any quantity defined over some space, e.g. the air temperature over Europe, the magnetic field strength in the Milky Way, or the matter density in the Universe. IFT describes how data and knowledge can be used to infer field properties. Mathematically it is a statistical field theory and exploits many of the tools developed for such. Practically, it is a framework for signal processing and image reconstruction.

IFT is fully Bayesian. How else can infinitely many field degrees of freedom be constrained by finite data?

It can be used without knowledge of Feynman diagrams. There is a full toolbox of methods.

It reproduces many known, well-working algorithms. This should be reassuring.

And there was certainly previous work in a similar spirit. See below for IFT publications and earlier works.

Anyhow, in many cases IFT provides novel rigorous ways to extract information from data.

Please, have a look! The specific literature is listed below and more general highlight articles on the right hand side.

Just in case you want to be on the cutting edge of information extraction. 😉

And you might note that Feynman diagrams are graphical representations (maps) of complex mathematical equations.
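The "infinitely many field degrees of freedom constrained by finite data" point can be made concrete with the Wiener filter, the textbook Bayesian field reconstruction. The sketch below is a minimal numpy illustration, not code from the IFT group; the power spectrum, noise level, and grid size are toy assumptions.

```python
import numpy as np

# Toy Wiener filter: infer a 1-D field s from noisy data d = R s + n.
# Posterior mean: m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d, where S and N are
# the (assumed known) signal and noise covariances. All numbers here are
# illustrative assumptions, not values from any IFT publication.

rng = np.random.default_rng(0)
npix = 64

# Signal covariance: diagonal in harmonic space with a falling power spectrum.
k = np.abs(np.fft.fftfreq(npix) * npix)
power = 1.0 / (1.0 + k**2)                       # toy power spectrum
F = np.fft.fft(np.eye(npix)) / np.sqrt(npix)     # unitary DFT matrix
S = (F.conj().T @ np.diag(power) @ F).real       # signal covariance, pixel basis

R = np.eye(npix)                                 # response: direct measurements
N = 0.1 * np.eye(npix)                           # white-noise covariance

s = rng.multivariate_normal(np.zeros(npix), S)   # a signal realization
d = R @ s + rng.multivariate_normal(np.zeros(npix), N)  # noisy data

# Wiener filter: posterior mean of the field given the data.
D = np.linalg.inv(np.linalg.inv(S) + R.T @ np.linalg.inv(N) @ R)
m = D @ R.T @ np.linalg.inv(N) @ d
```

The prior covariance S is what tames the infinitely many (here, merely 64) degrees of freedom: the reconstruction `m` sits closer to the true field `s` than the raw data `d` do, because the falling power spectrum suppresses noise-dominated small-scale modes.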

NIFTY: Numerical information field theory for everyone

Sunday, January 27th, 2013


From the post:

Signal reconstruction algorithms can now be developed more elegantly because scientists at the Max Planck Institute for Astrophysics released a new software package for data analysis and imaging, NIFTY, that is useful for mapping in any number of dimensions or spherical projections without encoding the dimensional information in the algorithm itself. The advantage is that once a special method for image reconstruction has been programmed with NIFTY it can easily be applied to many other applications. Although it was originally developed with astrophysical imaging in mind, NIFTY can also be used in other areas such as medical imaging.

Behind most of the impressive telescope images that capture events in the depths of the cosmos is a lot of work and computing power. The raw data from many instruments are not intelligible, even to experts, without highly complex imaging algorithms. A simple radio telescope scans the sky and provides long series of numbers. Networks of radio telescopes act as interferometers and measure the spatial vibration modes of the sky's brightness rather than an image directly. Space-based gamma-ray telescopes identify sources by the pattern generated by the shadow mask in front of their detectors. In all of these cases, sophisticated algorithms are necessary to generate images from the raw data. The same applies to medical imaging devices such as computer tomographs and magnetic resonance scanners.

Previously, each of these imaging problems required a special computer program adapted to the specifications and geometry of the survey area to be represented. But many of the underlying concepts are generic and would ideally be programmed only once, if only the computer could automatically take care of the geometric details.

With this in mind, the researchers in Garching have developed and now released the software package NIFTY, which makes this possible. An algorithm written using NIFTY to solve a problem in one dimension can just as easily be applied, after a minor adjustment, in two or more dimensions or on spherical surfaces. NIFTY handles each situation while correctly accounting for all geometrical quantities. This allows imaging software to be developed much more efficiently because testing can be done quickly in one dimension before application to higher-dimensional spaces, and code written for one application can easily be recycled for use in another.
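The "write once, run in any dimension" idea can be illustrated in plain numpy (these are not NIFTY's actual classes or calls) with a harmonic-space Gaussian smoother whose code never mentions the number of dimensions:

```python
import numpy as np

# A dimension-agnostic Gaussian smoother: the same function handles 1-D
# signals, 2-D images, or higher-dimensional grids, because it derives the
# geometry from the input's shape. A plain-numpy illustration of the
# principle, not NIFTY's API.

def gaussian_smooth(field, sigma):
    """Smooth `field` with a Gaussian kernel of width `sigma` (in pixels)."""
    # Build the squared harmonic-space frequency for whatever shape arrives.
    grids = np.meshgrid(*[np.fft.fftfreq(n) for n in field.shape],
                        indexing="ij")
    k2 = sum(g**2 for g in grids)
    # Fourier transform of a Gaussian of width sigma.
    kernel = np.exp(-2.0 * (np.pi * sigma)**2 * k2)
    return np.fft.ifftn(np.fft.fftn(field) * kernel).real
```

The same call runs unchanged on a line, an image, or a volume: `gaussian_smooth(np.random.rand(128), 2.0)` and `gaussian_smooth(np.random.rand(64, 64), 2.0)` both work, which is the kind of reuse the press release describes.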

NIFTY stands for “Numerical Information Field Theory”. The relatively young field of information field theory aims to provide recipes for optimal mapping, completely exploiting the information and knowledge contained in data. NIFTY now simplifies the programming of such formulas for imaging and data analysis, regardless of whether they come from information field theory or from somewhere else, by providing a natural language for translating mathematics into software.

Your computer is more powerful than those used to develop generations of atomic bombs.

A wealth of scientific and other data is as close as the next Ethernet port.

Some of the best software in the world is available for free download.

So, what have you discovered lately?

NIFTY is a reminder that discovery is a question of will, not availability of resources.

NIFTY – Numerical Information Field Theory Documentation, download, etc.

From the NIFTY webpage:

NIFTY [1], “Numerical Information Field Theory”, is a versatile library designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency.

NIFTY offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. Thereby, the correct normalization of operations on fields is taken care of automatically, without burdening the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTY permits its user to rapidly prototype algorithms in 1D and then apply the developed code in higher-dimensional settings of real-world problems. The set of spaces on which NIFTY operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those.
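The "correct normalization" point is what makes grid independence work: operations on fields must carry the volume element of the space, so results do not change when the resolution does. Here is a toy sketch of that idea with hypothetical stand-in classes, not NIFTY's real interface:

```python
import numpy as np

# Toy sketch of the space/field abstraction: a field knows its space, and
# the space supplies the volume element, so integrals are resolution-
# independent. Illustrative stand-ins, not NIFTY's actual classes.

class RegularGrid:
    """A regular grid over [0, 1) with `npix` pixels."""
    def __init__(self, npix):
        self.npix = npix
        self.dx = 1.0 / npix          # volume element of one pixel

class Field:
    def __init__(self, space, values):
        self.space = space
        self.values = np.asarray(values, dtype=float)

    def integral(self):
        # Sum weighted by the volume element, not a bare pixel sum.
        return self.space.dx * self.values.sum()

# The integral of f(x) = x over [0, 1) is 1/2 at any resolution.
coarse = RegularGrid(10)
fine = RegularGrid(10_000)
f_coarse = Field(coarse, (np.arange(coarse.npix) + 0.5) * coarse.dx)
f_fine = Field(fine, (np.arange(fine.npix) + 0.5) * fine.dx)
```

A bare `values.sum()` would give 5.0 on the coarse grid and 5000.0 on the fine one; carrying `dx` makes both integrals agree, which is exactly the bookkeeping a library like NIFTY automates.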

I first saw this at: Software Package for All Types of Imaging, with the usual fun and games of running down useful links.

Information Field Theory

Wednesday, December 7th, 2011


May be something, may be nothing.

I saw a news flash about the use of this technique to combine 41,000 observations into a magnetic map of the Milky Way, subject to a lot of noise and smoothing of the data.

Which made me think that perhaps, just perhaps, this technique could be used across a semantic field?

From the webpage:

Information field theory (IFT) is information theory, the logic of reasoning under uncertainty, applied to fields. A field can be any quantity defined over some space, e.g. the air temperature over Europe, the magnetic field strength in the Milky Way, or the matter density in the Universe. IFT describes how data and knowledge can be used to infer field properties. Mathematically it is a statistical field theory and exploits many of the tools developed for such. Practically, it is a framework for signal processing and image reconstruction.

All the examples I found were in the physical sciences, but I would check closely before claiming to be the first to use the technique in a social science context.