A Big Data Revolution in Astrophysics by Ian Armas Foster.
Ian writes:
Humanity has been studying the stars for as long as it has been able to gaze at them. The study of the stars has led to one revelation after another: that the planet is round, that we are not the center of the universe; it even spawned Einstein’s general theory of relativity.
As more powerful telescopes are developed, more is learned about the wild happenings in space, including black holes, binary star systems, the movement of galaxies, and even the detection of the Cosmic Microwave Background, which may hint at the beginnings of the universe.
However, all of these discoveries were made relatively slowly. They relied on relaying information to other stations whose observatories might not be active for hours or even days, a process that puts a painful amount of time between capturing an image and recognizing a potential discovery in it.
Solving these problems would be huge for astrophysics. According to Peter Nugent, Senior Staff Scientist at Lawrence Berkeley National Laboratory, big data is on its way to doing just that. Nugent has been an expert voice on this issue following his experiences with an ambitious project known as the Palomar Transient Factory.
It’s a good post and is likely to get you interested in astronomical (in both senses) data problems.
Quibble: Why no links to the Palomar Transient Factory? This happens too often at too many sites to be an oversight. We are all writing in hyperlink-capable media, yes? Why the poverty of hyperlinks?
BTW:
Palomar Transient Factory, and
Access public spectra (WISEASS)
I don’t mind if you visit other sites. I write to facilitate your use of resources on the WWW. Maybe that’s the difference.