(Image from: http://www.wired.com/images/article/magazine/1607/pb_intro_f.jpg)
This data explosion has huge implications for organisations that need to turn that data into meaningful information. Chris Anderson, editor in chief of Wired, believes it is making the scientific method increasingly impractical.
Well, I tend to agree with this. As we say in the semiconductor industry, if you can't model the damn circuit, simulate the hell out of it.
But faced with massive data, this approach to science — hypothesize, model, test — is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on. (Link)
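Just to make that circuit quip concrete, here is a toy sketch of my own (nothing from Anderson's piece, and the component values are made up): rather than grinding through a closed-form worst-case tolerance analysis for even a trivial voltage divider, you can brute-force it by sampling component values and looking at the spread of the output.

```python
import random

# Toy Monte Carlo tolerance analysis of a resistive voltage divider.
# Instead of deriving worst-case bounds analytically, sample the
# component values many times and inspect the resulting distribution.
V_IN = 5.0                        # supply voltage (V), illustrative value
R1_NOM, R2_NOM = 10_000, 10_000   # nominal resistances (ohms), illustrative
TOL = 0.05                        # assumed 5% component tolerance

def sample_vout():
    # Draw each resistor uniformly within its tolerance band.
    r1 = R1_NOM * random.uniform(1 - TOL, 1 + TOL)
    r2 = R2_NOM * random.uniform(1 - TOL, 1 + TOL)
    return V_IN * r2 / (r1 + r2)

samples = [sample_vout() for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(f"mean Vout = {mean:.3f} V, "
      f"range = {min(samples):.3f} .. {max(samples):.3f} V")
```

The point isn't the divider, of course; it's that once you have enough compute (or enough data), hammering the problem with samples often gets you a usable answer faster than building and validating an elegant model.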