The End Of Theory?

Chris Anderson, the editor in chief of Wired Magazine, last week wrote an article, which you can find at the Edge, proclaiming the end of theory.

Anderson claims that our progress in storing and analyzing large amounts of data makes the old-fashioned approach to science – hypothesize, model, test – obsolete. His argument is based on the possibility of analyzing data statistically with increasing efficiency, for example online behavior: “Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.”

This, he seems to believe, makes models entirely unnecessary. He boldly extends his technologically enthusiastic future vision to encompass all of science:

“Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on.

Now biology is heading in the same direction... ”

The examples he provides rely on statistical analysis of data. It doesn’t seem to occur to him that this isn’t all of science. It strikes me as necessary to point out that the reason we develop models is to understand. Fitting a collection of data is part of it, but we construct a model to gain insight and to make progress based on what we have learned. The point is to go beyond the range in which we have data.
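The trouble with going beyond the range of the data can be made concrete with a toy example of my own (not one from Anderson's article): a polynomial fit, using NumPy, that reproduces its training data almost perfectly yet fails badly as soon as you leave the sampled range.

```python
import numpy as np

# Toy illustration: fit a polynomial to samples of sin(x) on [0, pi].
x_train = np.linspace(0, np.pi, 50)
y_train = np.sin(x_train)

# A degree-9 polynomial reproduces the sampled range almost perfectly...
coeffs = np.polyfit(x_train, y_train, deg=9)
poly = np.poly1d(coeffs)
in_range_err = np.max(np.abs(poly(x_train) - y_train))

# ...but outside the data, the fit diverges wildly from the function
# it never "understood": it captured a correlation, not a principle.
x_test = np.linspace(2 * np.pi, 3 * np.pi, 50)
out_range_err = np.max(np.abs(poly(x_test) - np.sin(x_test)))

print(in_range_err)   # very small: the fit "works" where there is data
print(out_range_err)  # large: extrapolation fails without a model
```

The fit is an optimal use of the available data, yet it predicts nothing beyond it; a one-line model, y = sin(x), does.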

If you collect petabytes upon petabytes about human behavior or genomes and analyze them with ever more sophisticated codes, this is certainly useful. Increasingly better tools can indeed lead to progress in various areas of science, predominantly in those areas that struggle with huge amounts of data and that will benefit greatly from pattern recognition and efficient data classification. But will you ever be able to see farther than others by standing on the shoulders of a database?

If you had collected a googol of examples of stable hydrogen atoms, would this have led you to discover quantum mechanics, and all the achievements following from it? If you had collected data describing all the motions of stars and galaxies in minuscule detail, would this have led you to conclude that space-time is a four-dimensional continuum? Would you ever have understood gravitational lensing? Would you ever have been able to conclude from the data you gathered that the universe is expanding? You could have assembled the whole particle data booklet as a collection of cross-sections measured in experiments, and whatever you do within that range you could predict reasonably well. But would this have let you predict the Omega minus, the tau, the Higgs?

Anderson concludes:

“The new availability of huge amounts of data, along with the statistical tools to crunch these numbers, offers a whole new way of understanding the world. Correlation supersedes causation, and science can advance even without coherent models, unified theories, or really any mechanistic explanation at all.”

With data analysis only, we might be able to discover hidden knowledge. But without models, science cannot advance beyond the optimal use of available data; without models, the frontiers of our knowledge are set by computing power, not by ingenuity. Making the crucial step of identifying a basic principle and extending it beyond the current reach is (at least so far) an entirely human enterprise. The requirement that a model be not only coherent but also consistent is a strong guiding principle that has pointed us in the direction of progress over the last centuries. If Anderson’s “kind of thinking is poised to go mainstream,” as he writes, then we might indeed be reaching the end of theory. Yet this end will have nothing to do with the scientific method becoming obsolete, but with a lack of understanding of what science is all about to begin with.

PS: I wrote this while on the train, and now that I am reconnected to the weird wild web I see that Sean Carroll earlier wrote a comment with the same flavor, as did Gordon Watts. John Horgan wrote about problem solving without understanding, and plenty of other people I don't know added their opinions. This immediate resonance indeed cheers me up. Maybe science will have a chance. It leaves me wondering, though, whether writing articles that cross the line from provocation to nonsense is becoming fashionable.


See also: Models and Theories.
