Here is an interesting article from the Guardian about research from a group at the University of Chicago. The claim is that the group is working on a method to analyze published theories, stack them against experimental data to verify them, and, more importantly, suggest new theories and hypotheses. This is certainly an interesting area of research that I want to keep an eye on. It also relates back to an earlier article I found [Earlier Article from Edge.org] about a similar principle. Here is an excerpt from the article:
“Computer programs increasingly are able to integrate published knowledge with experimental data, search for patterns and logical relations, and enable new hypotheses to emerge with little human intervention,” they write. “We predict that within a decade, even more powerful tools will enable automated, high-volume hypothesis generation to guide high-throughput experiments in biomedicine, chemistry, physics, and even the social sciences.”
- More about this here: [The Link]
- Link to the earlier article from Edge.org: [The Link]
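The excerpt is a prediction rather than a recipe, but mining published knowledge for new hypotheses has a well-known classic form: Swanson-style "ABC" literature-based discovery, where a program proposes a link between concepts A and C if both co-occur with some bridging concept B in the literature but never with each other. Below is a toy sketch of that idea in Python. To be clear, this is my own illustration, not the Chicago group's method; the corpus and concept names are hypothetical.

```python
from collections import defaultdict

# Toy "published literature": pairs of concepts that co-occur in abstracts.
# All data here is hypothetical, for illustration only.
published_links = [
    ("fish oil", "blood viscosity"),
    ("fish oil", "platelet aggregation"),
    ("blood viscosity", "Raynaud's syndrome"),
    ("platelet aggregation", "Raynaud's syndrome"),
    ("aspirin", "platelet aggregation"),
]

def abc_hypotheses(links):
    """Swanson-style ABC discovery: if A co-occurs with B, and B with C,
    but no A-C link is published, propose A-C as a candidate hypothesis."""
    neighbors = defaultdict(set)
    for a, b in links:
        neighbors[a].add(b)
        neighbors[b].add(a)  # treat co-occurrence as symmetric
    known = {frozenset(pair) for pair in links}
    # Map each candidate (A, C) pair to the set of bridging B concepts.
    candidates = defaultdict(set)
    for b, ends in neighbors.items():
        for a in ends:
            for c in ends:
                if a != c and frozenset((a, c)) not in known:
                    candidates[frozenset((a, c))].add(b)
    return candidates

for pair, bridges in abc_hypotheses(published_links).items():
    a, c = sorted(pair)
    print(f"Candidate hypothesis: {a} <-> {c} (via {', '.join(sorted(bridges))})")
```

On this toy data the sketch surfaces, among others, a fish oil to Raynaud's syndrome link, which happens to be Swanson's famous real-world example. Actual systems of the kind the excerpt describes go far beyond raw co-occurrence, layering in text mining, knowledge graphs, and statistical scoring of candidates, but the skeleton is the same: connect published claims, then flag the connections no one has tested yet.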