
"The Half-Life of Facts," by Samuel Arbesman

As Samuel Arbesman argues in his new book "The Half-Life of Facts: Why Everything We Know Has an Expiration Date," we dwell in uncertainty. The things around us, the things we think we know, are always changing. Sometimes it's because we learn more - many of Arbesman's examples come from our growing knowledge of dinosaurs (when I was a kid, they had no feathers or colors and barely had brains). Sometimes it's because we get better at measuring - look at what happened to Pluto (it is smaller than we were taught and is no longer considered a planet). The rapid pace of change can feel confusing, if not overwhelming. How can we possibly keep up?

In his clear and well-written book, Arbesman tells us how to approach the problem. He shows that knowledge changes in regular ways - ways systematic enough for us to understand. Understanding those changes, he argues, helps us make sense of the world and lets us prepare for the changes we can anticipate. (I think these are what, in another context, have been called the known unknowns.) From my blog's perspective, it's a well-written, useful book about the importance of thinking critically.

Arbesman uses the work of John Ioannidis to remind us that one study is never enough, and that we are prone to jumping to conclusions. Just because something has been published doesn't mean it is true, or that it will hold up over time. In fact, chances are it won't. (As in most statistical situations, that claim applies to papers in the aggregate; at any particular moment you can't tell which ones will last.) Replication is the best way to test scientific findings, but replicating someone else's work doesn't win prizes or, often, even publication. Here are a few common-sense corollaries that Arbesman provides, worth keeping in mind when you read or hear about new studies (a small numerical sketch of the underlying logic follows the list):
  • The smaller the studies conducted in a scientific field, the less likely the research findings are to be true.
  • The smaller the effect sizes in a scientific field, the less likely the research findings are to be true.
  • The greater the number and the lesser the selection of tested relationships in a scientific field, the less likely the research findings are to be true.
  • The greater the flexibility in designs, definitions, outcomes, and analytical modes in a scientific field, the less likely the research findings are to be true.
  • The greater the financial and other interests and prejudices in a scientific field, the less likely the research findings are to be true.
  • The hotter a scientific field (with more scientific teams involved) the less likely the research findings are to be true.
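
For the numerically inclined, here is a minimal sketch of the logic behind the first two corollaries. It is my own illustration, not a calculation from the book, and every number in it - the prior, the power, the significance threshold - is hypothetical.

```python
# Rough sketch of why under-powered studies yield less trustworthy findings.
# All numbers are hypothetical, chosen only to show the direction of the effect.

def prob_finding_is_true(prior, power, alpha=0.05):
    """Chance that a statistically significant result reflects a real effect.

    prior: fraction of tested hypotheses that are actually true
    power: chance a study detects a real effect (small studies -> low power)
    alpha: chance a null effect still comes out "significant"
    """
    true_positives = prior * power
    false_positives = (1 - prior) * alpha
    return true_positives / (true_positives + false_positives)

# A field where 1 in 10 tested hypotheses is real:
print(round(prob_finding_is_true(prior=0.10, power=0.80), 2))  # large study: ~0.64
print(round(prob_finding_is_true(prior=0.10, power=0.20), 2))  # small study: ~0.31
```

In this made-up example, even a "significant" result from the small study is more likely to be false than true.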

It's just as important to think critically about the smaller things in our lives. We communicate through networks, both interpersonal and electronic. Unfortunately, we also transmit errors that way. Errors, Arbesman tells us, often spread more slowly than truths, but they linger. His solution is well worth adopting: be skeptical before you push send or post or otherwise share that scary story. (He reminds us that a couple of websites, including snopes.com and xkcd.com, have already done the research, so we don't have to debunk every myth that comes our way.)

There's a lot more going on in this entertaining and interesting book. If you think, as I did, that the title is a metaphor - well, nope. Growth in human knowledge is exponential, though different fields grow at different rates. New discoveries prove old ones wrong, and what we know changes. There are a couple of consequences. First, it's getting harder to make discoveries (though physical proximity to one's collaborators makes everyone's work better). Second, over time, most scientific papers will be superseded or shown to be wrong. That turnover holds for central tenets as well as for details.
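
Because the title is literal, it helps to see how a half-life plays out in numbers. The sketch below is my own illustration; the 45-year half-life is an assumed, field-dependent figure, not a claim about any particular discipline.

```python
# Exponential decay of knowledge with an assumed half-life (illustrative only).

def fraction_still_standing(years, half_life=45.0):
    """Fraction of findings from `years` ago not yet overturned or superseded."""
    return 0.5 ** (years / half_life)

for years in (10, 45, 90):
    print(years, round(fraction_still_standing(years), 2))
# 10 -> 0.86, 45 -> 0.5, 90 -> 0.25
```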

You may have heard of Moore's Law, the observation that the number of transistors that fit on a computer chip doubles roughly every 18 to 24 months. In a chapter titled "Moore's Law of Everything," Arbesman generalizes that pattern to growth in many other areas. The interplay between science (what we know) and technology (what we can do) depends on the growth of knowledge. But it also depends on the growth of the human population: more people means more knowledge. Of course, he adds, those people need to be interested in and able to become scientists and engineers, and we need to be able to communicate what we have learned. Sometimes knowledge in one field stays hidden from experts in another, but we are beginning to understand the mechanisms that let one area's learning be systematically exploited in another. We often study what interests us, what we like, or what's easiest to discover. And cognitive biases can interfere with our ability to see what is right under our noses.
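
To make the doubling concrete, here is one more small sketch of my own; the two-year doubling time is a round illustrative figure, close to the usual statement of Moore's Law.

```python
# Doubling growth of the kind Arbesman generalizes (illustrative figures only).

def growth_factor(years, doubling_time=2.0):
    """How many times larger a steadily doubling quantity becomes after `years`."""
    return 2.0 ** (years / doubling_time)

print(round(growth_factor(10)))  # 32x after one decade
print(round(growth_factor(20)))  # 1024x after two decades
```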

This is not a book of philosophy or statistics but a very good effort at making some useful work accessible. Do you agree? Let me know what you think in the comments.
