Friday, October 26, 2012

Signal, Noise and Prediction

I'm now going to turn to Nate Silver's new book, The Signal and the Noise: Why So Many Predictions Fail - but Some Don't. Silver is the well-known political forecaster who parlayed his blog FiveThirtyEight into a prominent spot in the New York Times.

I didn't expect much when I bought it. I thought it would be one of those "my quantitative model explains the universe" books (and investment funds) that are so tiresome and common. The world, and especially the markets, are filled with quants who think all you need is Mathematica and some back issues of Econometrica to explain everything. They are usually overconfident, expert on code rather than on decisions, and, given time, tend to blow up spectacularly like LTCM.

Nothing could be further from the truth in this case. The book massively exceeded expectations and turns out to be thoughtful, mature and reflective. It is consistent with much of my experience and thinking, but I still learned a lot of things I didn't know. It's also fluent and well-written. I'd recommend it without hesitation, and I'll look at it in some detail.

The crux of the book, from someone known for his number-crunching models, is that there is no such thing as a purely objective, data-driven model, at least in human affairs.

The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning. Like Caesar, we may construe them in self-serving ways that are detached from their objective reality. Data-driven predictions can succeed—and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves.

The more information we have, the more we tend to screen out that which does not match our preconceptions. More information most often makes us narrower rather than wiser.

Alvin Toffler, writing in the book Future Shock in 1970, predicted some of the consequences of what he called “information overload.” He thought our defense mechanism would be to simplify the world in ways that confirmed our biases, even as the world itself was growing more diverse and more complex.

Information is no longer scarce, but much of it is not very useful.

Our biological instincts are not always very well adapted to the information-rich modern world. Unless we work actively to become aware of the biases we introduce, the returns to additional information may be minimal—or diminishing.

This does not mean we should just give up, or adopt a lazy relativism. Instead, it means accepting that every model and every forecast is an approximation, to be questioned and refined rather than trusted blindly.

Some of you may be uncomfortable with a premise that I have been hinting at and will now state explicitly: we can never make perfectly objective predictions. They will always be tainted by our subjective point of view. But this book is emphatically against the nihilistic viewpoint that there is no objective truth. It asserts, rather, that a belief in the objective truth—and a commitment to pursuing it—is the first prerequisite of making better predictions. The forecaster’s next commitment is to realize that she perceives it imperfectly.

So what causes predictions to fail?

The most calamitous failures of prediction usually have a lot in common. We focus on those signals that tell a story about the world as we would like it to be, not how it really is. We ignore the risks that are hardest to measure, even when they pose the greatest threats to our well-being. We make approximations and assumptions about the world that are much cruder than we realize. We abhor uncertainty, even when it is an irreducible part of the problem we are trying to solve.

Indeed, experts have a particular tendency to ignore threats to their expertise. The rating agencies, for example, did not think through the possibility that the default risks of the mortgages underlying various CDOs and CDO tranches might be correlated rather than independent.
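
To see how much that assumption matters, here is a minimal sketch in Python (the 5% default probability and the five-loan pool are hypothetical numbers chosen for illustration, not figures from the book or from the agencies' models). It compares the chance that a "senior" tranche fails - one that only loses money if every loan in its pool defaults - under independent versus perfectly correlated defaults.

# Toy illustration (hypothetical numbers): a "senior" tranche that only
# fails if all the loans in its pool default.
p_default = 0.05   # assumed probability that any one loan defaults
n_loans = 5        # assumed pool size

# If defaults are independent, all five must fail together: p**5.
p_tranche_independent = p_default ** n_loans   # 3.125e-07, about 1 in 3.2 million

# If defaults are perfectly correlated (a nationwide bust hits every loan),
# the tranche is only as safe as a single loan.
p_tranche_correlated = p_default               # 0.05, or 1 in 20

print(f"independent defaults : {p_tranche_independent:.7f}")
print(f"correlated defaults  : {p_tranche_correlated:.2f}")
print(f"risk understated by roughly {p_tranche_correlated / p_tranche_independent:,.0f}x")

Under independence the tranche looks almost riskless; if a housing bust makes the loans default together, it is roughly as risky as a single mortgage - a factor of 160,000 in this toy example.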

The possibility of a housing bubble, and that it might burst, thus represented a threat to the ratings agencies’ gravy train. Human beings have an extraordinary capacity to ignore risks that threaten their livelihood, as though this will make them go away.

Our expectations about the future are riddled with blind spots, as anyone who has ever really thought about the policy process or had to predict events for a living - and been held accountable for it - knows.

We'll look at some other aspects of the book in more detail.

 
