When you read pop-nonfic books about, well, how to do things, there seems to be a divide between people who believe that human expertise and intuition are worthwhile, and people who believe that expertise and intuition introduce more errors than they resolve.
In the latter camp, you have someone like Daniel Kahneman, who includes a startling chapter in his book Thinking, Fast and Slow on how simple, extremely mechanical diagnostic tests outperform doctors’ expert judgment at telling when patients need further care. For instance, when doctors use their intuition and knowledge to decide whether or not a patient’s heart pain is serious, they do a really bad job, and as a result they end up sending people home to die of heart attacks. But if they use a simple algorithm based on three quick, easy-to-input measurements, they perform much better.
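To make the contrast concrete, here’s a minimal sketch of what a rule like that looks like in code. The inputs, names, and thresholds below are hypothetical stand-ins, not the actual criteria from Kahneman’s chapter; the point is only that the rule is mechanical, with no room left for clinical intuition.

```python
# A minimal sketch of a "three measurements in, decision out" triage rule.
# All inputs and cutoffs here are hypothetical placeholders, not the real criteria.

def needs_coronary_care(st_segment_abnormal: bool,
                        systolic_bp: float,
                        unstable_angina: bool) -> bool:
    """Return True if the patient should be admitted for further cardiac care."""
    risk_factors = 0
    if systolic_bp < 100:      # hypothetical cutoff
        risk_factors += 1
    if unstable_angina:
        risk_factors += 1
    # Mechanical rule: admit on an abnormal ECG reading or on any risk factor.
    return st_segment_abnormal or risk_factors >= 1

print(needs_coronary_care(st_segment_abnormal=False,
                          systolic_bp=95,
                          unstable_angina=False))
# True -- low blood pressure alone trips the rule, whether or not a doctor "feels" worried.
```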
Basically, all of that expertise isn’t worth much. It merely clouds the issue by introducing all kinds of inessential and ancillary data. To get better results, hospitals actually have to institute systems that reduce doctors’ discretion in handling patient care.
You see this again and again. Study after study shows that people just aren’t that good at doing the things they’re supposed to be good at. Cops aren’t good at telling whether people are lying. HR people aren’t very good at telling, from interviews, which candidates are going to do well. And business executives are terrible at generating value for their shareholders by merging with other companies.
And so the thrust of a number of books that I’ve read (The Halo Effect, The Black Swan, and A Random Walk Down Wall Street) has been that you shouldn’t really trust in people’s skills. Instead, you should trust (insofar as you can trust in anything) in evidence-backed systems. The only knowledge that matters is quantifiable knowledge. Anything else is just somebody’s intuition, and their intuition is more likely to be in service to their self-image and ego than the truth.
It’s the same reason that I don’t trust traditional medicine. Not because traditional remedies can’t be good, but because (assuming no one has done a scientific study on the medicine’s efficacy) no one actually knows whether the remedy is good.
Which is not to lionize science, since so much science is also pretty shoddy. There’s a famous argument, after all, that most published research findings are false.
Anyway, you already know this stuff. The point I am trying to make is that this way of thinking (distrust experts and their supposed ‘skills’) is so baked into my worldview that I find it strange whenever I come across a book that thinks differently. And now I’ve read two in just two days. The first I posted about yesterday (Flash Boys). The second is Nate Silver’s book The Signal and the Noise. Most of the book is, as one would expect, about how terrible people are at predicting things.
But there are also interesting segments where he does, to a certain extent, talk about how some experts are good at predicting things. He says, for instance, that ‘some’ political commentators are good at making predictions, and that ‘some’ weathermen are good at making predictions. It’s all very fascinating. I’m not sure I buy his reasoning re: what separates the good experts from the bad ones. I wonder if maybe the good experts were just lucky…
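Just to put a rough shape on that worry, here’s a toy simulation (entirely made-up numbers, nothing from Silver’s book) of how a big enough crowd of coin-flipping pundits will reliably produce a few forecasters who look skilled.

```python
import random

# Hypothetical setup: 500 pundits each make 20 yes/no predictions by flipping a coin.
# How many end up looking like "good" forecasters (say, 75%+ correct) by luck alone?

random.seed(42)

n_pundits = 500
n_predictions = 20

def hit_rate() -> float:
    # Each prediction is a pure coin flip against a 50/50 outcome.
    return sum(random.random() < 0.5 for _ in range(n_predictions)) / n_predictions

lucky = sum(hit_rate() >= 0.75 for _ in range(n_pundits))
print(f"{lucky} of {n_pundits} coin-flippers look like 'good' forecasters")
# With enough pundits, a handful clear the bar on luck alone.
```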
Part of me wonders if Nate Silver doesn’t occupy some weird ground here. He’s not really a scientist. He’s a predictor. It’s his ‘job’ to be right more often than other people are. Thus, he does have some vested interest in the concept of expert knowledge…