This may or may not be the title of a forthcoming, posthumously published book by David Foster Wallace, but I’m interested in it in the context of the following passage:
What our intelligence system really needs is ways to avoid becoming trapped by the natural tendency to leap to conclusions and stick with them. This is true in other fields as well, which is why so much of professional and scientific training is designed to reduce the errors made by fallible people using weak information. (“Think Different, CIA”)
My graduate training is decidedly non-professional and perhaps only marginally scientific, but I don’t view error reduction as a signal feature of professional or scientific training. Anyway, don’t lots of people in the CIA have advanced (“professional”) degrees? (See the inevitable Ed Vul for more on this in the context of interpreting neuroimaging results. The short, predictable answer: the pros disagree on what’s allowable.)
It’s also interesting that this article relies pretty heavily on behavioral science findings without citing Dan Ariely, Dan Gilbert, Danny Kahneman, or any of the other Dans, Dannys, Daniels, and people with other sorts of names who do decision-making research. (Actually, the bit about recognizing the significance of absences is reminiscent of Philip Johnson-Laird’s work on mental models — the general idea being that people inflate the salience of positive events in their minds, which affects judgments like the probability of an event or the set of circumstances that satisfies a set of conditions.) I’m generally not fussed about intellectual attribution in the popular press, but I’m vaguely worried that these findings have so thoroughly permeated the noosphere that they’re becoming conventional wisdom rather than the positions of specific researchers. That might be a little much to infer from one article, though.