Today was my second day of “The Nature of Belief” conference at UCSB, where I took three bright young philosophy students. One of them has applied to several UC and CSU undergrad programs for the fall, so he got a chance to meet faculty at a couple of them, experience his first conference, and get a better idea of how much his head would explode as a philosophy major at a university. They had a great time, grinning from ear to ear alternating with WTF looks, and they were down to talk after each session with me, the speakers, the grad students, and the faculty.
Cognitive scientist Eric Mandelbaum from CUNY gave the last paper today, a counter to Bayesian imperialism. He argued for a more complex, but empirically demonstrable, account of how and whether we change our beliefs in the face of disconfirming (disproving) evidence, covering cases that Bayes can’t explain. I won’t bore or confound you with the details of the theories and experiments, but the upshot was illustrated clearly by two cases:
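To see what a Bayesian would predict in these cases, here is a minimal sketch of textbook Bayesian updating (my own illustration with made-up numbers, not anything from the talk): each round of disconfirming evidence should push confidence in a belief down, never up.

```python
# Hypothetical illustration: standard Bayesian updating, which predicts
# that confidence in a belief should FALL after disconfirming evidence.
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(belief | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# A Petermann-style scenario (invented numbers): a strong prior in an
# open polar sea, where each failed expedition counts against it.
confidence = 0.9
for expedition in ["Polaris", "Discovery", "Jeannette"]:
    # "stuck in the ice" is 9x likelier if there is no open sea
    confidence = bayes_update(confidence, 0.1, 0.9)
    print(f"After {expedition}: confidence = {confidence:.2f}")
# Prints 0.50, then 0.10, then 0.01 -- confidence collapses.
```

The puzzle Mandelbaum pressed is that real believers, like those in the two cases below, often move in the opposite direction.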
The first related the history of the mapping of the Arctic in the 1860s and ’70s. Augustus Petermann, hilariously dubbed an “Armchair Cartographer” by our speaker, was convinced the Gulf Stream extended up Baffin Bay past Greenland through the Arctic all the way to the Bering Sea. So in 1871 the Polaris went up, got stuck in the ice, the crew killed the captain, the ship drifted, the survivors got rescued, and they came back and said: nope, just a bunch of ice. Petermann, who had never been there, said that couldn’t be right. In 1875–76 HMS Discovery went up, got stuck, everyone got scurvy, and the crew came back and said: nope, no way through. Petermann, having still never been, doubled down and said they were wrong. In 1879–81 the USS Jeannette went up, got stuck for a long time, half the crew died, and the rest came back and said: nope. Then Petermann killed himself.
The second illustration was a long list of cults that predicted the end of the world. Invariably the appointed dates arrived, after much preparing and proselytizing, and nothing happened. Each cult on the list then doubled down on the prediction, claiming they had merely gotten the date wrong, and all but one became MORE popular afterward.
The paper reminded me of Cass Sunstein’s 2009 “Going to Extremes: How Like Minds Unite and Divide,” in which he draws a conclusion similar to the one our speaker and his colleagues reach. In most cases, when faced with cognitive dissonance, a person will ignore the fact that their belief has been falsified and, instead of changing their views, hold them with even greater confidence.
Mandelbaum called this the “belief disconfirmation paradigm” and suggested it might be a hard-wired tool of a possible human “psychological immune system” that keeps us from freaking out. Wishful thinking as evolutionary fitness. Well, maybe it is, but then I had the following thought, loosely related to the talk’s preamble that “everyone is terrible”: wishful thinking helps until it doesn’t.
Well, at least there were fun and interesting students to chat with on the way home. They were mind-boggled, but even more sure that they love philosophy.