Some days ago I asked how we can determine whether we really love the truth or not. Bryan Caplan’s account of preferences over beliefs and rational irrationality suggests there may be an additional impediment to answering this question correctly, besides the factors mentioned in the first post. I may care more or less about the truth about various issues, especially depending on how they relate to other things I care about. Now consider the difference between “I have a deep love for the truth” and “I don’t care much about the truth.”
For most people, the former statement is likely to appear attractive, and the latter unattractive. Let’s suppose we are trying to determine which one is actually true of ourselves. If the first one is true, then we care about the truth about ourselves, and we will make a decent effort to determine it, presumably arriving at the conclusion that the first is true (since it is true by hypothesis).
But suppose the second is true. In that case, we are unlikely to make a great effort to determine the actual truth. Instead, we are likely to adopt the more attractive opinion, namely the first, unless the costs of believing it are too high.
In principle, believing that I have a deep love for truth when in fact I do not could carry a very high cost indeed. But in practice that cost would arrive by a very circuitous route, and frequently it would not be immediate or apparent in any way. Consequently someone who does not care much about the truth is likely to believe that he does care a lot, and is only likely to change his mind when the costs of his error become apparent, just like the person who becomes uncertain when he is offered a bet. Under normal circumstances, then, most people will hold the first belief, regardless of whether the first or the second is actually true.