If the Jehovah’s Witnesses are impatient with respect to truth, why do they nonetheless manage to advance in the knowledge of truth?
Our story about Peter, as a morality tale, is a bit more absolute than reality often is. Disordered behavior will more often than not produce disordered consequences, but the details will vary from case to case. Most cases of impatient driving do not in fact result in death, and most of the time the driver will still in fact get to his destination. There may however be other bad consequences, such as the unnecessary annoyance and inconvenience caused to other drivers, the reinforcement of the driver's bad driving habits, and so on.
In a similar way, impatience with respect to truth will tend to have bad consequences, but the details will vary from case to case. In most cases those consequences may include detrimental effects on the knowledge of truth, but they will not necessarily completely impede that knowledge, just as bad driving does not necessarily prevent one from reaching the destination.
In the case of the Witnesses, we noted that their progress seems laughably slow. It would be reasonable to attribute this slowness to the impatience in question, while the general fact of progress can be attributed to the general causes of such progress.
Impatiently adopting an excessively detailed view will slow a person’s advance in truth in a number of ways. In the first place, such a view will very likely be false, just as the detailed predictions of the Witnesses turned out to be false. And falsehood of course impedes the knowledge of truth first by excluding the truth opposite to the falsehood. Likewise, falsehood impedes the knowledge of truth in other ways, because when we learn anything, we learn it in the context of everything else that we know. Insofar as what we think we know includes some things that are untrue, these untrue aspects will tend to distort our view of the new things that we are learning.
Second, there are particular effects of jumping to untrue conclusions that are excessively detailed. Suppose I say, “There will be a nuclear war beginning on March 3rd, 2017.” If I claim to possess a high level of confidence about this, then I must claim an even higher level of confidence that there will be a major war in 2017, and a still higher confidence that there will be one or more major disasters in the next few years. This is for the reason discussed some days ago, namely that the more general claims must be more known and more certain, and as a matter of probability theory, the numerical probability assigned to the more general claim must be higher than that assigned to the more specific claim.
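The probability-theoretic point here is the conjunction rule: a specific claim is the conjunction of a general claim with additional detail, so it can never be more probable than the general claim it implies. A minimal sketch, with purely illustrative numbers (the probability values are assumptions, not the author's):

```python
# Conjunction rule: if claim A implies claim B, then P(A) <= P(B).
# Hypothetical probabilities assigned to nested claims, from most
# general to most specific; the numbers are illustrative only.
p_disaster_in_next_few_years = 0.6   # most general claim
p_major_war_in_2017 = 0.3            # more specific: implies the above
p_nuclear_war_march_3 = 0.05         # most specific: implies both above

# Each more specific event is a subset of the more general one, so the
# assigned probabilities must be non-increasing as claims get more specific.
claims = [p_disaster_in_next_few_years,
          p_major_war_in_2017,
          p_nuclear_war_march_3]
assert all(claims[i] >= claims[i + 1] for i in range(len(claims) - 1))
```

Read contrapositively, this is the essay's point: professing high confidence in "nuclear war on March 3rd" forces a confidence at least as high in "a major war in 2017", and higher still in "one or more major disasters in the next few years".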
Now suppose, as is likely, that no nuclear war begins on March 3rd, 2017. What will I conclude? It might be reasonable, in some sense, for me to conclude that I was mistaken about the more general things as well, and not only about the date of March 3rd. But I was very, very confident about the more general things, significantly more so than about the date of March 3rd. And given that my original choice of March 3rd proceeded from impatience for specific knowledge, a more likely result is that I will now say that the war will begin on September 17th, or something like that. And even after this does not happen, I will be quite likely to say, "Well, maybe I was wrong about the details. But there is still likely to be a major war before the end of the year, or anyway in the next few years." And this will be because of my greater certainty about the more general claims. And this greater certainty itself arose from my impatience for specific knowledge, not from a careful analysis of the facts.