10 August 2011

HOW WE CHANGE OUR MINDS



There's a wonderful article in the NYTimes Book Review, called The Mathematics of Changing Your Mind. I was hooked by the first paragraph ~ "When the facts change, I change my mind. What do you do, sir?" ~ John Maynard Keynes.

The article reviews a book by Sharon Bertsch McGrayne, and introduces us to the theorem of Thomas Bayes, a theorem which addresses the elemental question: "How do we modify our beliefs in the light of additional information? Do we cling to old assumptions long after they've become untenable, or abandon them too readily at the first whisper of doubt? Bayesian reasoning promises to bring our views gradually in line with reality and so has become a valuable tool for scientists of all sorts and, indeed, for anyone who wants .... to sync up with the universe. If you are not thinking like a Bayesian, perhaps you should be.

"At its core, Bayes' theorem depends upon an ingenious turnabout ~ if you want to assess the strength of your hypothesis given the evidence, you must also assess the strength of the evidence given your hypothesis. In the face of uncertainty, a Bayesian asks three questions ~ How confident am I in the truth of my initial belief? On the assumption that my original belief is true, how confident am I that the new evidence is accurate? And whether or not my original belief is true, how confident am I that the new evidence is accurate?"

Paraphrasing the thought process, the revised probability of a hypothesis is equal to the product of (a) the original probability of the hypothesis and (b) the conditional probability of the evidence given the hypothesis, all divided by (c) the probability of the new evidence.
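The paraphrase above can be written out directly. Here is a minimal sketch in Python ~ the medical-test numbers are invented purely for illustration, not drawn from the article:

```python
def bayes_update(prior, likelihood, evidence):
    """Revised probability of a hypothesis H given evidence E:
    P(H|E) = P(H) * P(E|H) / P(E)."""
    return prior * likelihood / evidence

# Illustrative (made-up) numbers: a screening test for a rare condition.
prior = 0.01                           # P(H): 1% of people have the condition
likelihood = 0.95                      # P(E|H): test is positive 95% of the time if present
# P(E): total probability of a positive test, assuming a 5% false-positive rate
evidence = 0.95 * 0.01 + 0.05 * 0.99   # = 0.059
posterior = bayes_update(prior, likelihood, evidence)
print(round(posterior, 3))  # → 0.161
```

Even a quite accurate test leaves the revised probability at only about 16 percent, because the original probability of the hypothesis was so low ~ exactly the kind of gradual, evidence-weighted revision the theorem prescribes.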

Sounds a little obscure, right? Not when you think it through. Consider an initial hypothesis ~ "if I flip a coin, the probability that it will land showing heads is 50 percent." But what if I happen to toss the coin five times in a row, and get five heads? Should I revise the probability of heads to 100%? No. A strong prior belief that the coin is fair means five heads nudges, but does not overturn, the original estimate; we require a much larger sample size (number of tosses) in order to arrive at a meaningful estimate of the occurrence of heads and tails. If we toss the coin several hundred times, we are likely to discover that the incidence of heads is quite close to 50 percent.
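The coin example can be made concrete with the standard Beta-Binomial update (my choice of machinery, not the article's ~ the specific prior strengths are assumptions for illustration):

```python
from fractions import Fraction

def beta_update(alpha, beta, heads, tails):
    """Conjugate Bayesian update for a coin's heads-probability.
    Prior Beta(alpha, beta); after the observed tosses the posterior is
    Beta(alpha + heads, beta + tails), whose mean is the revised estimate."""
    a, b = alpha + heads, beta + tails
    return Fraction(a, a + b)  # posterior mean estimate of P(heads)

# Weak prior (Beta(1,1), i.e. no strong opinion): five straight heads
# moves the estimate well above 50% -- but nowhere near 100%.
print(float(beta_update(1, 1, 5, 0)))    # → ~0.857

# Strong prior that the coin is fair (Beta(50,50)): the same five
# heads barely budges the estimate.
print(float(beta_update(50, 50, 5, 0)))  # → ~0.524
```

The contrast is the whole point ~ how far new evidence moves you depends on how confident you were to begin with, and a few hundred tosses will dominate either prior.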

Now consider another initial hypothesis ~ "global warming does not exist." Our original assigned probability is more complex, and depends heavily on systematic, convincing evidence such as the presence of human-generated greenhouse gases, as well as on evidence from the geological record. If the new evidence is compelling, we are required to revise the probability of our initial hypothesis .... or not. It is worth noting that "people wedded to their [assumptions] can always try to rescue them from the evidence by introducing all sorts of dodges. Witness die-hard birthers and truthers, for instance."

The discussion and illustrations are provocative, and I encourage you to read the entire article. The concept of belief vs. disbelief reminds me of another discussion ~ The Dignity of Skepticism. "Being a responsible believer requires one to have reasons for one's beliefs. In fact, it seems that having reasons for one's beliefs is a requirement for seeing them as beliefs at all .... We may say that beliefs are supposed to be not only reason-responsive, but reason-reflective. Our beliefs should be based on our evidence and proportional to the force of our evidence. And so, when we hold beliefs, we take ourselves to be entitled to reason to and from them. So beliefs must be backed by reason. Reason backing has a curious pattern, however. Each belief must be backed by reasons. But those backing reasons must themselves be backed by still further reasons. And so on. It seems, then, that every belief must be supported by a long chain of supporting reasons.

"This is a point familiar to anyone who has spent time with children. Why? is a question that can be (and often is) asked indefinitely. The child's game of incessantly asking Why? may not be particularly serious, but it calls attention to the fact that, for every belief you hold, you ought to be able to say why you hold it."

The discussion elaborates on the regress problem by describing several variants ~ circular chains of reasons, finite chains of reasons, and infinite chains of reasons ~ and the strengths and weaknesses of each. It also notes belief systems in which reason-backing is (on mostly frail grounds) suspended ~ religious, cultural, or commonsensical beliefs. Long story short, in epistemological terms it pays to be a skeptic, forever testing hypotheses and assumptions, lest one be led down the garden path of fallacy or superstition ~ particularly as new events and new evidence present themselves.
