Sunday, June 22, 2008

Reviews of books that changed the reviewer's mind

A nonfiction book reviewer will generally evaluate a book along two dimensions:

1. How many interesting, new, useful, and accurate ideas are presented in the book?

2. Is the book's thesis accurate and righteous? In other words, does the book fit in well with the reviewer's pre-reading worldview?

My goal is generally not to replace my own prejudices and worldview with the prejudices and worldview of the reviewer, so I would like to seek out books that meet criterion #1. If I had a thousand concurrent additional lifetimes right now, I think Rolf #428 would write up a list of book reviews where the reviewer claimed that the book substantively changed his mind about a topic, to the extent that the reviewer reversed a previously-endorsed opinion. This would produce a list of books that are more likely to meet criterion #1: books recommended because their content was compelling, rather than because they affirmed the reviewer's pre-existing beliefs.

One caveat: an intensification of a previously-held belief would not count. For example, a reviewer saying "I believed before that Bush was a mediocre president, but now I realize he's the worst man who ever lived!" would not count as a reversal. In addition, a shift from an unpopular belief to a popular belief would not weigh as heavily as a shift from a popular belief to an unpopular one.

Sunday, June 8, 2008

The reluctance to give probabilities (II)

The last of the three theories I gave in the previous post ties into a notion I have of what we mean by "admissible" vs. "inadmissible" evidence when evaluating the "true" probability of an event. There is nothing in the external objective world that corresponds directly to what we term "admissibility"; the question of admissibility seems to be part of our evolved social judgment kit.

I think reluctance to commit is a (possibly evolved) mechanism targeted specifically at combating other people's Hindsight Bias. It fills a predictive gap in the other theories: your "weight" on the coin toss is, in a sense, 0, since once the coin is tossed, you have no reason to doubt someone who tells you it came up heads. And yet people feel no hesitation about assigning it a .5 probability, even after the coin has already been flipped, because they know nobody will doubt their competence for doing so.
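To make "weight" concrete, here is a minimal sketch (my own framing, not anything from the original post) that models a belief as a Beta(a, b) distribution: the mean is the probability you would state out loud, and the pseudo-count a + b plays the role of weight. A fair coin and total ignorance about a wrestler-vs-boxer match both say "50%", but the same batch of new evidence barely budges the former while it swings the latter.

```python
# Hypothetical illustration: same stated probability, very different "weight".
# Belief = Beta(a, b); the mean a/(a+b) is the quoted probability, and a + b
# is the weight (how much implicit evidence stands behind the quote).

def beta_mean(a: float, b: float) -> float:
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

def bayes_update(a: float, b: float, wins: int, losses: int) -> tuple[float, float]:
    """Conjugate update of a Beta prior on observed win/loss outcomes."""
    return a + wins, b + losses

beliefs = {
    "fair coin (heavy weight)": (100.0, 100.0),    # much implicit evidence
    "wrestler vs. boxer (no weight)": (1.0, 1.0),  # pure-ignorance prior
}

for label, (a, b) in beliefs.items():
    a2, b2 = bayes_update(a, b, wins=8, losses=2)  # observe 8 "wins" in 10
    print(f"{label}: prior mean {beta_mean(a, b):.2f} -> "
          f"posterior mean {beta_mean(a2, b2):.2f}")

# Prints:
# fair coin (heavy weight): prior mean 0.50 -> posterior mean 0.51
# wrestler vs. boxer (no weight): prior mean 0.50 -> posterior mean 0.75
```

This is only the Bayesian sense of weight; the social question of when you are allowed to quote a number at all is, as above, a separate matter.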

Here are some different ways that predictions can be framed. Suppose first that you know nothing at all about an upcoming match between a boxer and a wrestler:

1. I have no idea who will win. (Extremely tentative)

2. I give a 50% chance the wrestler will win. (Less tentative)

Similarly, suppose the only thing you know is that wrestlers train on a wider variety of maneuvers than boxers, and you therefore give a slight edge to the wrestler:

1. It seems a little bit more likely the wrestler will win. (Extremely tentative)

2. I give a 60% chance the wrestler will win. (Less tentative)

3. I give a 60.020001% chance the wrestler will win. (Not at all tentative)

(2) is more tentative than (3), maybe because of norms of parsimony in conversation: a round "60%" is understood as an approximation, while extra decimal places signal that you mean them. It's less clear why (1) is so much more tentative than (2); perhaps there is a norm that you only give a numeric percentage at all when you're willing to stand behind your estimate.
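One way to see the parsimony norm's teeth is with arithmetic (a hedged back-of-the-envelope of my own; the post doesn't do this calculation): if a quoted probability were a frequency measured over n independent trials, its standard error would be sqrt(p(1-p)/n), so each extra digit of stated precision implicitly claims a vastly larger n.

```python
# Hypothetical back-of-the-envelope: how many trials would it take for a
# frequency estimate to be accurate to the precision quoted? The standard
# error of a sample frequency is sqrt(p * (1 - p) / n); solve for n.

def trials_needed(p: float, precision: float) -> float:
    """n at which the binomial standard error of an estimate of p shrinks to
    `precision` (expressed as a probability, not a percentage)."""
    return p * (1.0 - p) / precision ** 2

print(f"'60%' (+/- 1 point):           n ~ {trials_needed(0.60, 1e-2):,.0f}")
print(f"'60.020001%' (+/- 1e-6 point): n ~ {trials_needed(0.60020001, 1e-8):,.0f}")

# Roughly 2,400 trials versus roughly 2.4 quadrillion trials.
```

On this reading, (3) comes across as maximally committed because, taken literally, its extra digits claim an evidence base nobody could have about a single match.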

Note that newspapers rarely give probabilities of events, aside from weather forecasts that can be "justified" by pointing to a large ensemble of past data and uncontroversial analysis.