
Sunday, February 24, 2008

Disagreement Checklist

If you seem to disagree with someone about a factual statement, it's important to consider all four possible explanations:

1. The other person's conscious beliefs are incorrect.

2. Your conscious beliefs are incorrect.

3. The other person is deliberately deceiving you about his conscious beliefs.

4. You have the same conscious beliefs, but your understanding of what the other person is trying to say is incorrect, and you perceive a factual disagreement where none exists.

Obviously more than one of these can apply to a given situation.

Sunday, February 10, 2008

Majoritarianism and the Efficient Market

Posit four cases where your opinion differs from the majority:

1a. You are considering which checkout counter to go to at the supermarket; there are no lines. You are about to go to counter A, where the cashier looks like he'd be faster, when you are told that 3/4 of the people who shop here use counter B instead when there are no lines.

1b. The checkout counters have long lines; counter B's line is three times as long as counter A's. The line at counter B looks to be moving only twice as fast, so you feel inclined to use counter A.

2a. You are, bizarrely, forced to make a 50-50 bet on whether Sun Microsystems will make a profit this year. You are about to bet "yes", when you are informed that 3/4 of the people offered this bet said "no".

2b. You are forced to either sell short or buy long on Sun Microsystems' stock. You are about to buy long when you are informed that three times as many people have shorted the stock as have gone long on it.

Philosophical Majoritarianism suggests you consider changing your mind in cases 1a and 2a, but not 1b and 2b. The reason is that, to the degree that Majoritarianism is valid, it doesn't matter which line you pick in 1b, or whether you buy or sell in case 2b; the majority decision, if wise, has washed out the difference, so Majoritarianism doesn't provide direct guidance on what to do.
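To make the wash-out in case 1b concrete, here is a minimal sketch (Python, with made-up numbers; none of the figures below come from the post itself):

    # Case 1b: estimated wait = (people in line) / (people served per minute).
    # All numbers here are hypothetical.
    line_a, rate_a = 4, 1.0     # counter A: 4 people, appears to serve 1/minute
    line_b, rate_b = 12, 2.0    # counter B: 3x the line, appears to move 2x as fast

    wait_a = line_a / rate_a    # 4.0 minutes, by your own estimate
    wait_b = line_b / rate_b    # 6.0 minutes, by your own estimate
    print(wait_a, wait_b)       # your "Experience" belief: counter A wins

    # But if the crowd choosing the lines is wise, the line lengths already
    # reflect the true rates -- e.g. if rate_b is really 3.0, both waits are
    # 4.0 minutes -- and the choice between A and B washes out either way.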

Sunday, November 25, 2007

Efficient Market Hypothesis

Suppose that you have a logical argument, which seems compelling to you, that publicly available information has not been reflected in an asset's price. (One example might be here; otherwise, I'm sure you can pick out a different argument that has occurred to you at some point.) If you have funds to invest, should you concentrate them in that area? I would argue, generally, no, because of Copernican (majoritarian) considerations, including various forms of the Efficient Market Hypothesis.

If, instead, you have a partially Ptolemaic viewpoint, and are logically consistent, you would probably come to the conclusion that, any time you see everyone else make what seems to you like a logical mistake, you should spend significant effort in determining how you can profit from the mistake.

For example, suppose you believe that, with probability p, you are now in a 'privileged epistemological position' that will increase your expected annual returns from 1.06 to 1.10 if you actively (rather than passively) manage your portfolio. (But with probability 1-p, there is no such thing as a privileged epistemological position. If you actively manage, but there is no such thing as a privileged position, your expected returns go down to 1.05 because of transaction costs.) If your probability p is above 0.2, you would want to actively manage rather than passively manage.
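The 0.2 threshold falls out of a one-line expected-value comparison using the post's own numbers; a minimal sketch:

    # Active management: return 1.10 with probability p (privileged position
    # exists), 1.05 with probability 1-p (no such position; transaction costs).
    # Passive management: 1.06 regardless.
    def active_return(p):
        return p * 1.10 + (1 - p) * 1.05

    # Break-even: 1.05 + 0.05*p = 1.06, i.e. p = 0.2.
    for p in (0.1, 0.2, 0.3):
        print(p, round(active_return(p), 4))
    # 0.1 -> 1.055 (worse than passive)
    # 0.2 -> 1.06  (indifferent)
    # 0.3 -> 1.065 (better than passive)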

The problem with active management, of course, is that in the existing market, for every winner there must be a loser. So there's a "meta-level" where the above must be, on average, bad advice. It's not clear to me how to consistently avoid these types of traps without recourse to a Copernican Epistemology.

Saturday, November 10, 2007

Pure Copernican epistemologies

There are multiple people in the room, including you, who (even after discussion of the objective facts) all have different honest ("Experience") beliefs. You have to make a correct decision, based on those beliefs. Consider four algorithms to make the decision.

1. Always base your decision on your own ("Experience") beliefs.

2. Always go with the beliefs of whoever you judge has the most "common sense" in the room (which, by staggering coincidence, happens to be you.)

3. Always go with the beliefs of whoever's Social Security Number is 987-65-4320 (which, by staggering coincidence, happens to be your own Social Security Number.)

4. Everyone takes some sort of Good Decision-Making Test that measures general GDM (Good Decision-Making) ability; go with the beliefs of whoever scores highest.

The first two are clearly not "Copernican Epistemologies", as they posit as axioms that you have 'privileged access' to truth. If you wish to adopt a purely Copernican Epistemology, you would reject (1) and (2). Would you have a preference between (3) and (4)? Both (3) and (4) are, on their face, Copernican. But the decision of which algorithm to use to make your current decision is, itself, a decision! If you apply a Copernican process to that decision, and so on recursively, you would (in theory) eventually come back to some small set of consistent axioms, and would reject (3).
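Here is a minimal sketch of the contrast between (3) and (4), in Python, with entirely made-up people and scores. Neither rule names you explicitly in its text, which is why both look Copernican on the surface; but (3)'s constant has no justification beyond the coincidence that it picks you:

    # Each entry: (identifier, GDM test score, belief). All values invented.
    people = [
        ("987-65-4320", 61, "X is true"),    # you
        ("123-45-6789", 84, "X is false"),
        ("555-12-3456", 77, "X is false"),
    ]

    # Algorithm 3: defer to a fixed, arbitrary identifier.
    def algorithm_3(room):
        return next(belief for ssn, _, belief in room if ssn == "987-65-4320")

    # Algorithm 4: defer to whoever scored highest on the GDM test.
    def algorithm_4(room):
        return max(room, key=lambda entry: entry[1])[2]

    print(algorithm_3(people))  # always your belief, whatever your score
    print(algorithm_4(people))  # the top scorer's belief, whoever that is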

I personally believe that a normative "Theory of Everything" epistemology would have to be purely Copernican, rather than partially Ptolemaic. To elaborate, it would have to be an epistemology where:
  1. There is a relatively small set of axioms (for example, there is no room for axioms that directly reference Social Security Numbers)
  2. None of these axioms explicitly references yourself as a privileged source of knowledge, with the exception that I would allow some privileged access to your own current consciousness, and your own current thoughts and beliefs. (You do not have privileged access to your past feelings, thoughts, and beliefs; you have to infer those from your current thoughts and beliefs, like everyone else.) To be clear, this privileged access would not be of the form "I have privileged knowledge that my belief about X is correct," but rather of the form "I have privileged knowledge that I believe that 'X is correct.' In contrast, I don't know whether Joe believes that 'X is correct'; he says he does, but for all I know, he's deliberately lying."

Saturday, November 3, 2007

Some hypothetical answers for the Wire Disagreement Dilemma

Here is a sampling of possible answers for the Wire Disagreement Dilemma:

1. Always go with your own "Experience" beliefs (cut the red wire).

2. Always go with the beliefs of whoever's Social Security Number is 987-65-4320 (which, by staggering coincidence, happens to be your own Social Security Number.)

3. Always go with the beliefs of whoever you judge has the most "common sense" in the room (which, by staggering coincidence, happens to be you.)

4. Always go with the majority belief.

5. Always go with the belief of the person who had the highest IQ test scores on his most recent test.

6. Always go with the person with the most education (as measured in years of schooling).

7. Assign a score, based on a preconceived formula that weights one or more of the previous considerations. Then go with whoever has the highest score, unless you really dislike the outcome, in which case go with your "Experience" beliefs. (See the sketch below.)
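As an illustration of answer 7, here is a minimal sketch in Python; the weights, attributes, and numbers are all invented for the example, not proposed anywhere in the post:

    # Score each person by a preconceived weighted formula, then defer to the
    # top scorer. All weights and data are hypothetical.
    people = {
        "you":   {"iq": 120, "years_school": 16, "belief": "cut the red wire"},
        "alice": {"iq": 135, "years_school": 20, "belief": "cut the blue wire"},
        "bob":   {"iq": 110, "years_school": 12, "belief": "cut the blue wire"},
    }

    WEIGHTS = {"iq": 1.0, "years_school": 2.0}  # fixed in advance

    def score(attrs):
        return sum(w * attrs[k] for k, w in WEIGHTS.items())

    best = max(people, key=lambda name: score(people[name]))
    print(best, people[best]["belief"])  # alice, "cut the blue wire"

    # The escape hatch -- "unless you really dislike the outcome" -- is what
    # reintroduces a Ptolemaic element: you override the formula with your own
    # "Experience" belief whenever they conflict badly enough.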

Ptolemaic vs. Copernican Epistemologies. One of the differences between these solutions is the degree to which they presuppose that you have privileged access to the truth. For lack of a better term, I would call systems Copernican Epistemologies if they posit that you have no privileged access to the truth, and Ptolemaic Epistemologies if they posit that you do have privileged access to the truth. This is a spectrum: "Always go with your own 'Experience' beliefs" is the exemplar of Ptolemaic belief; "I have no privileged 'Experience' beliefs" is the exemplar of Copernican belief; there are plenty of gradations in between.

Note that it is not possible for a human to actually implement a 100% pure Ptolemaic belief system, nor a 100% pure Copernican belief system. For example, your beliefs about "what I would have believed, apart from other people's opinions" will, in practice, be tainted by your knowledge of what other people believe.

Sunday, October 28, 2007

Wire Disagreement Dilemma

You are locked in a room with two other people and a time bomb. To disarm the bomb, you must choose correctly between cutting the red wire or the blue wire on the bomb; cutting the wrong wire, or failing to cut either of the wires in time, will trigger the bomb. Any one of the three of you can choose to lunge forward and cut one of the wires at any time.

Each of you puzzles over the circuit-wiring schematic. You find an airtight, 100% certain proof that the red wire is the one that needs to be cut. But simultaneously, your two allies report that they have come up with airtight, 100% certain proofs that the blue wire needs to be cut! You cannot come to a consensus, either because you do not have time, or because you simply cannot understand each other's proofs.

Your choices are:

1. Lunge forward and cut the red wire.

2. Allow your allies to cut the blue wire.

How do you make your decision? Call this the Wire Disagreement Dilemma.

Notes:

1. According to the most straightforward application of classical logic, you should lunge forward and cut the red wire.

2. Philosophical Majoritarianism doesn't tell you exactly what to do. PM seems to be a heuristic that you use alongside other, sometimes conflicting, heuristics. As I've seen it outlined, it doesn't seem to tell you much about when the heuristic should be used and when it shouldn't.

3. There's a sense in which you never have an actual proof when you make a decision, you only have a memory that you had a proof.

4. Consider two people, Alice and Bob. Alice should not automatically give her own beliefs "magical precedence" over Bob's beliefs. However, there are many circumstances where Alice should give her own beliefs precedence over Bob's; there are also circumstances where Alice should defer to Bob. (A worked example of deferring appears after these notes.)

5. This type of thinking is so rare that (to my knowledge) we don't even have a short word to describe the difference between "I believe X because I reasoned it out myself" and "I believe X because someone smarter or more experienced than me told me X, even though, on my own, I would have believed Y."

In normal conversation, you have to use cumbersome phrases and idioms: for example, "it seems to me like X" in the former case and "my understanding is that X" in the latter case.

Experience vs. Hearing: As technical terms, I'd propose that in the former case we say "I Experience X" or "my Experience is X." In the latter case we can say "I Hear that X" or "my Hearing is X."

6. One asymmetry, when Alice is evaluating reality, is that she generally knows her own beliefs but doesn't necessarily know Bob's beliefs. Bob may be unavailable; Bob may be unable to correctly articulate his beliefs; Alice may misunderstand Bob's beliefs; there may not be time to ask Bob his beliefs; or Bob may deliberately deceive Alice about his beliefs.
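A worked example for the dilemma itself, under assumptions the post never makes: suppose a 50-50 prior on red vs. blue, and suppose each person's proof is independently correct with the same probability q. Then a minimal Bayes calculation (sketched below in Python) says the posterior probability that red is correct, given your one red proof against two blue proofs, is exactly 1 - q; the more reliable you all are, the more you should defer to the majority.

    # Assumptions not made by the post: prior 50-50 on red vs. blue, and each
    # person's proof independently correct with the same probability q.
    def posterior_red(q):
        like_red = q * (1 - q) ** 2    # red true: you right, both allies wrong
        like_blue = (1 - q) * q ** 2   # blue true: you wrong, both allies right
        return like_red / (like_red + like_blue)

    for q in (0.6, 0.9, 0.99):
        print(q, round(posterior_red(q), 4))
    # 0.6  -> 0.4    mildly favor blue
    # 0.9  -> 0.1    strongly favor blue
    # 0.99 -> 0.01   the posterior on red is exactly 1 - q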