Sunday, November 25, 2007

Efficient Market Hypothesis

Suppose that you have a logical argument, which seems compelling to you, that publicly available information has not been reflected in an asset's price. (One example might be here; otherwise, I'm sure you can pick out a different argument that has occurred to you at some point.) If you have funds to invest, should you concentrate them in that area? I would argue, generally, no, because of Copernican (majoritarian) considerations, including various forms of the Efficient Market Hypothesis.

If, instead, you have a partially Ptolemaic viewpoint and are logically consistent, you would probably conclude that any time you see everyone else make what seems to you like a logical mistake, you should spend significant effort determining how you can profit from that mistake.

For example, suppose you believe that, with probability p, you are now in a 'privileged epistemological position' that will increase your expected annual returns, from 1.06 to 1.10, if you actively (rather than passively) manage your portfolio. (But with probability 1-p, there is no such thing as a privileged epistemological position. If you actively manage, but there is no such thing as a privileged position, your expected returns go down to 1.05 because of transaction costs.) If your probability p is above 0.2, you would want to actively manage rather than passively manage: the break-even point is where 1.10p + 1.05(1 - p) = 1.06, which gives p = 0.2.
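This expected-value comparison can be checked in a few lines of Python. The return figures are the post's illustrative numbers; the function name is just a label for this sketch:

```python
# Active vs. passive management, using the post's illustrative numbers:
# passive returns 1.06; active returns 1.10 if you really do occupy a
# privileged epistemological position, and 1.05 (transaction costs) if not.

def expected_active_return(p, privileged=1.10, unprivileged=1.05):
    """Expected annual return from active management, given probability p
    of occupying a privileged epistemological position."""
    return p * privileged + (1 - p) * unprivileged

PASSIVE = 1.06

# Break-even: 1.10*p + 1.05*(1 - p) = 1.06  =>  0.05*p = 0.01  =>  p = 0.2
for p in (0.1, 0.2, 0.3):
    active = expected_active_return(p)
    choice = "active" if active > PASSIVE else "passive"
    print(f"p = {p:.1f}: active EV = {active:.3f} -> prefer {choice}")
```

Below p = 0.2 the transaction costs dominate and passive management wins; above it, the chance of a genuinely privileged position tips the balance the other way.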

The problem with active management, of course, is that in the existing market, for every winner there must be a loser. So there's a "meta-level" where the above must be, on average, bad advice. It's not clear to me how to consistently avoid these types of traps without recourse to a Copernican Epistemology.

Saturday, November 10, 2007

Pure Copernican epistemologies

Suppose there are multiple people in the room, including you, who (even after discussing the objective facts) all hold different honest ("Experience") beliefs. You have to make a correct decision based on those beliefs. Consider four algorithms for making the decision.

1. Always base your decision on your own ("Experience") beliefs.

2. Always go with the beliefs of whoever you judge has the most "common sense" in the room (which, by staggering coincidence, happens to be you.)

3. Always go with the beliefs of whoever's Social Security Number is 987-65-4320 (which, by staggering coincidence, happens to be your own Social Security Number.)

4. Everyone takes some sort of Good Decision-Making Test that measures general GDM (Good Decision-Making) ability; go with the beliefs of whoever scores highest.

The first two are clearly not "Copernican Epistemologies", as they posit as axioms that you have 'privileged access' to truth. If you wish to adopt a purely Copernican Epistemology, you would reject (1) and (2). Would you have a preference between (3) and (4)? Both (3) and (4) are, on their face, Copernican. But the decision of which algorithm to use to make your current decision is, itself, a decision! If you apply a Copernican process to that decision, and so on recursively, you would (in theory) eventually come back to some small set of consistent axioms, and would reject (3).

I personally believe that a normative "Theory of Everything" epistemology would have to be purely Copernican, rather than partially Ptolemaic. To elaborate, it would have to be an epistemology where:
  1. There is a relatively small set of axioms (for example, there is no room for axioms that directly reference Social Security Numbers)
  2. None of these axioms explicitly reference yourself as a privileged source of knowledge, with the exception that I would allow some privileged access to your own current consciousness, and your own current thoughts and beliefs. (You do not have privileged access to your past feelings, thoughts, and beliefs; you have to infer those from your current thoughts and beliefs, like everyone else.) To be clear, this privileged access would not be of the form "I have privileged knowledge that my belief about X is correct," but rather, of the form "I have privileged knowledge that I believe 'X is correct.' In contrast, I don't know whether Joe believes that 'X is correct'; he says he does, but for all I know, he's deliberately lying."

Saturday, November 3, 2007

Some hypothetical answers for the Wire Disagreement Dilemma

Here is a sampling of possible answers for the Wire Disagreement Dilemma:

1. Always go with your own "Experience" beliefs (cut the red wire).

2. Always go with the beliefs of whoever's Social Security Number is 987-65-4320 (which, by staggering coincidence, happens to be your own Social Security Number.)

3. Always go with the beliefs of whoever you judge has the most "common sense" in the room (which, by staggering coincidence, happens to be you.)

4. Always go with the majority belief.

5. Always go with the belief of the person who had the highest IQ test scores on his most recent test.

6. Always go with the person with the most education (as measured in years of schooling).

7. Assign a score, based on a preconceived formula that weights one or more of the previous considerations. Then go with whoever has the highest score, unless you really dislike the outcome, in which case go with your "Experience" beliefs.
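As a concrete sketch, answer 7 could be implemented roughly as follows. The weight values, the attributes scored, and the example beliefs are all hypothetical placeholders; the post doesn't specify a formula:

```python
# Hedged sketch of answer 7: score each person by a preconceived weighted
# formula, and go with the top scorer's belief -- unless you really dislike
# the outcome, in which case fall back on your own "Experience" belief.
# The weights and attributes below are hypothetical placeholders.

WEIGHTS = {"iq": 0.5, "years_schooling": 0.5}  # preconceived, fixed in advance

def weighted_score(person):
    return sum(w * person.get(attr, 0) for attr, w in WEIGHTS.items())

def decide(people, my_belief, dislike):
    best = max(people, key=weighted_score)
    # The escape hatch below is what keeps this rule partially Ptolemaic:
    return my_belief if dislike(best["belief"]) else best["belief"]

people = [
    {"name": "you", "iq": 120, "years_schooling": 16,
     "belief": "cut the red wire"},
    {"name": "joe", "iq": 135, "years_schooling": 20,
     "belief": "cut the blue wire"},
]

# With no veto, the higher-scoring person (joe) decides:
print(decide(people, my_belief="cut the red wire", dislike=lambda b: False))
```

The `dislike` escape hatch is the interesting part: it quietly reinstates your own beliefs as the final arbiter, pulling the rule back toward the Ptolemaic end of the spectrum described below.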

Ptolemaic vs. Copernican Epistemologies. One of the differences between these solutions is the degree to which they presuppose that you have privileged access to the truth. For lack of a better term, I would call systems Copernican Epistemologies if they posit that you have no privileged access to the truth, and Ptolemaic Epistemologies if they posit that you do have privileged access to the truth. This is a spectrum: "Always go with your own 'Experience' beliefs" is the exemplar of Ptolemaic belief; "I have no privileged 'Experience' beliefs" is the exemplar of Copernican belief; there are plenty of gradations in between.

Note that it is not possible for a human to actually implement a 100% pure Ptolemaic belief system, nor a 100% pure Copernican belief system. For example, your beliefs about "what I would have believed, apart from other people's opinions" will, in practice, be tainted by your knowledge of what other people believe.