Last modified: 2012-09-12 (finished). Epistemic state: log.

Did the first half of my epistemic tag update. I wanted to add more fine-grained distinctions, but I had a hard time figuring out how to rank a metaphysical belief. The main problem with accurate belief assignments is that they depend on a good prior. But even getting a well-defined prior (i.e. one that isn’t zero on the best answer, which you typically ensure by making it non-zero everywhere) requires you to know what the hypothesis space even is.

Say you’ve only heard of consequentialism and nihilism, and nothing else. You hear the arguments and assign consequentialism a 60% probability and nihilism 40%. Now Kant comes along and blows your mind with deontology, but unfortunately you’ve already used up all the probability. Dammit, so you switch over to likelihood ratios. Consequentialism is at least as likely as deontology, you think, and each is twice as likely as nihilism, so it’s 2:2:1. Fine, if something better comes along, you can easily add it. But now how can you express how certain you are of consequentialism? You’d have to say “I give consequentialism 2:3 odds, compared to everything else I’ve heard”, but for that to be meaningful, you’d also have to tell me what else you’ve heard. Ain’t nobody got time for that!
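The odds bookkeeping above is easy to mechanize. A minimal sketch in Python (not from the original post; the function name is mine) shows why unnormalized odds are convenient: when a new hypothesis arrives, you just append its ratio, and only the normalization changes.

```python
def odds_to_probs(odds):
    """Normalize a list of odds into probabilities summing to 1."""
    total = sum(odds)
    return [o / total for o in odds]

# consequentialism : deontology : nihilism = 2 : 2 : 1
odds = [2, 2, 1]
print(odds_to_probs(odds))  # [0.4, 0.4, 0.2] -- the "2:3 odds" vs. everything else

# Kant shows up with a new hypothesis: just append it, no repainting the old ratios.
odds.append(1)
print(odds_to_probs(odds))
```

The catch, as noted, is that the normalized numbers are only meaningful relative to the hypotheses you happen to have listed.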

Compression sucks. But wait a minute - I’m a Yangming-ite, I believe in the unity of knowledge and action. Correct beliefs must result in virtue[1] and vice versa. So why not just express that?

So now for certain things I believe in, to some degree, instead of saying how much I do believe in it, I’m saying what kind of duel I would accept over it. This separates hipster beliefs from Serious Business. If you challenge me, and you win, I concede the belief and accept your position. If I win, I expect you to do the same.

For an important but still highly speculative belief, I accept fitness challenges, like who can do the most push-ups in one week. Sure, I may easily lose those, but they sound like fun and I win either way (by becoming more awesome).

For something more serious, I’d fight you in Quake 3 like a man. I require up to 3 months of preparation for those, though.

For the true hardcore stuff, I’d accept a fight to the death. Up to 1 year of preparation, we decide on a time, place and weapon, and at the end of the day, at most one of us is still standing. That person is deemed right.

Yes, I’m completely serious. No, there is no hardcore belief yet. (In public anyway.) More specific rules are outlined on the Epistemic State page. I’ve yet to re-tag old pages. (That’s the second half.)

(HT to Will for the idea.)

May I comment on my work? AAAAH. That pretty much sums up this week.

  [1] Yangming only made the argument for the case of moral beliefs and actions, but I think it should be extended to the general case; then again, I also think morality should be extended to swallow everything. (Eventually.) I suspect (but I’m not sure) that Yangming would agree with that.
