If you’re casting around for something to do while you finish voting in the Atheist round of the Ideological Turing Test and await the beginning of the Christian one, you should take a look at Less Wrong’s new open call.
The Yudkowsky outfit has been brainstorming exercises for its in-development rationality curriculum. They post a cognitive Skill of the Week (SotW), and ask for ideas about how people could build up good habits. Defeating a bad heuristic has two steps: noticing you’re using it, and switching to a better framework.
A recent SotW was “Be Specific!” People tend not to be good at identifying or describing the crucial attributes of an object, belief, or hypothesis, which makes it a lot harder to carry on a conversation or to design tests to see if the belief pays rent. In the SotW post, the example of failure was a team pitching a startup who couldn’t explain exactly how their product was better than a competitor’s. Here’s one of the rationalist drills I thought of:
Monday/Tuesday Game
On Monday, your proposition is true. On Tuesday, your proposition is false. Tell me a story about each of the days so I can see how they are different. Don’t just list the differences (because you’re already not doing that well). Start with “I wake up” so you start concrete and move on in that vein, naming the parts of your day that are identical as well as those that are different.
If the Less Wrong folks like your idea and decide to field test it, you’ll get $50. If it does well in the field testing and they decide to add it to their curriculum, you’ll get an additional $500. The SotW they’re currently soliciting suggestions for is “Avoid Motivated Cognition.” (You might be more used to seeing ‘motivated cognition’ described as ‘rationalization.’)
I know a number of you also follow Less Wrong or are reading Harry Potter and the Methods of Rationality, so you guys might be particularly interested in this opportunity. But even if you’re not familiar enough with the approach over there to offer suggestions for drills, you should probably be interested in the SotW posts, because you’re also limited by these flawed heuristics.
If you offer a suggestion on Less Wrong, please cross-post it here.