Monday, April 09, 2012

Is It Possible to Not be a Hack?

I was reading Bryan Caplan defending biases, and I concur. Eliezer Yudkowsky once wrote:
If you first write at the bottom of a sheet of paper, “And therefore, the sky is green!”, it does not matter what arguments you write above it afterward; the conclusion is already written, and it is already correct or already wrong. To be clever in argument is not rationality but rationalization.

He makes a good point, especially about the real purpose of reason (to find truth, not defend beliefs), but I think he goes too far. Most of our conscious thought is rationalization to be sure, and I don't think we can avoid this, nor should we. Michael Gazzaniga's experiments highlight that we do it all the time, because the brain is constantly receiving signals from its non-articulate portions and trying to make sense of them. Your narrative self is like the press secretary of a large organization he does not fully understand or control, but whose job it is to always articulate a reason for why one feels or does something. And your mind is so hardwired for this, you don't even see that you are obviously confabulating, as when split-brain patients or patients with bizarre brain lesions make up reasons why they think their mother is an impostor, or their arm belongs to someone else. If you were to question all your inner data feeds you would have no intuition, and then be as dumb as a computer.

So, in a sense we are rationalizing all the time. I don't think this implies we should embrace rationalization, any more than we should embrace being emotional because we are inherently emotional. We should merely be mindful that it is something to be managed in our thoughts, not eliminated from them.

Clearly you can guess what someone will write on some daily event based on your knowledge of his or her prejudices. Paul Krugman and Thomas Sowell have worldviews that cause them to filter evidence a particular way. Each thinks his worldview is efficient because it is more accurate than others, and each finds the other's thoughts bafflingly inconsistent because they rest on some very primitive assumptions that are not shared. For better or worse, we all have a style, which is called a rut if it is unproductive. Mozart and H.L. Mencken were, and Douglas Hofstadter is, predictable in good ways.

It would be impossible to eliminate one's prejudices, because the attempt is itself merely a prejudice (e.g., everything is random and nothing integrates). Better to keep your deepest assumptions semi-private, because if you say a key to your worldview is X, it becomes harder to change X; no one likes to be seen as fickle about their principles, and it hurts one's credibility. That's why it is useful not to state them too strongly, repeat them too much, or try to serve them with every thought; you want to practice thinking without them, which should be easy given there are lots of useful frameworks existing simultaneously.

Think of your prejudices as something to manage. For example, cognitive therapy is the one therapy that does as well as SSRIs, and it is based on changing one's thoughts through rational evaluation: if you can see the irrationality of your depressive thoughts, you do not think them as much or as readily. Rationality can change your deep beliefs.


Mercury said...

“Finding truth” may be the real purpose of reason but expressing truth isn’t necessarily the best way to advance human self interest at any particular moment…and humans often favor near-term self-interest.

There’s a reason why the press secretary and the policy maker aren’t the same guy, and I think humans employ two different mental “modes” similarly. So, the language and habit of rationalization directed towards others (here’s why you should date me!) isn’t -and shouldn’t be- the same as that which is used for more private, introspective decision making (should I make this investment? why doesn’t this thing work?) where “finding truth” is much more critical. Sometimes these social and personal modes overlap quite a bit and sometimes they don’t – in which case it might not be a good idea to advertise that incongruence, as EF touched on. Sometimes it’s best to keep your mouth shut, but you should never lie to yourself. At the risk of sounding like a total sociopath, I think this all describes pretty normal behavior and is part of the human condition (Oh no! – did I say that out loud?!).

Generally (see: truth vs. self-interest), prejudice and the ability to “profile” seem most valuable not as a hard rule for in-depth analysis but as a tool to help tip the odds in your favor when you need to make a decision with very limited information and/or time to process it (dogs that look/behave like that tend to bite). As the time scale and/or information set become greater, prejudice becomes less effective and accurate for making good decisions and turns into a lazy man’s crutch (I would never hire a California liberal). What you consider to be truth or at least a “settled issue” (this arm is my arm) isn’t properly “prejudice” -even if it’s wrong, you have to take some things for granted- and functions differently in decision making.

I suppose Krugman could be right but I think (I’m profiling here!) it’s more likely that he’s just the kind of person who confuses self-interest with truth, lies to himself and doesn’t know when to keep his mouth shut.

Tel said...

My brain space is finite, my lifespan is finite. I already spent a lot of years very open minded exploring a lot of new things but when I hit 40 I recognised that I'm probably only going to be productive until I'm 60, so in effect 2/3 of the game is over for me. I accept that I'm prejudiced but I feel I've earned that already, and I can't afford the time and energy to waste with low-percentage / high-cost options.

Also, there's a lot out there that is complete and utter crap. Throwing away rubbish quickly and efficiently is the key to a good search algorithm.

Anonymous #5 said...

One important reason that everyone is a hack is that the relationship between "ought" and "is" is fundamentally subjective, but it's almost impossible to think that way. Everyone feels that they are being objective.

People will say that you can't derive "ought" from "is," but of course your morality should be connected to your facts. On the other hand, if you think the world should be different from the way it currently is, you have to go beyond the facts.

It may seem that one way around this is to accept the world as it is -- to reject utopianism, to be hardheaded, to recognize strong constraints on what is possible. But no one actually thinks this way *in general*. Everyone feels that some portion of existing society is illegitimate, wrongheaded, suboptimal. And when it comes to their pet peeves, people almost always think it's possible that the world could be different.

People who oppose changes that you support are slaves to their dysfunctional ideology. People who support changes that you oppose are utopian idealists. And thus everyone in the world is a hack.

Gerard said...

Your comparison of Paul Krugman with Thomas Sowell makes me think you filter information in a particular way.