When I was writing my dissertation in 1994, I had to avoid any 'behavioralist' explanation because these were considered ad hoc. This really stymied me, because my main finding, that low-volatility stocks had higher-than-average returns, didn't fit into any rational framework. Now the pendulum has swung to the other side, and a behavioralist explanation needs hardly more support than an anecdote or introspection to motivate an empirical finding. We are obviously irrational, emotional, and use predictably irrational heuristics to make sense of the world. The question remains: are these explanations helpful, or simply ex post rationalizations? As Eugene Fama notes, half the behavioralist findings suggest overextrapolation (representativeness bias), half underextrapolation (anchoring bias), the average being 'rational'.
This subject was highlighted in a Bloggingheads TV post between Eliezer Yudkowsky and Razib Khan. They ponder whether people are more problematically stupid or crazy. I'd go with crazy. Stupidity is a problem, but the really harmful ideas come from smart people who believe in bad ideas, and the most imaginative people are the most credulous, because for them everything is possible if we just implement their master plan.
A lot of psychotherapy, if not philosophy, is based on letting go of emotions (the Serenity Prayer, Zen Buddhism), because anxiety makes us feel bad. As Hamlet said, 'There is nothing either good or bad, but thinking makes it so.' It's a nice thought, that we have nothing to fear but fear itself. Yet fear, and anxiety, do have a function. I find it interesting that children learn at a much faster rate than adults, and one of the most distinguishing characteristics of children is that they are much more emotional than adults, with more frequent and stronger mood swings; little kids cry a lot, adults almost never (curiously, I am currently only tempted to tear up during movies). I have met many very smart people, and they are often quite emotional. They are prone to paranoia, conspiracies, and feeling either too proud or too ashamed of themselves at times. I don't think this correlation is an accident.
I think if you look closely at any thoughtful person, you will find attributes in their personality that border on some DSM disorder. The only people I know really well who seem totally normal are rather shallow; they have no great emotions that drive them to really interesting opinions. As Abraham Lincoln noted, 'It has been my experience that folks who have no vices have very few virtues.'
Neuroscientist Antonio Damasio studied people with brain injuries that had one specific effect: damage to the part of the brain where emotions are generated (see David Brooks talking about it here). In all other respects they seemed normal; they had simply lost the ability to feel emotions. The interesting thing Damasio found was that their ability to make decisions was seriously impaired. Asked what restaurant they wanted to go to, they would dither, considering an endless number of factors but never choosing. They lack the emotion that allows them to mark things as good, bad, or indifferent, so without any emotional state they just see more data, often with conflicting implications. In particular, many decisions have pros and cons on both sides, and without emotions the net result is ambiguous.
Shall I have the fish or the chicken? With no emotions, people are unable to make the decision.
I think this is related to the Ellsberg paradox: people prefer to bet they will draw a 'blue' ball from an urn known to contain 50 blue and 50 red balls, rather than from an urn that has a 50% chance of containing 100 blue balls and a 50% chance of containing 100 red balls. In the latter case the information is incomplete; one can imagine that others know whether the urn holds blue or red balls and are thus arbitraging you. In the former, with a known 50% blue frequency, you are safe. Similarly, if your information set is incomplete, because you have not exhausted the state space of potentially informative data, you are potentially making a blatantly suboptimal choice. As shown in the paper 'Fact-Free Learning', looking for the set of inputs that maximizes an R² is computationally hard. The problem is so pervasive that it has been canonized in literature: Sherlock Holmes regularly explains how the combination of a variety of clues leads inexorably to a particular conclusion, to which Watson exclaims, "Of course!" Once you know the answer it's obvious, and it's scary to think an obvious explanation is out there but you were merely too lazy to find it.
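What makes the Ellsberg paradox a paradox is that the two bets are probabilistically identical, yet people reliably prefer the known urn. A minimal sketch of the arithmetic (the urn sizes follow the classic setup described above; this is an illustration, not a claim about the original experiment's protocol):

```python
from fractions import Fraction

# Urn A: known composition, 50 blue and 50 red balls.
p_win_known = Fraction(50, 100)

# Urn B: ambiguous composition. With probability 1/2 it holds
# 100 blue balls (you win for sure); with probability 1/2 it
# holds 100 red balls (you lose for sure).
p_win_ambiguous = Fraction(1, 2) * Fraction(100, 100) + Fraction(1, 2) * Fraction(0, 100)

print(p_win_known)      # 1/2
print(p_win_ambiguous)  # 1/2
```

By the law of total probability both bets win exactly half the time, so a purely expected-value reasoner should be indifferent; the aversion to Urn B is driven by the ambiguity itself, not the odds.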
So, without a little built-in anxiety, a little homunculus who hates dithering, we would 'rationally' sift through all the data forever, because the state space we are searching is not well defined, and any corner of it could provide an 'aha!' moment. But our whiny anxiety impulse pushes us to decide, which is a good thing, because we have finite lives and many things to do.
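The combinatorics behind 'sifting forever' can be made concrete. This is a toy illustration of why exhaustive best-subset search blows up, not the formal result in the 'Fact-Free Learning' paper: with p candidate predictors, maximizing R² over all possible subsets means examining 2^p - 1 non-empty models in the worst case.

```python
from itertools import combinations

def num_models(p):
    """Count every non-empty subset of p candidate predictors,
    i.e. every distinct regression an exhaustive search must fit."""
    return sum(1 for k in range(1, p + 1) for _ in combinations(range(p), k))

for p in (5, 10, 20):
    print(p, num_models(p))  # matches 2**p - 1: 31, 1023, 1048575
```

Doubling the number of candidate variables roughly squares the number of models to check, which is why an anxious stopping rule, however 'irrational', beats an open-ended search.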
I really enjoyed your last few posts.
Full text of paper 'Fact-Free Learning' referenced in the post: http://cowles.econ.yale.edu/P/cd/d14b/d1491.pdf
I definitely agree that, as we are, we need emotions to make decisions, but I think it's an accident of evolution. There is no a priori reason why a person with a damaged emotion-generating part of the brain couldn't choose between two restaurants. After spending some time gathering data, they could just decide that, given what they've learned, they're indifferent between them, flip a coin, and be done with it.
przemek: that's what logical people think one should do, but strangely, only logical people with emotions. For some reason, this doesn't seem like a good solution to those without emotions.
On this topic, I really liked "How we decide" (http://www.amazon.com/How-We-Decide-Jonah-Lehrer/dp/0618620117).
On the other hand, I just finished reading "Finding Alpha", and I want to say: amazing. (Funny thing, being a finance book, it opened my eyes to classic American literature.)
Hi Eric, I liked your post quite a bit until the very last paragraph, because I think (with some unknown degree of emotional influence) you've overstated a couple of conclusions. However, I haven't read your sources, so I could be missing something. On the other hand, anxiety might just as easily lead to indecision as rational thinking could, and research aside, I think, feel, and otherwise know that's pretty certain.
ilene: well, clearly I'm speculating. But as one commenter noted, it isn't logical to keep dithering, so there must be something in our emotions that keeps that from happening. Anxiety, however, like all things, may be useful in moderation.
Maybe you're saying it's all complexly interrelated and I can go with that.