Monday, October 26, 2009

Provoking Thought: Moral Judgments in Objective Situations

I saw a preview for an upcoming movie on TV the other day. The movie was The Box, the upcoming Cameron Diaz movie based on an old Twilight Zone episode, "Button, Button." Basically, the premise is this. Some stranger comes by with a little box with a red button inside. Sort of like what you'd imagine a missile launch button in the Oval Office would look like.


See? That was easy. Now give me my million bucks. Or launch the nuke. Either way.

Now, the trick is, if you hit the button, you magically get a million bucks. The apparent drawback is that someone will die as a result. My first (and current) instinct is to start mashing buttons, like I was playing Nintendo. Think the old Track and Field game, where the faster you push the buttons, the faster your guy runs. In the old Twilight Zone episode, the button killed someone you didn't know. And after they push the button, the mysterious stranger informs the newly rich, button-pushing murderers that he was going to give the button to "someone they didn't know." And that's supposed to horrify the button-pusher and the audience. Given my reaction, maybe it's not quite the moral dilemma that the writers of The Twilight Zone would have us believe it is. At least, it isn't for someone with my shaky track record on human rights. My working plan is to basically shoot the guy who came by the house with the button, keep the button, pretend the button is a bongo drum, and start printing the cash.

But what's interesting is how this device is supposed to capture our imaginations. In this country, we seem to be wired such that we demand not only justice in outcome (distributive justice), but also justice in process (procedural justice). Now, in the case of The Box, we're supposed to be somewhat offended by both the process (too whimsical, trading wealth for life) and the outcome (one's gain at the expense of a presumably innocent party). What intrigues me is how quickly our thinking about this problem moves from a mindset of problem solving into a mindset of evaluating morality.

I've mentioned some research on intuition here in the Board Room before, done by one of my friends. One of his latest works is a book chapter on intuition, where he discusses the difference between intuition in problem solving and intuition in morality. Essentially, the differences boil down to two dimensions. The first is the level of affect at play (affect is basically what we call emotion in everyday language). When we use intuition in problem solving, we generally have a low level of emotion involved. However, when we use intuition as, "an input in making moral decisions," it tends to be an emotionally charged process. The other interesting difference is in the evaluation process itself. When we are in a problem solving mode, our intuition is based on "very specific, domain-based knowledge." Essentially, we base our evaluations on our knowledge about the situation at hand. On the other hand, when we are making moral evaluations, our intuition is based on "moral prototypes," essentially exemplars of what morality looks like.

Now what's interesting is that when we try to evaluate the morality of a situation, we use information that is inferior for making objective decisions. Basically, the presence of affect generally tends to be distracting when it comes to problem optimization. And the use of a prototype (which is culturally negotiated and rather dynamic) can easily lead to faulty conclusions. Basically, we are much worse at using moral intuition than we are at using our problem solving intuition. And perhaps more interestingly, despite the use of this inferior information, we are actually more entrenched with our moral evaluations than we are with our more analytical ones.

It breaks down something like this. When we see someone commit a dishonest act, regardless of the number of honest acts this person has done, we use this prominent example and make a moral judgment about him. And future honest acts have relatively little weight in changing our opinion. Now, compare this to an evaluation of someone's intelligence. When someone does something dumb, we generally don't label them a dumb person if we have seen other evidence that suggests they are intelligent. And after a dumb act, it takes relatively few intelligent acts for us to restore our evaluation of this person. To compound things further, when we have a negative perception of a moral situation (i.e., I think this person is dishonest), it influences our objective evaluations (i.e., since I think this person is dishonest, I believe that he is less likely to be intelligent).

So why does this matter? A number of reasons. Certainly, we see the presence of suboptimal judgments when we attempt to apply morality in situations that call for brute analysis. Additionally, people may confuse the need for problem solving with the desire for a moral judgment. And just as scary, we may not know whether someone is acting based on a moral judgment or on an analytical one, so our understanding of human behavior could be off. Think about many of the hot-button topics that are in play right now. Things like gay marriage, abortion, environmentalism, educational policy, human rights, etc. And think about how people evaluate those situations. Often you see a fallback to the morality of the issue, and rather than a discussion of objective facts, the discussion hinges on exemplars or the prototypical images that pop into our heads.

Over at IJAB, there are a couple of posts touching on potentially controversial issues, including vaccination, drugs and gang violence, etc. One particular comment on the topic of vaccination comes from Robby (our very libertarian friend from previous discussions, such as this one), who notes that, "...the most vocal people against the vaccine are distinctly anti-science. They repeatedly ignore any and all legitimate research focusing mostly on some single event that happened to them personally." Robby's observation of everyday behavior is precisely what is predicted when people try to apply a moral judgment. People use inferior information, often with an exemplar/prototype, and have an emotionally charged thought process. You could follow up his statement by describing how people will use that exemplar as a starting point, and then try to build an argument on that basic foundation.

Now what's interesting is that immediately, you see the Anonymous poster take offense at Robby's description by 1) claiming that they are a scientist, 2) claiming that there is evidence to prove a point, and 3) taking an emotionally-charged stance. Again, the suspicion is that Anonymous is trying to mask their moral judgment by claiming that it was an analytical one.

As you read through the commentary, you can probably guess where I stand. Again, given my shaky track record on human rights, it's no surprise that I agree with the Anonymous poster that we should stop giving out the vaccine, though I suspect that we agree on the course of action for very different reasons. Sadly, not enough people respond to my comments. My friend JK noted, "I start to get this nice discussion on my blog, until you post, and amazingly, the conversation stops."

I'll take that as a compliment.

-Chairman

10 comments:

Greg McConnell said...

My working plan is to basically shoot the guy who came by the house with the button, keep the button, pretend the button is a bongo drum, and start printing the cash.

Classic.

Chairman said...

Yeah. What can I say? That's how we roll.

Greg McConnell said...

Basically, we are much worse at using moral intuition than we are at using our problem solving intuition. And perhaps more interestingly, despite the use of this inferior information, we are actually more entrenched with our moral evaluations than we are with our more analytical ones.

I had never really considered this before (at least not in this way). I'm finding myself generally agreeing with the above passage.

It breaks down something like this. When we see someone commit a dishonest act, regardless of the number of honest acts this person has done, we use this prominent example and make a moral judgment about him. And future honest acts have relatively little weight in changing our opinion. Now, compare this to an evaluation of someone's intelligence. When someone does something dumb, we generally don't label them a dumb person if we have seen other evidence that suggests they are intelligent. And after a dumb act, it takes relatively few intelligent acts for us to restore our evaluation of this person. To compound things further, when we have a negative perception of a moral situation (i.e., I think this person is dishonest), it influences our objective evaluations (i.e., since I think this person is dishonest, I believe that he is less likely to be intelligent).

My thought is that some people are much better at making the above distinctions than others (i.e., knowing when to overlook one dishonest act or noticing a trend; or accurately gauging a person's intelligence and what situations they will excel in). This is an area where a person can really separate him- or herself from the pack and gain an advantage by better understanding the true environment in which they operate.

So why does this matter? A number of reasons. Certainly, we see the presence of suboptimal judgments when we attempt to apply morality in situations that call for brute analysis. Additionally, people may confuse the need for problem solving with the desire for a moral judgment. And just as scary, we may not know whether someone is acting based on a moral judgment or on an analytical one, so our understanding of human behavior could be off. Think about many of the hot-button topics that are in play right now. Things like gay marriage, abortion, environmentalism, educational policy, human rights, etc. And think about how people evaluate those situations. Often you see a fallback to the morality of the issue, and rather than a discussion of objective facts, the discussion hinges on exemplars or the prototypical images that pop into our heads.

Here you point out the potential drawbacks. But I think there are actually many benefits to human nature being as you've described above regarding moral intuition. My question to you is this: If tomorrow everyone in the world started using "brute analysis" to define their moral views, what would happen?

Chairman said...

Of course there are benefits of this morality-based mode of thinking, otherwise we wouldn't be doing it. The drawback comes when we apply it to situations where it isn't beneficial. And I agree that there are individual differences in a) whether one prefers to think morally or analytically, and b) how good one is at each.

In my post, I am generally talking about situations where objective analysis is worth much more, and the consequences of missing out on moral issues are relatively less important. However, think about a situation where there are dire consequences if you misjudge the morality of a situation. Similarly, there are instances where the use of a prototype is the better path. In these instances, the use of moral intuition should result in a superior outcome. For example, if there is a murderer on the loose, the consequences of erring with our moral judgments are very, very high. And, when we are looking at deviant behavior, prototypical examples of dangerous deviant behavior are more useful than brute analysis.

More practically, if you are the man holding the box with the magical red button, and you come to my house, expecting me to either hit the button or return it to you, you may be better off considering that my morality is deviant.

Greg McConnell said...

I see your points. My thought is this: Humans, by nature, like (and even need) to belong to a group. And what I hadn't considered before is that humanity's tendency to get entrenched in its various moral evaluations is a key component to keeping a group together. (Think Cubs fans. Heheh.)

Chairman said...

Greg - that's a nice point, and speaks a lot to the notion of whether people are more prone to individualism (or independence) versus collectivism (or interdependence).

When you deal with topics like orthodoxy and fundamentalism (and the outcomes of these things, such as terrorism or genocide), you see how some of these social factors play a role in how individuals process information, affecting their attitudes and behavior.

Perhaps this is why we see emotionally charged negative responses when someone like A-Rod (who is a prototype of the enemy, if you're a non-Yankees fan) "steals signs," but not when Joe Mauer (who is a feel-good story) does the same.

Abraham Sangha said...

"...the difference between intuition in problem solving and intuition in morality" Problem-solving is moral. Morality is problem-solving. Your premise seems like a rehash of the tired "science vs. faith/emotion/existential needs" distinction.

Chairman said...

Abe - how are you, my friend? What are you up to nowadays?

I would suggest that there are a couple of issues on which we may not be on the same page. I'd agree that it's similar in flavor to the science vs. faith discussion. But the focus here is on individual thinking, and builds upward. Essentially, this differentiation that I discuss (sadly, this isn't my original framework - I'm only smart enough to think about ways to use the framework) looks at two different ways people use a similar process, intuition. This has a lot of implications, because of the differences in the types of information that people seek, as well as the actual processing of that information. The point isn't to see whether science or faith is more true than the other. It's to examine the things that result in completely different interpretations of the same information.

People are predisposed to think in a certain way. And then the interaction with the environment will further affect how their thinking goes. Wouldn't it be interesting to figure out how to get the folks who always think "science" to consider "faith" (or vice versa), simply by framing problems differently, changing the style in which information is presented, or nudging them toward subtle changes in how they process that information?

In my mind, this wasn't a science vs. faith, macro-level thought. This was a very micro-level, individual difference in thinking that I was curious about.

Robby said...

I had the feeling there was a learning point in that exchange...

Chairman said...

Robby - I suppose. Though it's more likely that people just walk away more convinced than ever that they're right and that the people around them are idiots.