Morality in making decisions
Young, an associate professor of psychology at Boston College, recently won a prestigious national award for early-career scientists for her work studying moral decision-making.
Q. What does it mean to study moral decisions?
A. I study human moral decision-making and behavior - both the psychological processes that support moral judgment and also the neural basis of moral judgment. So, what are the brain regions that help us make moral judgments and that help us think about other people and what they’re doing?
Q. One problem you research is how people respond to the question of accidents versus intentional acts: So, did someone accidentally put poison instead of sugar into their friend’s coffee, or did they do it on purpose? And psychopaths respond differently to this kind of situation than do other people?
A. Psychopaths are highly forgiving of accidents. Ordinary people find it really difficult to forgive accidents in some cases, because you can’t deny the fact that harm was caused. If someone sent you a computer virus and it messed up your computer, even though you know they did it completely by accident, you still have a really hard time forgiving them, because of this gut emotional response. Psychopaths have a blunted emotional response to the pain and suffering of victims even of accidents. That results in them being especially lenient in these cases.
Q. What does this work tell us about ourselves?
A. It doesn’t just matter what people do but why they do it and what’s going on in their heads when they do it.
Q. And there’s a region in our brains that controls this kind of moral judgment?
A. We all have the intuition that mental states matter - that what people mean to do matters. It’s rooted in a specific brain region: the right temporoparietal junction, right above and behind the right ear. We can temporarily disrupt activity in that region. Doing so also disrupts the ability to use that brain region for moral judgments, so people end up basing their moral judgment on what agents actually do rather than what they mean to do. It’s really striking that something as complicated and high-level as moral judgment could be systematically affected by turning off particular switches in the brain.
Q. Is there any evolutionary purpose to morality? A reason our ancestors developed morality?
A. I think there are evolutionary functions to the many different kinds of moral concerns we have. Purity norms originally evolved to prevent [people] from eating rotten meats or from contaminating themselves and maybe committing incest. A lot of the purity norms might give rise to moral intuitions about things that originally evolved to keep us healthy and safe.
Q. Is it tricky to study issues of morality while working at a Catholic institution?
A. I don’t think so. I haven’t run into any challenges, nor do I expect to. I am curious to see whether or not political or religious leanings play into some of the fundamental intuitions that we’re studying.
Q. Do you see any moral distinctions between liberals and conservatives?
A. Some colleagues and I are looking at whether political conservatives and political liberals have different intuitions about who or what matters. Are there reliable, political differences between conservatives and liberals? That’s work in progress.
Q. Are these differences the basis for some of our interpersonal, political, or geopolitical conflicts?
A. A lot of moral conflict comes about when different people subscribe to different moral codes. Harm concerns and purity concerns seem to matter differently to different groups of people. Maybe by understanding the psychology behind these norms, and when and how they are in opposition, we can start to understand why people disagree so much when it comes to morality.
Interview has been edited and condensed. Karen Weintraub can be reached at firstname.lastname@example.org.