Moral Psychology
From PsychWiki - A Collaborative Psychology Wiki
What is Moral Psychology?
Moral Psychology is the study of how we come to possess our system of moral beliefs, as well as the consequences of those beliefs. In psychology (moral psychology is also studied in philosophy), we are particularly interested in empirical research in these areas.
What types of morals exist?
Schwartz's Values
Values encompass more than just morals, but some of the earliest empirical research on morals was done by Shalom Schwartz using his Value Inventory (see http://changingminds.org/explanations/values/schwartz_inventory.htm).
Schwartz's values include: Power, Achievement, Hedonism, Stimulation, Self-direction, Universalism, Benevolence, Tradition, Conformity, and Security. These values can be classified along two dimensions: self- vs. other-focused (e.g., hedonism vs. benevolence) and change- vs. conservation-focused (e.g., stimulation vs. tradition).
An overview of Schwartz's values framework can be found at http://www.fmag.unict.it/Allegati/convegno%207-8-10-05/Schwartzpaper.pdf
Shweder (Community, Autonomy, Divinity)
Richard Shweder, a cultural anthropologist, studied morals in diverse groups and found that, despite the wide array of ways these moral areas could be operationalized, there were three main areas of ethics. These ethics are:
(quoted from http://www.ceu.hu/legal/ind_vs_state/Shweder_paper_2002.htm)
- Community - "based on moral concepts such as duty, hierarchy and interdependency, which is designed to help individuals achieve dignity by virtue of their role and position in a society."
- Autonomy - "based on moral concepts such as harm, rights and justice, which is designed to protect individuals in pursuit of the gratification of their wants."
- Divinity - "based on moral concepts such as natural order, sacred order, sanctity, sin and pollution, which is designed to maintain the integrity of the spiritual side of human nature."
Rozin, Lowery, Imada, and Haidt (1999) mapped emotions onto Shweder's three ethics by asking students in the United States and Japan to look at moral situations and determine which emotional reactions, represented by pictures of facial expressions, would be appropriate. There was general agreement that contempt was the natural reaction to violations of the community ethic, anger the natural reaction to violations of the autonomy ethic, and disgust the natural reaction to violations of the divinity ethic. This was later confirmed by looking at the actual facial reactions of American students to these moral situations.
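The ethic-to-emotion pattern described above (often called the "CAD triad") can be sketched as a simple lookup table; the function name here is illustrative, not from the original study:

```python
# CAD triad: each of Shweder's ethics pairs with a characteristic
# emotional reaction to violations (Rozin, Lowery, Imada, & Haidt, 1999).
CAD_TRIAD = {
    "community": "contempt",
    "autonomy": "anger",
    "divinity": "disgust",
}

def predicted_reaction(violated_ethic: str) -> str:
    """Return the emotion the CAD pattern predicts for a violation."""
    return CAD_TRIAD[violated_ethic.lower()]

print(predicted_reaction("Divinity"))  # disgust
```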
Haidt (Harm, Fairness, Ingroup, Hierarchy, and Purity)
Reviewing and synthesizing the work that came before, Jonathan Haidt's moral psychology lab at the University of Virginia proposed five moral foundations (Harm, Fairness, Ingroup, Hierarchy, and Purity) from which most moral principles can be derived (Haidt & Graham, 2006). These foundations represent a refinement of Shweder's ethics: a parallel can be drawn between Harm/Fairness and Autonomy, Ingroup/Hierarchy and Community, and Purity and Divinity. Harm and fairness/justice correspond to autonomy because they concern the freedom of people to pursue their own happiness and the immorality of others interfering with or obstructing these pursuits. Hierarchy/authority and ingroup loyalty roughly correspond to the ethic of community because they concern the morality that arises from the structure of social situations. Purity and divinity are related because both concern the holiness of the body and the spirit.
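The correspondence between the five foundations and Shweder's three ethics described above can be written out as a small data structure (the helper function is ours, added for illustration):

```python
# Rough correspondence between Haidt & Graham's five moral foundations
# and Shweder's three ethics, as described in the text above.
FOUNDATION_TO_ETHIC = {
    "harm": "autonomy",
    "fairness": "autonomy",
    "ingroup": "community",
    "hierarchy": "community",
    "purity": "divinity",
}

def foundations_for(ethic: str) -> list[str]:
    """List the foundations that refine a given Shweder ethic."""
    return sorted(f for f, e in FOUNDATION_TO_ETHIC.items() if e == ethic)

print(foundations_for("community"))  # ['hierarchy', 'ingroup']
```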
What is the relationship between cognition and morals? (Moral Reasoning)
Lawrence Kohlberg built upon the work of Piaget, who observed older and younger children at play and interviewed them about short moral scenarios. Piaget observed that younger children conceive of morality in terms of obedience to adults and consequences, while older children conceive of morality in terms of cooperation with their peers and the intentions of the moral actor. Piaget posited a two-stage theory of moral development from this observation, which Kohlberg later expanded into a six-stage model.
Kohlberg's 6 stages are:
- 1. Obedience and Punishment Orientation
- 2. Individualism and Exchange
- 3. Good Interpersonal Relationships
- 4. Maintaining the Social Order
- 5. Social Contract and Individual Rights
- 6. Universal Principles
To determine an individual's current stage, Kohlberg conducted interviews based on various moral dilemmas; one such dilemma appears below. Later, James Rest developed a non-interactive version of Kohlberg's interviews called the Defining Issues Test, which is still used in various professional ethics domains.
In Europe, a woman was near death from a special kind of cancer. There was one drug that the doctors thought might save her. It was a form of radium that a druggist in the same town had recently discovered. The drug was expensive to make, but the druggist was charging ten times what the drug cost him to make. He paid $200 for the radium and charged $2,000 for a small dose of the drug. The sick woman's husband, Heinz, went to everyone he knew to borrow the money, but he could only get together about $1,000, which is half of what it cost. He told the druggist that his wife was dying and asked him to sell it cheaper or let him pay later. But the druggist said: "No, I discovered the drug and I'm going to make money from it." So Heinz got desperate and broke into the man's store to steal the drug for his wife. Should the husband have done that?
- In stage 1, a person talks about whether Heinz would be punished or whether stealing is against the law. Morality is external to the person, and punishment shows that an act is wrong.
- In stage 2, the idea of fair exchange is introduced in addition to external punishment, so the person may say the druggist was unfair to Heinz by ripping him off, or that Heinz was unfair to the druggist, who worked to develop the treatment.
- In stage 3, people talk about intentions: Heinz's intention of saving his wife's life justifies his actions. They may also contrast the druggist's motives with Heinz's.
- In stage 4, people talk about society as a whole. They may consider what would happen if everyone acted as Heinz did and disapprove, while still understanding his motives.
- In stage 5, people begin to think about an ideal society and the tension between an individual's rights and maintaining the social order (stage 4). They may think that breaking the law is wrong for societal harmony, but that a man's duty to his wife trumps societal order.
- In stage 6, people talk about universal principles that should underlie a just society, regardless of its current form. However, Kohlberg eventually became doubtful that people could live up to such expectations and decided that this stage was only theoretical.
Some caveats of Kohlberg's work come from Krebs and Denton (2005), who found that most people actually exhibit a range of moral reasoning stages across differing dilemmas. As such, the use of specific scenarios may not translate to how people think about morals in the real world (a question of ecological validity). There are also many situational and individual affective differences that affect the stage at which moral reasoning occurs in a given situation. Asking respondents to talk about how they reason in the real world yielded a much larger role for affect and emotion in moral reasoning than Kohlberg's work provides for.
Some of the citations for Kohlberg's work include:
- Kohlberg, L. (1984). Essays in moral development: Vol. 2. The psychology of moral development. New York: Harper & Row.
- Kohlberg, L., & Candee, D. (1984). The relationship of moral judgment to moral action. In L. Kohlberg (Ed.), Essays in moral development: Vol. 2. The psychology of moral development (pp. 498–581). New York: Harper & Row.
- Krebs, D., & Denton, K. (2005). Toward a more pragmatic approach to morality: A critical evaluation of Kohlberg's model. Psychological Review, 112(3), 629–649.
- Piaget, J. (1965). The moral judgment of the child. London: Routledge & Kegan Paul. (Original work published 1932)
- http://faculty.plts.edu/gpence/html/kohlberg.htm
What is the relationship between emotions and morals? (Social Intuitionism)
Recent work has seen the rise of "social intuitionism" as a model for explaining the role of emotion in moral judgment. As noted in the discussion of Kohlberg's critics, Krebs and Denton (2005) found that when people talk about actual moral dilemmas in their own lives, affect and emotion play a much larger role in moral reasoning than Kohlberg's model provides for, which motivates models that put intuition first.
Joshua Greene, a professor of psychology and neuroscience at Harvard, has done several prominent studies on the interplay between cognition and emotion in moral judgment. One dilemma he has focused on is the "trolley problem," in which a runaway trolley is heading down a track and the subject must choose whether to take an action that saves five people from dying (Greene, Sommerville, Nystrom, Darley, & Cohen, 2001). The dilemma is that the action will inevitably result in the death of one person who would otherwise have lived. In one condition, the required action is to flip a switch on the tracks; in the other, it is to push a man onto the tracks. Most people believe it is acceptable to flip the switch but not to push the man, even though the outcomes in the two conditions are the same. Greene explains the differing responses using brain imaging, which shows higher activity in areas associated with emotion in "personal" conditions (e.g., considering pushing the man onto the tracks activated the posterior cingulate gyrus). This suggests that emotions play a central role in moral judgment.
In a follow-up experiment, Greene et al. (2004) considered a moral dilemma in which a cognitively derived moral strategy can be overridden by emotional considerations. Consider this dilemma: it is wartime and you are hiding from enemy soldiers with a group of 30 people. Your baby is crying, and if you do not smother your baby's cries, the enemy will come and kill all of you, including your baby. If you smother your baby's cries, your baby will die, but everyone else will live. Should you smother your baby? Some people say yes and some say no, even though from a utilitarian perspective smothering the baby is the superior outcome, given that the baby dies in either case. One finding is that this kind of dilemma causes significant activation of the anterior cingulate cortex, which is associated with resolving conflict in the brain. Using brain imaging, Greene also found that people who are able to override their emotional response exhibit greater activation in areas associated with high-level cognitive functioning (the dorsolateral prefrontal cortex and inferior parietal cortex). Clearly, there is interplay between cognition and emotion in resolving complex moral dilemmas.
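The utilitarian arithmetic behind the crying-baby dilemma can be made explicit in a short sketch. The function name is ours, and we assume the stated group of 30 includes the baby:

```python
def deaths(smother: bool, group_size: int = 30) -> int:
    """Total deaths under each choice in the crying-baby dilemma.

    The baby dies in either case: if its cries are smothered, only the
    baby dies; if not, the soldiers find and kill the whole group.
    """
    # Assumption: group_size counts everyone in hiding, baby included.
    return 1 if smother else group_size

# The utilitarian comparison: smothering minimizes total deaths,
# yet many people still judge it impermissible.
print(deaths(smother=True), deaths(smother=False))  # 1 30
```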
Jonathan Haidt (2001) attempted to isolate moral intuition from cognitive moral reasoning in the lab by presenting a dilemma in which a brother and sister have sex, but in which the usual reasons for incest being wrong have been nullified (e.g., they used two forms of contraception, ruling out genetic harm to offspring). People still attempted to come up with reasons why the brother and sister having sex was wrong. Haidt's conclusion was that it simply felt wrong, and that people have a strong desire to justify their moral intuitions using reason (i.e., Kohlberg's moral reasoning) after the fact. Haidt has since become a leading proponent of the social intuitionist approach to moral judgment, in which emotion, rather than cognition, is the central predictor of moral judgment.
Social Intuitionism vs. Moral Reasoning
Do some people lack morals (e.g., criminals)?
The short answer is no. Most who follow the social intuitionist model would say that all non-psychopathic people moralize, even racists (Gomberg, 1990; Haidt, 2001). For example, people who commit genocide, warped as it may be, often justify their actions on moral grounds (e.g., protecting one's group, retaliating for past group injustice, or keeping their group pure). People high in psychopathy may be the exception, and research is currently underway to determine whether psychopaths exhibit lower brain activations when given moral dilemmas.
Given the ubiquity of moral reasoning and the non-intuitive diversity of moral standards that can be derived from it (see the section on morality and culture on this page), it seems reasonable to conclude that racist attitudes have a moral basis and are therefore susceptible to moral reasoning. Skitka, Bauman, and Sargis (2005) found that beliefs that lead to extreme interpersonal distancing are often linked to moral convictions. Some have even drawn parallels between racism and more universally accepted values such as patriotism (Gomberg, 1990).
Does morality vary across cultures?
Yes and no. Using the moral taxonomies above, there is evidence that different cultures all draw from the same areas of morality; that is, there is a certain amount of moralizing based on community, divinity, and autonomy across cultures, though it varies in degree. This three-part taxonomy was elaborated in a book chapter by Shweder, Much, Mahapatra, and Park (1997), who performed a cluster analysis of the explanations given for moral situations in a South Indian village. It was later expanded by Jonathan Haidt into the five-part taxonomy described above, comprising the ethics of harm, fairness/justice, hierarchy/authority, ingroup loyalty, and purity (Haidt and Graham, In Press).
However, there is clear evidence that certain groups have very different moral judgments about particular issues. For example, the answer to the question "Are disgusting or disrespectful actions judged to be moral violations, even when they are harmless?" differs between well-educated Americans and average Brazilians: Americans say no, Brazilians say yes (Haidt, Koller, & Dias, 1993). Shweder found that in an Indian village, community standards of morality disapproved of a widow eating fish or of parents refusing to sleep in the same bed as their children (Shweder, Miller, & Mahapatra, 1990; Shweder, Jensen, & Goldstein, 1995; Shweder, Much, Mahapatra, & Park, 1997).
Within American culture, there is currently a rather sharp divide as to how morality is used to determine beliefs (Haidt & Graham, In Press).
What are some of the consequences of our moral reasoning?
Liberals vs. Conservatives
Do individual differences in moral intuitions predict political beliefs? Moral psychologists have begun to unpack this question. In an internet survey of over 1,500 people using the Moral Foundations Questionnaire, conservatives were found to draw on all five moral intuitions roughly equally when making moral judgments, while liberals relied mainly on the ethics of harm and fairness (Haidt & Graham, 2006).
One can see this difference in moral foundations illustrated by the language that liberals and conservatives use. George Lakoff (1996), in Moral Politics, wrote that conservatives tend to use words like "character, virtue, discipline, standards, authority, heritage, competition, earn, hard work, punishment, human nature, traditional, common sense, dependency, self-indulgent, elite, decay, rot, degenerate, deviant, lifestyle," while liberals use words like "social forces, social responsibility, free expression, human rights, equal rights, concern, care, help, health, safety, nutrition, basic human dignity, oppression, diversity, deprivation, alienation." Looking at these lists, it is clear that the conservative moral vocabulary is full of words from the foundations of purity (e.g., deviant, rot), authority (e.g., elite, traditional), and ingroup loyalty (e.g., heritage), while the liberal list includes only words from the foundations of harm (e.g., care, safety) and fairness (e.g., equal rights, oppression). This provides convergent evidence for the role that moral intuitions play in political attitudes.
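The word-by-foundation reading above can be sketched as a simple tagging exercise. The word-level tags below follow the examples given in the paragraph; the tagging scheme and helper function are illustrative, not an established coding instrument:

```python
from collections import Counter

# Example words from Lakoff's (1996) lists, tagged with the moral
# foundation each is taken to reflect in the discussion above.
WORD_FOUNDATIONS = {
    "deviant": "purity", "rot": "purity",
    "elite": "authority", "traditional": "authority",
    "heritage": "ingroup",
    "care": "harm", "safety": "harm",
    "equal rights": "fairness", "oppression": "fairness",
}

def foundation_profile(words: list[str]) -> Counter:
    """Count which moral foundations a word list draws on (untagged words are skipped)."""
    return Counter(WORD_FOUNDATIONS[w] for w in words if w in WORD_FOUNDATIONS)

print(foundation_profile(["heritage", "traditional", "deviant", "rot", "elite"]))
print(foundation_profile(["care", "safety", "equal rights", "oppression"]))
```

Run on the two example word lists, the first profile spans purity, authority, and ingroup, while the second spans only harm and fairness, mirroring the pattern described above.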
Who studies it?
Useful Links
- http://plato.stanford.edu/entries/moral-psych-emp/
- Jonathan Haidt's website - http://wsrv.clas.virginia.edu/~jdh6n/
- A wonderful series of blog entries on Moral Psychology - http://mixingmemory.blogspot.com/2005/07/moral-psychology-i-where-is-morality.html