An approach towards ethics: neuroscience and development
January 31, 2015
For me it has always been a struggle, reading through the philosophical and religious literature I have a long-standing interest in, to verbalize my intuitive concept of morals in any satisfactory way. Luckily, once I started reading up on modern psychology and neuroscience, I found that there are empirical models, based on clustering of the abundant concepts, that correlate well both with our cultured intuitions and with our knowledge of brain functioning. Models that are for the study of ethics what the Big Five traits are for personality theories or what the Cattell-Horn-Carroll theory is for cognitive abilities. In this post I’m going to give an account of research at what I consider the most elucidating level of explanation of human morals – that of neuroscience and psychology. The following is not meant as a comprehensive review, but as a sample of what I consider the most useful explanatory tools. The last section touches briefly upon the genetic and endocrinological components of human morals, but it is nothing more than a mention. Also, I’ve decided to omit citations that appear inside quotes, because I don’t want to include research I am personally unfamiliar with in the reference list.
A good place to start is Jonathan Haidt’s TED talk:
Neuroscience, Psychology and Development
There are two classifications that I find most productive when thinking about morals. The first is the popular Moral Foundations Theory of Jonathan Haidt and his colleagues, which classifies the intuitive axes of moral judgment; the second is the lesser-known Moral Intelligence (MI) theory, developed by Markus Christen and his colleagues at the University of Zurich, which classifies moral competences.
The Wikipedia article has a nice overview of the six foundations:
- Care/harm: caring for others, protecting them from harm.
- Fairness/cheating: justice, treating others in proportion to their actions. (He has also referred to this dimension as Proportionality.)
- Liberty/oppression: characterizes judgments in terms of whether subjects are tyrannized.
- Loyalty/betrayal: standing with your group, family, nation. (He has also referred to this dimension as In-group.)
- Authority/subversion: respect for tradition and legitimate authority. (He has also connected this foundation to a notion of Respect.)
- Sanctity/degradation: avoiding disgusting things, foods, actions. (He has also referred to this as Purity.)
The third foundation is not always on the list (for example, Haidt doesn’t mention it in the above talk), but it is seen as necessary to understand libertarianism. Haidt showed that the other five foundations form two clusters similar across many cultures. The liberal package deems only the harm and fairness axes important, while the attention of conservatives is distributed evenly across all five foundations. The Big Five trait that correlates most strongly with the liberal package is openness to experience. Haidt mentions this in the above TED talk, in which he introduces the approach and calls for seeing beyond what he calls “the moral matrix” – a call somewhat ironically misdirected, because the audience he addresses is already predominantly open-minded. Conservatives, in turn, show stronger emotional sensitivity to negative outcomes and stronger disgust sensitivity. Some further discussion of the liberal and conservative packages in their relation to brain functioning can be found in this cogsci StackExchange answer by Artem. Following this presentation, I would add that, besides libertarianism, there is another ethical system, which I’d call “reduced”, that is heavily skewed towards the authority or respect axis – the one reported as characteristic of prison inmates and groups of adolescents. I believe this particular reduced ethics system needs research both on its genesis and on its effects.
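The two “packages” above can be stated as a toy sketch. The weights below are invented purely for illustration – they are not empirical scores from Haidt’s questionnaire data – but they show the structural contrast: a profile concentrated on two foundations versus one spread evenly across all five.

```python
# Illustrative sketch of the two moral "packages" described above.
# The weights are invented for illustration only; they are NOT
# empirical scores from Moral Foundations Theory research.

FOUNDATIONS = ["care", "fairness", "loyalty", "authority", "sanctity"]

profiles = {
    # Liberal package: judgment dominated by the first two foundations.
    "liberal":      {"care": 0.40, "fairness": 0.40, "loyalty": 0.07,
                     "authority": 0.07, "sanctity": 0.06},
    # Conservative package: attention spread roughly evenly across all five.
    "conservative": {f: 0.20 for f in FOUNDATIONS},
}

def top_foundations(profile, threshold=0.15):
    """Return the foundations a profile weights above a cutoff."""
    return [f for f in FOUNDATIONS if profile[f] >= threshold]

for name, profile in profiles.items():
    print(name, "->", top_foundations(profile))
```

Running this prints only care and fairness for the liberal profile, and all five foundations for the conservative one – the asymmetry Haidt describes.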
Haidt initially advocated a decisive role for intuition in moral judgment, seeing moral concepts as post-hoc rationalizations. This view has been challenged from the developmental perspective, which concentrates on the acquisition of ethical expertise. The mental faculties relevant to this expertise are classified in MI theory [1, p. 122]:
- Moral Compass: The reference system containing one’s (either existing or newly formulated) moral standards, values or convictions which provide the basis for moral evaluation and regulation.
- Moral Commitment: The willingness and ability to prioritize and strive for moral goals.
- Moral Sensitivity: The ability to recognize and identify a moral issue.
- Moral Problem Solving: The ability to develop and determine a morally satisfactory course of action that resolves conflicting tendencies.
- Moral Resoluteness: The ability to build up moral behaviors by acting consistently and courageously upon moral standards, despite barriers.
The common distinction between System 1 and System 2 decisions is helpful in making sense of both the automatic and the controlled processing of moral tasks [1, p. 125]:
Though most dual-process models assume that both systems interact, there is a rich literature indicating that the prevalence of automatic or controlled processes is affected by situational and personal factors [cit.om.]. For instance, research has shown that expenditure of cognitive effort is more likely under conditions of high personal accountability (i.e., conditions where people need to justify one’s decisions and actions to others; [cit.om.]), or among people who enjoy to engage in effortful analytic activity (high in need for cognition; [cit.om.]). Opposingly, in conditions of low accountability, lack of motivation for extended reflection or lack of situational opportunities (such as time pressure, high mental workload) individuals are more likely to foster spontaneous processing [cit.om.].
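The claim in the quote can be restated as a toy decision rule: controlled System 2 processing wins out when accountability and need for cognition are high and situational pressure is low. The variables, scale, and threshold below are all invented for illustration; this is not a model from the cited literature.

```python
# Toy restatement of the dual-process claim quoted above: which system
# dominates a moral task depends on situational and personal factors.
# All ratings (0..1) and the 0.5 threshold are invented for illustration.

def processing_mode(accountability, need_for_cognition, time_pressure):
    """Rough 0..1 ratings in; returns which system is likely to dominate."""
    drivers_of_effort = accountability + need_for_cognition - time_pressure
    if drivers_of_effort > 0.5:
        return "System 2 (controlled)"
    return "System 1 (automatic)"

# High accountability, reflective person, plenty of time -> deliberation.
print(processing_mode(accountability=0.9, need_for_cognition=0.8, time_pressure=0.1))
# Low accountability under time pressure -> spontaneous processing.
print(processing_mode(accountability=0.2, need_for_cognition=0.3, time_pressure=0.8))
```

The point of the sketch is only that the mode of processing is a function of the situation and the person, not a fixed property of the moral task itself.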
The known circuitry involved in moral judgment includes multiple regions: the ventromedial prefrontal cortex and the orbitofrontal cortex, the dorsolateral prefrontal cortex, the temporal poles, the amygdala, the posterior cingulate cortex, the posterior superior temporal sulcus, and the temporo-parietal junction. The analysis of activity in these regions does not come close to covering all the effects known from moral psychology, some of which I’ll describe below, but to make sense of this list, we have to consider several important things:
- We can broadly divide the brain areas and structures involved into three groups: those enabling the understanding of intention via theory of mind, ToM (the pSTS and the bilateral TPJ); those assigning emotional value (the amygdala, the temporal poles, the vmPFC, the pCC); and those supporting deliberate utilitarian decision making and cost-benefit analysis (the dlPFC) [4, 6].
- The intention-deciphering ToM activity is what makes moral judgments different from, for example, aesthetic judgments, and it is therefore also involved in the recognition of moral salience. If a situation is recognized as requiring moral assessment, we first evaluate whether the action was intentional, and only then do other kinds of evaluation follow – the activation of the pSTS and the bilateral TPJ precedes that of the other cortical areas. From the developmental perspective, the integration between the relevant ToM areas and other areas progresses from early childhood into adulthood. Children are also prone to stronger empathic feelings when someone is harmed, because the development of emotional regulators such as the PFC cannot keep up with the faster development of the amygdala and insular cortex. Coming back to the Moral Foundations Theory, there is some evidence that ToM is not involved with the same intensity in all kinds of moral judgments: “In a vignette study contrasting judgments about Care (accidental vs. intentional assault) and Sanctity (accidental vs. intentional incest), Young and Saxe [cit.om.] found that intentionality was central to the Care judgments, but was much less crucial for Sanctity judgments. They followed up this finding with an fMRI study and found that the right temporo-parietal junction (TPJ)—an area implicated in theory of mind reasoning, and hence intentionality judgments—was more involved in Care judgments than in Sanctity judgments.”
- Interestingly, the dlPFC, which is activated in utilitarian decision making and which becomes the prevalent source of moral judgment in patients with vmPFC lesions, is also implicated in enabling lying.
But there is even more dynamics to our moral decision-making. Social status affects how the nature of morals is viewed, which, I think, adds new dimensions to the somewhat outdated Kohlberg theory of ethical maturation. For example, when asked to choose whether an act merely breaks a convention or is immoral, people of higher social status tend to see flushing a national flag down the toilet as ignoring a convention rather than as immoral, which probably also reflects a difference in self-construal (identifying with the state).
Also, people often carry opposing packages of values, which can be activated by priming or framing, and the resulting ethical behavior is inconsistent. In the Handbook of Neurosociology this is illustrated by the behavior of the same person at a Saturday night party and at the Sunday morning church service that follows. Without a context or a concrete situation a person can be reluctant to make a decision at all, especially in states of ego depletion or mental fatigue, which increase the subjective cost of decision making. I once stumbled upon a quote from Friedrich Nietzsche:
One must shed the bad taste of wanting to agree with many. “Good” is no longer good when one’s neighbor mouths it. And how should there be a “common good”! The term contradicts itself: whatever can be common always has little value. In the end it must be as it is and always has been: great things remain for the great, abysses for the profound, nuances and shudders for the refined, and, in brief, all that is rare for the rare.
I was sitting at home with a cup of tea, reflecting on both the egalitarian and the elitist sympathies I had, with no context to keep my attitude in place. As far as psychology goes, similar effects are seen in neuroeconomics, where the perceived value of something depends highly on the reference frame and on the manipulation of expectations. I therefore think that our morals are fundamentally circumstantial on the level of both slow and fast thinking. Personally, I don’t believe in the possibility of an imperative ethics based on some overarching universal principles.
The unfolding of our moral behavior in time is accompanied by several effects. The psychological literature shows that phenomena such as moral cleansing, moral licensing and moralization influence our personal ethics. It has been demonstrated that recalling a recent, concrete, subjectively moral action makes a person less likely to act morally shortly afterwards. This compensatory behavior is reversed for immoral actions. By contrast, a direct, non-compensatory effect is seen when a person recalls a temporally distant moral or immoral action, or abstractly identifies as generally moral or immoral.
Another thing that I think calls for more research is the relation of ethics and aesthetics in terms of development. My own assumption is that aesthetic preferences can turn into moral judgments over time or in specific situations. Conversely, moral judgments obviously get incorporated into aesthetic preferences – try reading any classical play. Contemporary art is all about norm violation and social commentary, and it is often aesthetically challenging. The aforementioned difference in the BOLD signal, although helpful in distinguishing the corresponding mental states, doesn’t provide any information about how they interact.
The Rest of Human Biology
The last level of analysis that I want to mention briefly is the gene-environment interaction, which evidently affects our ethical behavior. For the Big Five personality traits, which serve as proxies of moral behavioral patterns due to the aforementioned correlations, heritability is somewhere around 40-50%; for IQ it is even higher. Interestingly, the majority of studies show that the genome expresses itself through development – the older the sample, the higher the heritability. Psychopathic aggression is an example of unethical behavior highly influenced by genes (heritability estimates of 41-81%), such as the genes for the MAOA and COMT enzymes, which are important in the regulation of aggression, and by the corresponding endocrinology: high testosterone and low cortisol is a recipe for aggression. The role of a traumatic epigenetic trigger event is also rather high. Hormones such as oxytocin, vasopressin and prolactin underlie prosocial behavior. Oxytocin, for one, promotes in-group bonding but also a negative predisposition towards the out-group, so it is connected to parochial altruism (see De Dreu et al., but cf. Van IJzendoorn and Bakermans-Kranenburg, and see Churchland and Winkielman for some discussion). Our moral decision making depends on the current physiological state, which is determined by these and even more basic mechanisms. Sometimes the fact that a person is stressed, tired, hungry or sleep-deprived might play a crucial role in moral behavior.
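Heritability figures like the ones cited above typically come from twin studies, where a common first approximation is Falconer’s formula, h² = 2(r_MZ − r_DZ). The twin correlations in the sketch below are invented for illustration, not taken from any study cited here.

```python
# Falconer's formula: a classic first approximation of heritability
# from twin-study correlations.  The correlation values used below
# are invented for illustration, not taken from any cited study.

def falconer_heritability(r_mz, r_dz):
    """h^2 = 2 * (r_MZ - r_DZ).

    Since identical (MZ) twins share ~100% of their genes and fraternal
    (DZ) twins ~50%, doubling the gap between the two correlations
    estimates the share of trait variance due to additive genetic effects.
    """
    return 2 * (r_mz - r_dz)

# Hypothetical trait correlations of 0.70 (identical) vs 0.45 (fraternal):
h2 = falconer_heritability(0.70, 0.45)
print(f"estimated heritability: {h2:.2f}")  # 0.50, i.e. ~50%
```

This also shows why such estimates carry wide ranges like 41-81%: small shifts in the measured correlations, sample age, or sample composition move the doubled difference substantially.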
This post is a continuation of the previous one on the topic, which was concerned with primate sociality in its relation to ethics. I encourage you to read that one as well if you want the bigger picture! The three levels considered together – primatology, neuropsychology, and human biology – provide a background against which the cultural aspects can be scrutinized.
1. van Schaik, C., Fischer, J., Huppenbauer, M., & Tanner, C. (2013). Empirically informed ethics: Morality between facts and norms. Springer.
2. Avram, M., Gutyrchik, E., Bao, Y., Pöppel, E., Reiser, M., & Blautzik, J. (2013). Neurofunctional correlates of esthetic and moral judgments. Neuroscience Letters, 534, 128-132.
3. Decety, J., & Cacioppo, S. (2012). The speed of morality: A high-density electrical neuroimaging study. Journal of Neurophysiology, 108(11), 3068-3072.
4. Fumagalli, M., & Priori, A. (2012). Functional and clinical neuroanatomy of morality. Brain, 135(7), 2006-2021.
5. Caravita, S. C., Giardino, S., Lenzi, L., Salvaterra, M., & Antonietti, A. (2012). Socio-economic factors related to moral reasoning in childhood and adolescence: The missing link between brain and behavior. Frontiers in Human Neuroscience, 6.
6. FeldmanHall, O., Mobbs, D., & Dalgleish, T. (2014). Deconstructing the brain’s moral network: Dissociable functionality between the temporoparietal junction and ventro-medial prefrontal cortex. Social Cognitive and Affective Neuroscience, 9(3), 297-306.
7. Decety, J., & Howard, L. H. (2013). The role of affect in the neurodevelopment of morality. Child Development Perspectives, 7(1), 49-54.
8. Graham, J., Haidt, J., Koleva, S., Motyl, M., Iyer, R., Wojcik, S., & Ditto, P. H. (2013). Moral foundations theory: The pragmatic validity of moral pluralism. Advances in Experimental Social Psychology, 47, 55-130.
9. Franks, D. D., & Turner, J. H. (2012). Handbook of neurosociology. Springer.
10. Conway, P., & Peetz, J. (2012). When does feeling moral actually make you a better person? Conceptual abstraction moderates whether past moral deeds motivate consistency or compensatory behavior. Personality and Social Psychology Bulletin, 38(7), 907-919.
11. Joel, S., Burton, C. M., & Plaks, J. E. (2014). Conservatives anticipate and experience stronger emotional reactions to negative outcomes. Journal of Personality, 82(1), 32-43.
12. Inbar, Y., Pizarro, D., Iyer, R., & Haidt, J. (2012). Disgust sensitivity, political conservatism, and voting. Social Psychological and Personality Science, 3(5), 537-544.
13. De Dreu, C. K., Greer, L. L., Van Kleef, G. A., Shalvi, S., & Handgraaf, M. J. (2011). Oxytocin promotes human ethnocentrism. Proceedings of the National Academy of Sciences, 108(4), 1262-1266.
14. Van IJzendoorn, M. H., & Bakermans-Kranenburg, M. J. (2012). A sniff of trust: Meta-analysis of the effects of intranasal oxytocin administration on face recognition, trust to in-group, and trust to out-group. Psychoneuroendocrinology, 37(3), 438-443.
15. Churchland, P. S., & Winkielman, P. (2012). Modulating social behavior with oxytocin: How does it work? What does it mean? Hormones and Behavior, 61(3), 392-399.