Aspects of the Theory of Moral Cognition: Investigating Intuitive Knowledge of the Prohibition of Intentional Battery and the Principle of Double Effect

SSRN Electronic Journal 05/2002; DOI: 10.2139/ssrn.762385

ABSTRACT Where do our moral intuitions come from? Are they innate? Does the brain contain a module specialized for moral judgment? Does the human genetic program contain instructions for the acquisition of a sense of justice or moral sense? Questions like these have been asked in one form or another for centuries. In this paper we take them up again, with the aim of clarifying them and developing a specific proposal for how they can be empirically investigated. The paper presents data from six trolley problem studies of over five hundred individuals, including one group of Chinese adults and one group of American children, which suggest that both adults and children ages 8-12 rely on intuitive knowledge of moral principles, including the prohibition of intentional battery and the principle of double effect, to determine the permissibility of actions that require harming one individual in order to prevent harm to others. Significantly, the knowledge in question appears to be merely tacit: when asked to explain or justify their judgments, subjects were consistently incapable of articulating the operative principles on which their judgments appear to have been based. We explain these findings by appeal to an analogy with human linguistic competence. Just as normal persons are typically unaware of the principles guiding their linguistic intuitions, so too are they often unaware of the principles guiding their moral intuitions. These studies pave the way for future research by raising the possibility that specific poverty-of-the-stimulus arguments can be formulated in the moral domain. Differences between our approach to moral cognition and those of Piaget (1932), Kohlberg (1981), and Greene et al. (2001) are also discussed.

  •
    ABSTRACT: Two experiments examined biases in children’s (5/6- and 7/8-year-olds) and adults’ moral judgments. Participants at all ages judged that it was worse to produce harm when harm occurred (a) through action rather than inaction (omission bias), (b) when physical contact with the victim was involved (physical contact principle), and (c) when the harm was produced as a direct means to an end rather than as an unintended but foreseeable side effect of the action (intention principle). The youngest participants, however, did not incorporate benefit when making judgments about situations in which harm to one individual resulted in benefit to five individuals. Older participants showed some preference for benefit resulting from action (commission) as opposed to inaction (omission). The findings are discussed in the context of the theory that moral judgments result, in part, from the operation of an inherent, intuitive moral faculty compared with the theory that moral judgments require development of necessary cognitive abilities.
    Journal of Experimental Child Psychology 09/2012; 113(1):186–193. DOI: 10.1016/j.jecp.2012.03.006
  •
    ABSTRACT: Increasingly, psychologists and neuroscientists have become interested in moral psychology and moral judgment. Despite this, much of moral philosophy remains isolated from this empirical research. I seek to integrate these two literatures. Drawing on a wide range of research, I develop an empirically adequate account of moral judgment. I then turn to issues in philosophical moral psychology, arguing that empirical research sheds light on old debates and raises new questions for investigation. The neuropsychological mechanisms underlying moral judgment exhibit a large degree of complexity. Different processes contribute to moral judgment under different conditions, depending both on the kind of case under consideration and on individual differences. Affective processes subserved by a broad base of brain regions, including the orbitofrontal cortex, ventromedial prefrontal cortex, amygdala, and basal ganglia, are crucial for normal moral judgment. These affective processes also provide an important link to motivation. More explicit cognition, dependent upon areas of the medial temporal lobe and the dorsolateral prefrontal cortex, also plays a crucial role in some kinds of moral judgment, though it exhibits less direct connections to motivation. The descriptive account of moral judgment I defend makes sense of debates in moral psychology over two influential views: motivation internalism, according to which moral judgment necessitates motivation to act accordingly, and the Humean Theory of Motivation, according to which belief and desire are distinct and motivation requires both a desire and an appropriate means-end belief. Moral judgments that derive from affective processes exhibit a connection between motivation and moral judgment. However, not all moral judgments derive from such processes. More explicit representations are not closely connected to motivation; thus motivation can come apart from moral judgment. While explicit beliefs are distinct from desires, affective representations have both cognitive (albeit nonpropositional) content and direct connections to motivation. This challenges Humean theories of motivation. This account helps resolve these traditional disputes: anti-Humean, internalist theories offer an approximately accurate account of the affective mechanisms, while externalist, Humean theories offer an approximately accurate account of the more explicit cognitive processes. Thus several prominent philosophical theories offer a plausible account of some aspect of moral psychology, but because of the complexity of moral psychology, none offers a complete account. This account also raises new questions for investigation. Some researchers have argued that the representation of a moral rule like the Doctrine of Double Effect helps explain the pattern of judgments in response to different kinds of trolley cases. I argue that these judgments are better explained in terms of the details of the associative mechanisms underlying them, not in terms of the representation of a moral rule. These findings raise a unique concern about the evidential value of our intuitions in these cases, a concern that could not arise from armchair reflection alone. The approach taken in this dissertation illustrates how integrating the results of empirical research contributes to philosophical work in ethical theory.
  •
    ABSTRACT: Is moral judgment accomplished by intuition or conscious reasoning? An answer demands a detailed account of the moral principles in question. We investigated three principles that guide moral judgments: (a) Harm caused by action is worse than harm caused by omission, (b) harm intended as the means to a goal is worse than harm foreseen as the side effect of a goal, and (c) harm involving physical contact with the victim is worse than harm involving no physical contact. Asking whether these principles are invoked to explain moral judgments, we found that subjects generally appealed to the first and third principles in their justifications, but not to the second. This finding has significance for methods and theories of moral psychology: The moral principles used in judgment must be directly compared with those articulated in justification, and doing so shows that some moral principles are available to conscious reasoning whereas others are not.
    Psychological Science 01/2007; 17(12):1082-9. DOI: 10.1111/j.1467-9280.2006.01834.x

