ABSTRACT: A German-language scale assessing tendencies to engage in risky behaviors, as well as perceptions of risks and expected benefits from such behaviors, is derived from an English version and validated on 532 German participants. The scale contains 40 items in six distinct domains of risk taking: ethical, recreational, health, social, investing, and gambling. Following a risk-return model of risk taking, perceived-risk attitude is inferred by regressing risk-taking on perceived risk and expected benefits. Risk-taking as well as perceptions of risks and benefits were domain-specific, while perceived-risk attitudes were more similar across domains, thus supporting the use of a risk-return framework for interpreting risk-taking propensity. Gender and cultural comparisons are drawn, and we discuss possibilities for future cross-cultural applications of the scale.
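The risk-return inference described above can be sketched as an ordinary least-squares regression of risk taking on perceived risk and expected benefit. This is a minimal illustration with simulated ratings; the function name, the data, and the coefficient values are hypothetical, not those from the study.

```python
import numpy as np

def perceived_risk_attitude(risk_taking, perceived_risk, expected_benefit):
    """Fit risk_taking = a + b*perceived_risk + c*expected_benefit by OLS.
    In the risk-return framework, b (the weight on perceived risk) is the
    perceived-risk attitude; negative values indicate risk aversion."""
    X = np.column_stack([np.ones_like(perceived_risk),
                         perceived_risk, expected_benefit])
    coefs, *_ = np.linalg.lstsq(X, risk_taking, rcond=None)
    return coefs[1]  # weight on perceived risk

# Simulated ratings for one respondent across 40 items (illustrative only)
rng = np.random.default_rng(0)
perceived = rng.uniform(1, 7, 40)
benefit = rng.uniform(1, 7, 40)
taking = 4 - 0.5 * perceived + 0.3 * benefit + rng.normal(0, 0.2, 40)
print(round(perceived_risk_attitude(taking, perceived, benefit), 2))
# close to the simulated weight of -0.5
```

Because the same regression is run per domain, a respondent can be risk averse in one domain and risk seeking in another even when the attitude coefficients are similar, which is the distinction the abstract draws.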
ABSTRACT: Knowledge about cue polarity (i.e., the sign of a cue-criterion relation) seems to boost performance in a wide range of inference tasks. Knowledge about cue polarity information may enhance performance by increasing (1) the reliance on rule- relative to similarity-based strategies, and (2) explicit knowledge about the relative importance of cues. We investigated the relative contribution of these two mechanisms in a multiple-cue judgment task and a categorization task, which typically differ in the inference strategies they elicit and potentially the explicit task knowledge available to participants. In both tasks participants preferred rule-based relative to similarity-based strategies and had more knowledge about cue importance when cue polarity information was provided. Strategy selection was not related to increases in performance in the categorization task and could only partly explain increases in performance in the judgment task. In contrast, explicit knowledge about the importance of cues was related to better performance in both categorization and judgment independently of the strategy used. In sum, our results suggest that the benefits of receiving cue polarity information may span across tasks, such as multiple-cue judgment and categorization, primarily by enhancing knowledge of relative cue importance.
ABSTRACT: Three alternative mechanisms for age-related decline in memory search have been proposed, which result from either reduced processing speed (global slowing hypothesis), overpersistence on categories (cluster-switching hypothesis), or the inability to maintain focus on local cues related to a decline in working memory (cue-maintenance hypothesis). We investigated these three hypotheses by formally modeling the semantic recall patterns of 185 adults between 27 and 99 years of age in the animal fluency task (Thurstone, 1938). The results indicate that people switch between global frequency-based retrieval cues and local item-based retrieval cues to navigate their semantic memory. Contrary to the global slowing hypothesis that predicts no qualitative differences in dynamic search processes and the cluster-switching hypothesis that predicts reduced switching between retrieval cues, the results indicate that as people age, they tend to switch more often between local and global cues per item recalled, supporting the cue-maintenance hypothesis. Additional support for the cue-maintenance hypothesis is provided by a negative correlation between switching and digit span scores and between switching and total items recalled, which suggests that cognitive control may be involved in cue maintenance and the effective search of memory. Overall, the results are consistent with age-related decline in memory search being a consequence of reduced cognitive control, consistent with models suggesting that working memory is related to goal perseveration and the ability to inhibit distracting information.
ABSTRACT: Does foraging change across the life span, and in particular, with aging? We report data from two foraging tasks used to investigate age differences in search in external environments as well as internal search in memory. Overall, the evidence suggests that foraging behavior may undergo significant changes across the life span across internal and external search. In particular, we find evidence of a trend toward reduced exploration with increased age. We discuss these findings in light of theories that postulate a link between aging and reductions in novelty seeking and exploratory behavior.
ABSTRACT: Previous research reported conflicting results concerning the influence of depression on cognitive task performance. Whereas some studies reported that depression enhances performance, other studies reported negative or null effects. These discrepant findings appear to result from task variation, as well as the severity and treatment status of participant depression. To better understand these moderating factors, we study the performance of individuals who are nondepressed, depressed, or recovering from a major depressive episode in a complex sequential decision task similar to the secretary problem. We find that depressed individuals perform better than do nondepressed individuals. Formal modeling of participants' decision strategies suggested that acutely depressed participants had higher thresholds for accepting options and made better choices than either healthy participants or those recovering from depression.
Journal of Abnormal Psychology 04/2011; 120(4):962-8.
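The threshold notion from the abstract above can be illustrated with a cutoff-style rule for secretary-like tasks: reject an initial sample of options, then accept the first option that beats the best seen so far. The function, the parameters, and the simulation are hypothetical illustrations of one family of threshold strategies, not the model fitted in the study.

```python
import random

def sequential_choice(options, n_observe):
    """Cutoff rule for a secretary-like task: reject the first n_observe
    options, then accept the first later option that beats the best seen so
    far, falling back to the last option if none does. A larger n_observe
    acts as a stricter acceptance threshold."""
    best_seen = max(options[:n_observe], default=float("-inf"))
    for x in options[n_observe:]:
        if x > best_seen:
            return x
    return options[-1]

def mean_payoff(n_observe, trials=2000, length=40, seed=42):
    """Average accepted value over random sequences (illustrative simulation)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        opts = [rng.random() for _ in range(length)]
        total += sequential_choice(opts, n_observe)
    return total / trials

print(sequential_choice([0.1, 0.5, 0.2, 0.9, 0.3], 2))  # accepts 0.9
```

Comparing `mean_payoff` across values of `n_observe` shows how a higher acceptance threshold can improve the value of the chosen option, at the cost of sometimes running out of options, which is the trade-off the modeling in the paper speaks to.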
[show abstract][hide abstract] ABSTRACT: When predicting the next outcome in a sequence of events, people often appear to expect streaky patterns, such as that sport players can develop a “hot hand,” even if the sequence is actually random. This expectation, referred to as positive recency, can be adaptive in environments characterized by resources that are clustered across space or time (e.g., expecting to find multiple berries on separate bushes). But how strong is this disposition towards positive recency? If people perceive random sequences as streaky, will there be situations in which they forego a payoff because they prefer an unpredictable random environment over an exploitable but alternating pattern? To find out, 238 participants repeatedly chose to bet on the next outcome of one of two sequences of (binary) events, presented next to each other. One sequence displayed events at random while the other sequence was either more streaky (positively autocorrelated) or more alternating (negatively autocorrelated) than chance. The degree of autocorrelation varied in a between-subject design. Most people preferred to predict purely random sequences over those with moderate negative autocorrelation and thus missed the opportunity for above-chance payoff. Positive recency persisted despite extensive feedback and the opportunity to learn more rewarding behavior over time. Further, most participants' choice strategies were best described by a win-stay/lose-shift strategy, adaptive in clumpy or streaky environments. We discuss the implications regarding an evolved human tendency to expect streaky patterns, even if the sequence is actually random.
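The win-stay/lose-shift strategy named above is easy to state in code: repeat the last prediction after a hit, flip it after a miss. The sketch below, with hypothetical function names and a simplified sequence generator, shows why this rule earns above chance on streaky (positively autocorrelated) sequences and below chance on alternating ones.

```python
import random

def win_stay_lose_shift(prev_prediction, was_correct):
    """Predict the same binary outcome (0/1) after a hit, the other after a
    miss. Note that for binary outcomes this reduces to always predicting a
    repeat of the last outcome, i.e., positive recency."""
    return prev_prediction if was_correct else 1 - prev_prediction

def simulate(p_repeat, n=10000, seed=1):
    """Hit rate of win-stay/lose-shift against a sequence in which each event
    repeats the previous one with probability p_repeat (a simplified stand-in
    for the autocorrelated sequences in the experiment)."""
    rng = random.Random(seed)
    outcome, prediction, hits = 0, 0, 0
    for _ in range(n):
        outcome = outcome if rng.random() < p_repeat else 1 - outcome
        hits += prediction == outcome
        prediction = win_stay_lose_shift(prediction, prediction == outcome)
    return hits / n

print(simulate(0.7))  # streaky sequence: hit rate above 0.5
print(simulate(0.3))  # alternating sequence: hit rate below 0.5
```

The asymmetry in the two printed hit rates is the payoff logic behind the experiment: a forager wired for streaks profits in clumpy environments but leaves money on the table when the environment alternates.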
ABSTRACT: The human mind is filled with evolved decision mechanisms designed to meet adaptively important goals. In this article we lay out a framework for studying those mechanisms from the perspective of evolutionary psychology, emphasizing the importance of multiple influences of the environment on shaping the decision strategies and their operation. These strategies often take the form of simple heuristics constructed from building blocks that draw on evolved capacities, all of which fit to particular information structures in the environment. We illustrate these ideas with two examples of heuristics used in important adaptive domains: deciding when to leave a resource patch, and predicting when a sequence of events will stop or continue.
ABSTRACT: We conducted two experiments comparing younger and older adults' ability to adjust their foraging behavior as a function of task characteristics. Participants foraged for fish in a virtual landscape and had to decide when to move between ponds so as to maximize the number of fish caught. In the first experiment, participants were left to generate their own foraging strategy, whereas in the second experiment, participants were instructed to use an incremental strategy that has been shown to produce optimal performance in this task. Our results suggest that both younger and older adults are adaptive in the sense of adjusting the parameters of their foraging strategy as a function of task characteristics. Nevertheless, older adults show overall poorer performance compared with younger adults even when instructed to use an optimal strategy.
The Journals of Gerontology Series B Psychological Sciences and Social Sciences 07/2009; 64(4):474-81.
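An incremental patch-leaving strategy of the kind participants were instructed to use can be sketched as follows: each capture extends the planned residence time by a fixed increment, and the forager leaves once time in the patch exceeds the accumulated allowance. The function name and all parameter values here are hypothetical, not those used in the experiments.

```python
def incremental_rule(num_captures, time_in_patch,
                     initial_allowance=6.0, increment=3.0):
    """Incremental patch-leaving rule: start with an initial residence
    allowance (in seconds) and extend it by a fixed increment for every
    capture; leave once time in the patch exceeds the allowance.
    Parameter values are illustrative only."""
    return time_in_patch > initial_allowance + increment * num_captures

# With these illustrative parameters, 2 captures buy an allowance of 12 s
print(incremental_rule(2, 10.0))  # False: still within the allowance
print(incremental_rule(2, 13.0))  # True: past the allowance, so leave
```

Tuning `initial_allowance` and `increment` to the prey distribution is the kind of parameter adjustment the abstract credits to both younger and older adults.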
ABSTRACT: Animals depleting one patch of resources must decide when to leave and switch to a fresh patch. Foraging theory has predicted various decision mechanisms; which is best depends on environmental variation in patch quality. Previously we tested whether these mechanisms underlie human decision making when foraging for external resources; here we test whether humans behave similarly in a cognitive task seeking internally generated solutions. Subjects searched for meaningful words made from random letter sequences, and as their success rate declined, they could opt to switch to a fresh sequence. As in the external foraging context, time since the previous success and the interval preceding it had a major influence on when subjects switched. Subjects also used the commonness of sequence letters as a proximal cue to patch quality that influenced when to switch. Contrary to optimality predictions, switching decisions were independent of whether sequences differed little or widely in quality.
ABSTRACT: We examined the effects of life-history variables on risk-taking propensity, measured by subjective likelihoods of engaging in risky behaviors in five evolutionarily valid domains of risk, including between-group competition, within-group competition, environmental challenge, mating and resource allocation, and fertility and reproduction. The effects of life-history variables on risk-taking propensity were domain specific, except for the expected sex difference, where men predicted greater risk-taking than women in all domains. Males also perceived less inherent risk in actions than females across the five domains. Although the age range in the sample was limited, older respondents showed lower risk propensity in both between- and within-group competition. Parenthood reduced risk-taking propensity in within- and between-group competitions. Higher reproductive goal setting (desiring more offspring) was associated with lower risk-taking propensity. This effect was strongest in the risk domains of mating and reproduction. Having more siblings reduced risk-taking propensity (contrary to our initial prediction) in the domains of environmental challenge, reproduction, and between-group competition. Later-born children showed a higher propensity to engage in environmental and mating risks. Last, shorter subjective life expectancy was associated with increased willingness to take mating and reproductive risks. These results suggest that life-history variables regulate human risk-taking propensity in specific risk domains.
ABSTRACT: A casual look at the literature in social cognition reveals a vast collection of biases, errors, violations of rational choice, and failures to maximize utility. One is tempted to draw the conclusion that the human mind is woefully muddled. We present a three-category evolutionary taxonomy of evidence of biases: biases are (a) heuristics, (b) error management effects, or (c) experimental artifacts. We conclude that much of the research on cognitive biases can be profitably reframed and understood in evolutionary terms. An adaptationist perspective suggests that the mind is remarkably well designed for important problems of survival and reproduction, and not fundamentally irrational. Our analysis is not an apologia intended to place the rational mind on a pedestal for admiration. Rather, it promises practical outcomes including a clearer view of the architecture of systems for judgment and decision making, and exposure of clashes between adaptations designed for the ancestral past and the demands of the present. By casually browsing journals in the social sciences one can discover a collection of human biases, errors, violations of rational choice, and failures to maximize utility. Papers published in Social Cognition are illustrative. In just 2007, the journal published a special issue dedicated to the hindsight bias, which is the tendency to believe that events that have occurred are more probable when assessing them after the fact than when estimating them prospectively (Blank, Musch, & Pohl, 2007). Other examples include misapprehensions of probability like the hot hand fallacy that leads people to erroneously believe that basketball players who have shot several successful baskets are more likely to succeed on the next try (Gilovich, Vallone, & Tversky, 1985).
There are also many effects of emotion purported to cloud good judgment (e.g., Leith & Baumeister, 1996), overuses of stereotypes (Ross & Nisbett, 1991), misapprehensions of the motives of members of the opposite sex (Abbey, 1982), common violations of monetary utility in behavioral eco-
ABSTRACT: The hot hand phenomenon refers to the expectation of “streaks” in sequences of hits and misses whose probabilities are, in fact, independent (e.g., coin tosses, basketball shots). Here we propose that the hot hand phenomenon reflects an evolved psychological assumption that items in the world come in clumps, and that hot hand, not randomness, is our evolved psychological default. In two experiments, American undergraduates and Shuar hunter–horticulturalists participated in computer tasks in which they predicted hits and misses in foraging for fruits, coin tosses, and several other kinds of resources whose distributions were generated randomly. Subjects in both populations exhibited the hot hand assumption across all the resource types. The only exception was for American students predicting coin tosses where hot hand was reduced. These data suggest that hot hand is our evolved psychological default, which can be reduced (though not eliminated) by experience with genuinely independent random phenomena like coin tosses.
Evolution and Human Behavior. 01/2009; 30(3):161-169.
ABSTRACT: We agree that much of language evolution is likely to be adaptation of languages to properties of the brain. However, the attempt to rule out the existence of language-specific adaptations a priori is misguided. In particular, the claim that adaptation to "moving targets" cannot occur is false. Instead, the details of gene-culture coevolution in language are an empirical matter.
Behavioral and Brain Sciences 11/2008; 31(5):511-512.
ABSTRACT: We used a computer game to examine three aspects of patch-leaving decisions in humans: how well do humans perform compared to the optimal policy, can they adjust their behaviour adaptively in response to different distributions of prey across patches and on what cues are their decisions based? Subjects earned money by catching fish when they briefly appeared within a pond; the timing of appearances was stochastic but at a rate proportional to how many fish remained. Caught fish were not replaced and ponds varied in how many fish they initially contained (according to three different distributions). At any point subjects could move to a new pond, but travel took some time. They delayed this switch much too long. Furthermore, regardless of the distribution of prey, subjects spent longer at ponds where they had found more items (contrary to optimality predictions in two of the environments). However, they apparently responded not to the number of captures directly (despite this appearing on screen) but to the current interval without a capture, to the interval preceding the last capture, and to the time spent at the current pond. Self-reports supported this order of cue importance. Subjects often left directly after a capture, perhaps an example of the Concorde fallacy. High success rate in the preceding patch decreased residence time and subjects appeared to be learning to leave earlier over the latter two thirds of the experiment. Minimization of delay to the next capture alone might explain some of the suboptimal behaviour observed.
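The three cues the subjects appeared to rely on can be combined into a simple linear leave rule. The function below is a hypothetical sketch of such a cue-based rule; the weights and threshold are illustrative values, not parameters fitted to the data.

```python
def leave_decision(interval_now, interval_prev, time_in_patch,
                   w_now=0.5, w_prev=0.3, w_time=0.2, threshold=4.0):
    """Leave the patch when a weighted sum of the three cues reported in the
    study crosses a threshold: the current interval without a capture, the
    interval preceding the last capture, and total time at the pond.
    Weights reflect the self-reported order of cue importance; all values
    are illustrative."""
    score = (w_now * interval_now
             + w_prev * interval_prev
             + w_time * time_in_patch)
    return score > threshold

print(leave_decision(3.0, 2.0, 10.0))  # True: long dry spell late in the patch
print(leave_decision(1.0, 1.0, 5.0))   # False: captures still coming quickly
```

Ordering the weights `w_now > w_prev > w_time` mirrors the cue-importance ranking the self-reports supported; a rule like this also predicts staying longer after a recent capture, consistent with the Concorde-fallacy observation.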
ABSTRACT: Evolutionary psychologists should go beyond research on individual differences in attitudes and focus more on detailed models of psychological mechanisms. We argue for complementing attitude research with agent-based computational modeling of mate choice. Agent-based models require detailed specification of individual choice mechanisms that can be evaluated in terms of both their psychological plausibility and the population-level outcomes they produce.
ABSTRACT: Pidgins are protolanguages used by people from different linguistic backgrounds who are brought together to live and work, whereas creoles are true languages. More convincing evidence yet of children's collective ability to invent language comes from a generation of deaf Nicaraguans who had not been exposed to a developed language and who, prior to attending a new school for the deaf, communicated using idiosyncratic home-sign systems. Shortly after arriving at the school, these home signers developed a shared system of signs and grammatical devices. This shared system developed into a full-fledged sign language after several years and several cohorts of typically young, deaf individuals without the need for instructions or adult models (Senghas & Coppola 2001; Senghas et al. 2004). The emergence of new skills, such as language or its antecedents, in a group of individuals can place them in novel contexts and expose them to new selection pressures. This would surely have been the case with the emergence of language and its underlying symbolic abilities. We argue, as have others (e.g., Gottlieb 2002; Lickliter & Schneider, in press; West-Eberhard 2003), that the neural plasticity of infants and children and their behavioral and cognitive responses to novel environments provide much of the stuff upon which natural selection works, and that this may have been especially important in recent human cognitive evolution (e.g., Bjorklund 2006). Such plasticity may continue to afford the opportunity for phylogenetic change in Homo sapiens. For instance, the Flynn effect, a steady rise in IQ (particularly fluid intelligence) over the past century, may be due to accelerated cognitive development (Howard 2001), perhaps in response to an increasingly visual environment (see Neisser 1988).
We do not believe that the human race is on the verge of a radical evolutionary change; but the neural plasticity evident in contemporary children in response to changing environments likely also characterized our ancestors and contributed centrally to the emergence of language and related sociocognitive abilities in our forechildren.
ABSTRACT: One way of dealing with the proliferation of conjectures that accompany the diverse study of the evolution of language is to develop precise and testable models which reveal otherwise latent implications. We suggest how verbal theories of the role of individual development in language evolution can benefit from formal modeling, and vice versa.
Behavioral and Brain Sciences 01/1990; 29(03).
ABSTRACT: More frequent risk taking among young men than women has been explained as a sexually selected trait, perhaps advertising male quality. We investigated this hypothesis in three studies. (1) Young men and women rated how attractive they would find it if a potential partner took various specific risks. A domain-specific risk inventory allowed us to distinguish whether risk taking is attractive generally or only in certain domains. Both sexes reported social and recreational risk taking as attractive (the latter not always significantly so), but other domains of risk taking as unattractive (ethics, gambling, and health) or neutral (investment). The sexes differed remarkably little. Parallel studies in Germany and the United States yielded very similar results. (2) We asked subjects to predict how attractive the other sex would find it if the subject performed each risky behavior. Both sexes were rather accurate (which could be merely because they assume that the other sex feels as they do), and sex differences in attractive risk taking are not explicable by sex differences either in attraction or in beliefs about what others find attractive. However, our data could explain why unattractive risks are more often taken by men than women (men slightly underestimated the degree of unattractiveness of such risks, whereas U.S. women overestimated it, perhaps because they themselves found such risk taking more unattractive than did U.S. men). (3) Both members of 25 couples reported their likelihood of engaging in specific risky behaviors, their perception of these risks, and how attractive they would have found these behaviors in their partner. One hypothesis was that, for instance, a woman afraid of heights would be particularly impressed by a man oblivious to such risks. Instead we found positive assortment for risk taking, which might be explained by a greater likelihood of encountering people with similar risk attitudes (e.g.
members of the same clubs) or a greater compatibility between such mates. Finally, contrary to the assumption that taking a low risk is likely to be less revealing of an individual's quality than taking a high risk, we found a strong negative