Andreas Wilke

Clarkson University, Potsdam, New York, United States

Publications (28) · 90.3 Total impact

  • Source
    ABSTRACT: Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.
    Journal of Gambling Studies 04/2015; DOI:10.1007/s10899-015-9539-9 · 1.47 Impact Factor
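    The matching-versus-maximizing gap described above can be made concrete with a short simulation (a minimal Python sketch, not the study's code; the outcome probability p = 0.7 is an assumed example):

        import random

        def accuracy_matching(p):
            # Probability matching predicts each outcome in proportion to its
            # probability, so expected accuracy is p*p + (1-p)*(1-p).
            return p * p + (1 - p) * (1 - p)

        def accuracy_maximizing(p):
            # Maximizing always predicts the more likely event.
            return p

        def simulate_matching(p, n=100_000):
            # Monte Carlo check: prediction and outcome are independent
            # Bernoulli(p) draws, and a hit occurs when they agree.
            return sum((random.random() < p) == (random.random() < p)
                       for _ in range(n)) / n

        p = 0.7
        print(accuracy_matching(p), accuracy_maximizing(p))  # ~0.58 vs 0.70
        print(round(simulate_matching(p), 2))                # ~0.58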
  • Source
    ABSTRACT: Humans internalize environmental cues of mortality risk at an early age, which influences subsequent risk perceptions and behavior. In this respect, an individual's current risk assessment may be viewed as an adaptive response to the dangers present within his or her early local environment. Here we examine the relationship between several variables indicating threat within an individual's early environment (e.g., prevalence of violent and property crimes, registered sex offenders) and their perception of crime risk within both the childhood and current adult environments. We recruited a group of 657 students from diverse geographic backgrounds to provide the zip code of their childhood residence along with subjective danger ratings of both that location and their current one, which enabled us to compare their ratings of risk/danger with the federally reported crime statistics of each setting. Our results indicate that the early prevalence of registered sex offenders indeed influences an individual's risk perception in adulthood, and that these factors have a differential effect on males and females. Our findings support the theory that early environmental factors signaling danger affect how individuals assess risk within their adult environment.
    Journal of Interpersonal Violence 03/2015; DOI:10.1177/0886260515572473 · 1.64 Impact Factor
  • Source
    ABSTRACT: Human decision-makers often exhibit the hot-hand phenomenon, a tendency to perceive positive serial autocorrelations in independent sequential events. The term is named after the observation that basketball fans and players tend to perceive streaks of high-accuracy shooting when they are demonstrably absent. That is, both observing fans and participating players tend to believe that a player's chance of hitting a shot is greater following a hit than following a miss. We hypothesize that this bias reflects a strong and stable tendency among primates (including humans) to perceive positive autocorrelations in temporal sequences, that this bias is an adaptation to clumpy foraging environments, and that it may even be ecologically rational. Several studies support this idea in humans, but a stronger test is to determine whether nonhuman primates also exhibit a hot-hand bias. Here we report the behavior of 3 monkeys performing a novel gambling task in which the correlation between sequential gambles (i.e., temporal clumpiness) is systematically manipulated. We find that monkeys perform better (that is, behave more optimally) with clumped (positively correlated) than with dispersed (negatively correlated) distributions. These results identify and quantify a new bias in monkeys' risky decisions, support accounts that specifically incorporate cognitive biases into risky choice, and support the suggestion that the hot-hand phenomenon is an evolutionarily ancient bias.
    Journal of Experimental Psychology: Animal Learning and Cognition 07/2014; 40(3):280. DOI:10.1037/xan0000033 · 2.38 Impact Factor
  • Source
    ABSTRACT: Does problem gambling arise from an illusion that patterns exist where there are none? Our prior research suggested that the “hot hand,” a tendency to perceive illusory streaks in sequences, may be a human universal, tied to an evolutionary history of foraging for clumpy resources. Like other evolved propensities, this tendency might be expressed more strongly in some people than others, leading them to see luck where others see only chance. If the desire to gamble is enhanced by illusory pattern detection, such individual differences could be predictive of gambling risk. While previous research has suggested a potential link between cognitive strategies and the propensity to gamble, no prior study has directly measured gamblers' cognitive strategies using behavioral choice tasks and linked them to risk-taking or gambling propensities. Using a computerized sequential decision-making paradigm that directly measured subjects' predictions of sequences, we found evidence that subjects who have a greater tendency to gamble also have a higher tendency to perceive illusory patterns, as measured by their preference for a random slot machine over a negatively autocorrelated one. Casino gamblers played the random slot machine significantly more often even though a training phase and a history of outcomes were provided. Additionally, we found a marginally significant group difference between gamblers and matched community members in their slot-machine choice proportions. Performance on our behavioral choice task correlated with subjects' risk attitudes toward gambling and their frequency of play, as well as their selection of choice strategies in gambling activities.
    Evolution and Human Behavior 07/2014; 35(4). DOI:10.1016/j.evolhumbehav.2014.02.010 · 2.87 Impact Factor
  • Source
    Evolutionary Behavioral Sciences 01/2014; 8(3):123-141. DOI:10.1037/ebs0000011
  • Source
    ABSTRACT: Knowledge about cue polarity (i.e., the sign of a cue-criterion relation) seems to boost performance in a wide range of inference tasks. Knowledge about cue polarity may enhance performance by increasing (1) the reliance on rule-based relative to similarity-based strategies, and (2) explicit knowledge about the relative importance of cues. We investigated the relative contribution of these two mechanisms in a multiple-cue judgment task and a categorization task, which typically differ in the inference strategies they elicit and potentially in the explicit task knowledge available to participants. In both tasks, participants preferred rule-based over similarity-based strategies and had more knowledge about cue importance when cue polarity information was provided. Strategy selection was not related to increases in performance in the categorization task and could only partly explain increases in performance in the judgment task. In contrast, explicit knowledge about the importance of cues was related to better performance in both categorization and judgment, independently of the strategy used. In sum, our results suggest that the benefits of receiving cue polarity information may span tasks such as multiple-cue judgment and categorization, primarily by enhancing knowledge of relative cue importance.
    Acta psychologica 06/2013; 144(1):73-82. DOI:10.1016/j.actpsy.2013.05.007 · 2.19 Impact Factor
  • Source
    Rui Mata, Andreas Wilke, Uwe Czienskowski
    ABSTRACT: Does foraging change across the life span, and in particular, with aging? We report data from two foraging tasks used to investigate age differences in search in external environments as well as internal search in memory. Overall, the evidence suggests that foraging behavior may undergo significant changes across the life span across internal and external search. In particular, we find evidence of a trend toward reduced exploration with increased age. We discuss these findings in light of theories that postulate a link between aging and reductions in novelty seeking and exploratory behavior.
    Frontiers in Neuroscience 04/2013; 7:53. DOI:10.3389/fnins.2013.00053
  • Source
    ABSTRACT: Three alternative mechanisms for age-related decline in memory search have been proposed, resulting from either reduced processing speed (global slowing hypothesis), overpersistence on categories (cluster-switching hypothesis), or an inability to maintain focus on local cues related to a decline in working memory (cue-maintenance hypothesis). We investigated these 3 hypotheses by formally modeling the semantic recall patterns of 185 adults between 27 and 99 years of age in the animal fluency task (Thurstone, 1938). The results indicate that people switch between global frequency-based retrieval cues and local item-based retrieval cues to navigate their semantic memory. Contrary to the global slowing hypothesis, which predicts no qualitative differences in dynamic search processes, and the cluster-switching hypothesis, which predicts reduced switching between retrieval cues, the results indicate that as people age, they tend to switch more often between local and global cues per item recalled, supporting the cue-maintenance hypothesis. Additional support for the cue-maintenance hypothesis is provided by negative correlations between switching and digit span scores and between switching and total items recalled, which suggests that cognitive control may be involved in cue maintenance and the effective search of memory. Overall, the results are consistent with age-related decline in memory search being a consequence of reduced cognitive control, in line with models suggesting that working memory is related to goal perseveration and the ability to inhibit distracting information.
    Developmental Psychology 04/2013; 49(12). DOI:10.1037/a0032272 · 3.21 Impact Factor
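    The switching measure at the heart of this analysis can be illustrated with a toy computation (a minimal sketch; the category assignments below are hypothetical and merely stand in for the authors' semantic model):

        # Count switches between semantic clusters per item recalled in an
        # animal fluency list; categories here are invented for illustration.
        CATEGORY = {
            "dog": "pets", "cat": "pets", "hamster": "pets",
            "lion": "savanna", "zebra": "savanna", "giraffe": "savanna",
            "shark": "aquatic", "whale": "aquatic",
        }

        def switches_per_item(recalled):
            cats = [CATEGORY[w] for w in recalled]
            switches = sum(a != b for a, b in zip(cats, cats[1:]))
            return switches / len(recalled)  # higher = more cue switching

        print(switches_per_item(["dog", "cat", "lion", "shark", "hamster"]))  # 0.6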
  • Source
    ABSTRACT: When predicting the next outcome in a sequence of events, people often appear to expect streaky patterns, such as that sport players can develop a “hot hand,” even if the sequence is actually random. This expectation, referred to as positive recency, can be adaptive in environments characterized by resources that are clustered across space or time (e.g., expecting to find multiple berries on separate bushes). But how strong is this disposition towards positive recency? If people perceive random sequences as streaky, will there be situations in which they forego a payoff because they prefer an unpredictable random environment over an exploitable but alternating pattern? To find out, 238 participants repeatedly chose to bet on the next outcome of one of two sequences of (binary) events, presented next to each other. One sequence displayed events at random while the other sequence was either more streaky (positively autocorrelated) or more alternating (negatively autocorrelated) than chance. The degree of autocorrelation varied in a between-subject design. Most people preferred to predict purely random sequences over those with moderate negative autocorrelation and thus missed the opportunity for above-chance payoff. Positive recency persisted despite extensive feedback and the opportunity to learn more rewarding behavior over time. Further, most participants' choice strategies were best described by a win-stay/lose-shift strategy, adaptive in clumpy or streaky environments. We discuss the implications regarding an evolved human tendency to expect streaky patterns, even if the sequence is actually random.
    Evolution and Human Behavior 09/2011; 32(5):326-333. DOI:10.1016/j.evolhumbehav.2010.11.003 · 2.87 Impact Factor
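    A minimal sketch (an assumed parameterization, not the study's stimulus code) of how such autocorrelated binary sequences can be generated, and of why the win-stay/lose-shift strategy earns above-chance accuracy only when the autocorrelation is positive:

        import random

        def autocorrelated_sequence(n, r):
            # P(repeat previous outcome) = 0.5 + r/2: r = 0 is purely random,
            # r > 0 streaky, r < 0 alternating.
            seq = [random.randint(0, 1)]
            for _ in range(n - 1):
                repeat = random.random() < 0.5 + r / 2
                seq.append(seq[-1] if repeat else 1 - seq[-1])
            return seq

        def wsls_accuracy(seq):
            # For binary prediction, win-stay/lose-shift amounts to predicting
            # a repeat of the last outcome (stay after a hit, shift after a
            # miss), i.e., positive recency.
            hits = sum(a == b for a, b in zip(seq, seq[1:]))
            return hits / (len(seq) - 1)

        for r in (-0.4, 0.0, 0.4):
            print(r, round(wsls_accuracy(autocorrelated_sequence(10_000, r)), 2))
        # prints roughly 0.3, 0.5, 0.7: above chance only for positive r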
  • Source
    ABSTRACT: Previous research has reported conflicting results concerning the influence of depression on cognitive task performance. Whereas some studies reported that depression enhances performance, other studies reported negative or null effects. These discrepant findings appear to result from task variation, as well as from the severity and treatment status of participants' depression. To better understand these moderating factors, we study the performance of nondepressed individuals, depressed individuals, and individuals recovering from a major depressive episode in a complex sequential decision task similar to the secretary problem. We find that depressed individuals perform better than nondepressed individuals. Formal modeling of participants' decision strategies suggested that acutely depressed participants had higher thresholds for accepting options and made better choices than either healthy participants or those recovering from depression.
    Journal of Abnormal Psychology 04/2011; 120(4):962-8. DOI:10.1037/a0023238 · 4.86 Impact Factor
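    The acceptance thresholds inferred here can be illustrated with a simple cutoff rule in the spirit of the secretary problem (an illustrative sketch, not the authors' formal model; the option values and cutoffs are assumptions):

        import random

        def cutoff_rule(values, look):
            # Observe the first `look` options without committing, then accept
            # the first later option that beats everything seen so far; a
            # larger `look` behaves like a stricter acceptance threshold.
            best_seen = max(values[:look], default=float("-inf"))
            for v in values[look:]:
                if v > best_seen:
                    return v
            return values[-1]  # forced choice if nothing clears the bar

        def mean_outcome(look, n=20, trials=10_000):
            # Average quality of the accepted option under a given cutoff.
            return sum(cutoff_rule([random.random() for _ in range(n)], look)
                       for _ in range(trials)) / trials

        for look in (1, 7, 15):  # n/e is roughly 7 for n = 20
            print(look, round(mean_outcome(look), 2))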
  • Source
    Andreas Wilke, Peter M Todd
    ABSTRACT: Past and present environments: the evolution of decision making. The human mind is replete with evolved decision-making mechanisms designed to achieve important adaptive goals. This article outlines a framework for studying these mechanisms from the perspective of evolutionary psychology, emphasizing the many ways the environment shapes the formation and functioning of decision strategies. These strategies often take the form of simple heuristics built from components that in turn stem from evolved capacities, fitting particular information structures of the environment. We illustrate these ideas with two examples of heuristics employed in important adaptive domains: deciding when to leave a resource patch, and predicting when a sequence of events will stop or continue.
    Psicothema 02/2010; 22(1):4-8 · 0.96 Impact Factor
  • Source
    ABSTRACT: A casual look at the literature in social cognition reveals a vast collection of biases, errors, violations of rational choice, and failures to maximize utility. One is tempted to draw the conclusion that the human mind is woefully muddled. We present a three-category evolutionary taxonomy of evidence of biases: biases are (a) heuristics, (b) error management effects, or (c) experimental artifacts. We conclude that much of the research on cognitive biases can be profitably reframed and understood in evolutionary terms. An adaptationist perspective suggests that the mind is remarkably well designed for important problems of survival and reproduction, and not fundamentally irrational. Our analysis is not an apologia intended to place the rational mind on a pedestal for admiration. Rather, it promises practical outcomes, including a clearer view of the architecture of systems for judgment and decision making, and exposure of clashes between adaptations designed for the ancestral past and the demands of the present.
    Social Cognition 10/2009; DOI:10.1521/soco.2009.27.5.733 · 1.64 Impact Factor
  • Source
    Rui Mata, Andreas Wilke, Uwe Czienskowski
    ABSTRACT: We conducted two experiments comparing younger and older adults' ability to adjust their foraging behavior as a function of task characteristics. Participants foraged for fish in a virtual landscape and had to decide when to move between ponds so as to maximize the number of fish caught. In the first experiment, participants were left to generate their own foraging strategy, whereas in the second experiment, participants were instructed to use an incremental strategy that has been shown to produce optimal performance in this task. Our results suggest that both younger and older adults are adaptive in the sense of adjusting the parameters of their foraging strategy as a function of task characteristics. Nevertheless, older adults show overall poorer performance compared with younger adults even when instructed to use an optimal strategy.
    The Journals of Gerontology Series B Psychological Sciences and Social Sciences 07/2009; 64(4):474-81. DOI:10.1093/geronb/gbp035 · 2.85 Impact Factor
  • Source
    Andreas Wilke, H. Clark Barrett
    ABSTRACT: The hot hand phenomenon refers to the expectation of “streaks” in sequences of hits and misses whose probabilities are, in fact, independent (e.g., coin tosses, basketball shots). Here we propose that the hot hand phenomenon reflects an evolved psychological assumption that items in the world come in clumps, and that hot hand, not randomness, is our evolved psychological default. In two experiments, American undergraduates and Shuar hunter–horticulturalists participated in computer tasks in which they predicted hits and misses in foraging for fruits, coin tosses, and several other kinds of resources whose distributions were generated randomly. Subjects in both populations exhibited the hot hand assumption across all the resource types. The only exception was for American students predicting coin tosses where hot hand was reduced. These data suggest that hot hand is our evolved psychological default, which can be reduced (though not eliminated) by experience with genuinely independent random phenomena like coin tosses.
    Evolution and Human Behavior 05/2009; 30(3):161-169. DOI:10.1016/j.evolhumbehav.2008.11.004 · 2.87 Impact Factor
  • Source
    ABSTRACT: Animals depleting one patch of resources must decide when to leave and switch to a fresh patch. Foraging theory has predicted various decision mechanisms; which is best depends on environmental variation in patch quality. Previously we tested whether these mechanisms underlie human decision making when foraging for external resources; here we test whether humans behave similarly in a cognitive task seeking internally generated solutions. Subjects searched for meaningful words made from random letter sequences, and as their success rate declined, they could opt to switch to a fresh sequence. As in the external foraging context, time since the previous success and the interval preceding it had a major influence on when subjects switched. Subjects also used the commonness of sequence letters as a proximal cue to patch quality that influenced when to switch. Contrary to optimality predictions, switching decisions were independent of whether sequences differed little or widely in quality.
    Cognitive Science 05/2009; 33(3):497-529. DOI:10.1111/j.1551-6709.2009.01020.x · 2.38 Impact Factor
  • Source
    X. T. Wang, Daniel J. Kruger, Andreas Wilke
    ABSTRACT: We examined the effects of life-history variables on risk-taking propensity, measured by subjective likelihoods of engaging in risky behaviors in five evolutionarily valid domains of risk: between-group competition, within-group competition, environmental challenge, mating and resource allocation, and fertility and reproduction. The effects of life-history variables on risk-taking propensity were domain specific, except for the expected sex difference, where men predicted greater risk-taking than women in all domains. Males also perceived less inherent risk in actions than females across the five domains. Although the age range in the sample was limited, older respondents showed lower risk propensity in both between- and within-group competition. Parenthood reduced risk-taking propensity in within- and between-group competitions. Higher reproductive goal setting (desiring more offspring) was associated with lower risk-taking propensity; this effect was strongest in the risk domains of mating and reproduction. Having more siblings reduced risk-taking propensity (contrary to our initial prediction) in the domains of environmental challenge, reproduction, and between-group competition. Later-born children showed a higher propensity to engage in environmental and mating risks. Last, shorter subjective life expectancy was associated with increased willingness to take mating and reproductive risks. These results suggest that life-history variables regulate human risk-taking propensity in specific risk domains.
    Evolution and Human Behavior 03/2009; 30(2). DOI:10.1016/j.evolhumbehav.2008.09.006 · 2.87 Impact Factor
  • Source
    ABSTRACT: We agree that much of language evolution is likely to be adaptation of languages to properties of the brain. However, the attempt to rule out the existence of language-specific adaptations a priori is misguided. In particular, the claim that adaptation to "moving targets" cannot occur is false. Instead, the details of gene-culture coevolution in language are an empirical matter.
    Behavioral and Brain Sciences 11/2008; 31(5):511-512. DOI:10.1017/S0140525X08005013 · 14.96 Impact Factor
  • Source
    ABSTRACT: We used a computer game to examine three aspects of patch-leaving decisions in humans: how well do humans perform compared to the optimal policy, can they adjust their behaviour adaptively in response to different distributions of prey across patches and on what cues are their decisions based? Subjects earned money by catching fish when they briefly appeared within a pond; the timing of appearances was stochastic but at a rate proportional to how many fish remained. Caught fish were not replaced and ponds varied in how many fish they initially contained (according to three different distributions). At any point subjects could move to a new pond, but travel took some time. They delayed this switch much too long. Furthermore, regardless of the distribution of prey, subjects spent longer at ponds where they had found more items (contrary to optimality predictions in two of the environments). However, they apparently responded not to the number of captures directly (despite this appearing on screen) but to the current interval without a capture, to the interval preceding the last capture, and to the time spent at the current pond. Self-reports supported this order of cue importance. Subjects often left directly after a capture, perhaps an example of the Concorde fallacy. High success rate in the preceding patch decreased residence time and subjects appeared to be learning to leave earlier over the latter two thirds of the experiment. Minimization of delay to the next capture alone might explain some of the suboptimal behaviour observed.
    Animal Behaviour 04/2008; DOI:10.1016/j.anbehav.2007.09.006 · 3.07 Impact Factor
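    One simple departure rule consistent with the cues reported in these patch-leaving studies is a giving-up-time rule: leave once the interval since the last capture exceeds a fixed threshold. A minimal sketch under assumed parameters (with the capture rate proportional to the prey remaining, as in the task above):

        import random

        def forage_patch(n_prey, rate_per_prey, giving_up_time):
            # Captures arrive as a Poisson process whose rate is proportional
            # to the prey remaining; the forager leaves once the waiting time
            # since the last capture exceeds the giving-up time.
            t, captures = 0.0, 0
            while n_prey > 0:
                wait = random.expovariate(rate_per_prey * n_prey)
                if wait > giving_up_time:        # gave up before the next capture
                    return captures, t + giving_up_time
                t += wait                        # capture occurs; clock resets
                captures += 1
                n_prey -= 1
            return captures, t + giving_up_time  # patch emptied, then gave up

        print(forage_patch(n_prey=10, rate_per_prey=0.05, giving_up_time=20.0))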
  • Karthik Panchanathan, Andreas Wilke
    Evolution and Human Behavior 11/2007; 28(6):448–450. DOI:10.1016/j.evolhumbehav.2007.08.004 · 2.87 Impact Factor

Publication Stats

383 Citations
90.30 Total Impact Points

Institutions

  • 2010–2015
    • Clarkson University
      • Department of Psychology
      • Department of Civil & Environmental Engineering
      Potsdam, New York, United States
  • 2009
    • University of Lisbon
      Lisbon, Lisbon, Portugal
  • 1990–2009
    • Max Planck Institute for Human Development
      • Center for Adaptive Behavior and Cognition
      Berlin, Berlin, Germany
  • 2007–2008
    • University of California, Los Angeles
      • Department of Anthropology
      Los Angeles, California, United States