Article

Adaptive Thinking: Rationality in the Real World

Authors:
Gerd Gigerenzer

Abstract

Where do new ideas come from? What is social intelligence? Why do social scientists perform mindless statistical rituals? This vital book is about rethinking rationality as adaptive thinking: to understand how minds cope with their environments, both ecological and social. The author proposes and illustrates a bold new research program that investigates the psychology of rationality, introducing the concepts of ecological, bounded, and social rationality. His path-breaking collection takes research on thinking, social intelligence, creativity, and decision-making out of an ethereal world where the laws of logic and probability reign, and places it into our real world of human behavior and interaction. This book is accessibly written for general readers with an interest in psychology, cognitive science, economics, sociology, philosophy, artificial intelligence, and animal behavior. It also teaches a practical audience, such as physicians, AIDS counselors, and experts in criminal law, how to understand and communicate uncertainties and risks.


... The concept of bounded rationality (e.g., Simon, 1987, 1990), as interpreted by Gigerenzer (2000, 2004), has been characterized via the fast and frugal heuristics notion (Raab, 2012, 2021) for making decisions that have good enough ("satisficing") outcomes. This notion is not new, with its roots planted in the bounded rationality concept that was originally addressed in the 1940s by the 1978 Nobel Laureate, Herbert Simon. ...
... It should be noted that interpretations of both bounded rationality (Simon, 1987, 1990) and fast and frugal heuristics (Gigerenzer, 2000, 2004) are descriptive rather than prescriptive. In other words, while the traditional normative understanding of rationality focuses on prescribing benchmark models for maximizing utility, bounded rationality and fast and frugal heuristics focus on descriptively reasonable guiding concepts (e.g., Gigerenzer, Czerlinski & Martignon, 2002). ...
... One means for overcoming such cognitive biases in selection decisions in sport could be the de-biasing of the experts who are involved in the process (Fischhoff, 1982). Eliminating the legitimacy of optimal models as benchmarks (Gigerenzer et al., 2002) and replacing them with concepts such as fast and frugal heuristics and adaptive rationality (Gigerenzer, 2000, 2004; Raab, 2012, 2021) could also offer a positive alternative. However, if we do recognize that talent-selection decisions may legitimately use unbounded rationality that is operationalized through optimal decomposition models (e.g., regression/bootstrapping or Bayesian models), then the door is open for utilizing big-data analyses (e.g., Elitzur, 2020; Morgulev, Azar & Lidor, 2018) as a means for maximizing the accuracy of talent-selection decisions regarding the athletes' professional future. ...
Article
When making talent-selection decisions in sport, coaches, scouts, program directors, and policymakers typically adopt two approaches: the subjective approach, also known as the coach's eye, where these professionals select or de-select athletes based on their personal observations and impressions; and the objective approach, where they apply a multi-faceted formula for awarding scores to the athletes' motor skills (such as agility and coordination) and psychological capabilities (such as leadership and motivation) as a means for predicting their future success. These two approaches are often perceived as complementary in striving to reach optimal selection decisions in sport. In this conceptual article, we examine challenges associated with such talent-selection decisions, and address the coach's eye as an example of a subjective assessment approach. We also address the concept of fast and frugal heuristics for making selection decisions in sport, while elaborating on bounded rationality and the human-machine paradigm. Finally, in addition to discussing certain "built-in" limitations in sport-selection decisions, based on judgment and decision-making models, we provide a rationale for adding the big-data approach as a means for enhancing links between the subjective and objective assessments currently used in talent-selection decisions in sport.
... In the literature on decision-making under time pressure and stress, intuition has been hailed as a powerful tool (e.g. Akinci & Sadler-Smith, 2012; Dane & Pratt, 2007; Gigerenzer, 2000; Hodgkinson et al., 2008; Klein, 1998, 2003). In contrast, the heuristics and biases tradition sees intuition as a source of error, implying that more analytic decision-makers are less biased and better performers (e.g. ...
... In a recent meta-analysis examining the effects of analytic and intuitive thinking, Alaybek et al. (2021) found a significant positive effect of analytic style but no effect of intuitive style on performance. This finding contradicts the research stream holding intuition to be a "fast and frugal" decision-making style (Gigerenzer, 2000; Gigerenzer et al., 2011; Klein, 1998, 2003, 2008). Alaybek et al. (2021) remarked on the abundance of theorizing about the interactive effects of intuitive and analytic styles and the lack of empirical studies. ...
... Klein (2003) has a different view. He argues that an analytic style would interfere with expert intuition in crisis management and thereby reduce the effectiveness of the intuitive style: "From the perspective of intuitive decision-making, conscious analysis is the bottleneck" (Klein, 2003, p. 68; see also Gigerenzer, 2000; Hodgkinson & Sadler-Smith, 2018; Kahneman & Klein, 2009). Moreover, psychometricians have established that modelling the effects of two variables without including their interaction term produces inaccurate estimates and misleading interpretations of the main effects when a significant interaction effect exists (e.g. ...
Article
Full-text available
The impact of intuitive and analytic cognitive styles on task performance is a much‐debated subject in the scientific discourse on decision‐making. In the literature on decision‐making under time pressure, intuition has been regarded as a fast and frugal tool. At the same time, the heuristics and biases tradition sees intuition as a source of errors, implying that more analytic decision‐makers are less biased and better performers. We conducted two studies of the effects of interplay between intuitive and analytic cognitive styles on decision‐making in a simulated wicked learning environment. The results of the first study revealed that the high‐performing individuals were those who exhibited a strong preference for both cognitive styles, as well as those who showed a lack of preference for both. Individuals with a strong preference for only one of the styles were outperformed. In the second study, we replicated these findings in a team context. Post‐hoc, we found that cognitive ability correlated highly with performance for the two high‐performing style combinations but not for the two low‐performing style combinations. Our results indicate that flexible style preferences boost the effect of cognitive ability, while strong preferences for a single style may entrench even those with high cognitive abilities.
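The methodological point raised in the excerpts above, that estimating two main effects without their interaction term can mislead when the styles actually interact, can be illustrated with a small hedged sketch. The data are simulated and the variable names (intuitive, analytic, performance) are hypothetical; this is not the cited authors' analysis.
```python
# Minimal sketch (simulated data, hypothetical variable names): why omitting an
# interaction term can distort main-effect estimates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
intuitive = rng.normal(size=n)
analytic = rng.normal(size=n)
# Simulated "performance" in which the two styles interact positively.
performance = 0.1 * intuitive + 0.1 * analytic + 0.5 * intuitive * analytic + rng.normal(size=n)
df = pd.DataFrame({"performance": performance, "intuitive": intuitive, "analytic": analytic})

main_only = smf.ols("performance ~ intuitive + analytic", data=df).fit()
with_interaction = smf.ols("performance ~ intuitive * analytic", data=df).fit()

print(main_only.params)         # main effects estimated without the interaction term
print(with_interaction.params)  # main effects plus the intuitive:analytic interaction
```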
... During those two transitional decades, Gigerenzer (2000) identifies two ways in which the new parallel between computers and humans was drawn. The first one, represented by von Neumann, finds its source in McCulloch and Pitts's (1943) influential logical model of neural activity. ...
... Prior to implementing their computational model of the mind on a digital computer, Allen Newell and Herbert Simon simulated their program on "a computer constructed of human components". Simon observed that "It was the task of each participant to execute his or her subroutine, or to provide the contents of his or her memory, whenever called by the routine at the next level above that was then in control" (quoted in Gigerenzer, 2000). As Gigerenzer (2000) points out, this situation looks no different from the one set up by de Prony at his Bureau. ...
... Simon observed that "It was the task of each participant to execute his or her subroutine, or to provide the contents of his or her memory, whenever called by the routine at the next level above that was then in control" (quoted in Gigerenzer, 2000). As Gigerenzer (2000) points out, this situation looks no different from the one set up by de Prony at his Bureau. But we can indeed see a difference in the fact that the concrete social relations between actual humans now remain obscured, to them no less than to any observer, by the idea that the computer they collectively implement is a scale model of the brain instead of a scale model of society. ...
... Although there is a long research tradition on human decision-making behavior (above all in cognitive psychology and behavioral biology), economic perspectives on it have only recently attracted attention (see, for example, Gigerenzer 2000). Furthermore, only a few studies to date take a differentiated look at developmental or learning-psychology aspects and pursue the question of how decision-making competencies are acquired (Lindow 2014; Löckenhoff 2018). ...
... Their constant competition can lead to contradictory decision tendencies. Default-interventionist theories, in turn, assume that the intuitive mode is activated first by default; the deliberative mode can, but need not, intervene. Which mode should be considered superior cannot be answered across the board: Gigerenzer shows in several studies how intuitive decision-making in situations of uncertainty or under time pressure can lead to a better outcome than weighing all alternatives (Gigerenzer & Todd 1999; Gigerenzer & Selten 2001; Gigerenzer 2000). He argues that intuitions are in many cases more effective because they orient themselves to the conditions of the environment. In highly structured environments, in which regular patterns and recurring situations occur, people have learned through experience to develop effective heuristics. ...
Article
The professional practice of people in positions of responsibility is characterized by complex constellations that are frequently fraught with dilemmas and paradoxes. Decisions in leadership positions, particularly in educational organizations, are often multilayered and require weighing divergent interests. Various decision theories offer models for this that integrate both rational and intuitive decision processes. Educational leaders must, moreover, not only analyze information but also take social and ethical factors into account. The article emphasizes the importance of values and professional reflection in decision situations from an educational-economics perspective.
... On the one hand, dominant internalist views assume that rational decisions are the results of individual cognitive abilities and limitations. On the other hand, ecological views assign a decisive role in rational decision-making to environments (Gigerenzer, 2000) or situations (Popper, 1962 [1945]). Post-Northians occupy an 'intermediate' position between these poles. ...
... The FFH programme (an acronym for 'fast-and-frugal heuristics'), led by Gerd Gigerenzer (2000, 2021), is directed against the internalism inherent in the HB&N programme and its fixation on the strong cognitive limitations (irrationality) of individuals. FFH scholars share Simon's (1990) scissors-like view of bounded rationality that is shaped by two 'blades': mental and environmental parts of cognition. ...
Article
Full-text available
In the philosophy of mind and cognitive science, there is a pronounced paradigm shift associated with the transition from internalism to externalism. The externalist paradigm views cognitive processes as not isolated in the brain, but as interrelated with external artefacts and structures. The paper focuses on one of the leading externalist approaches: extended cognition. Despite the dominance of internalism in economics, in its main schools, there is an emerging trend towards extended cognition ideas. In my opinion, economists might develop the most advanced version of the extended cognition approach: socially extended cognition based on cognitive institutions. This paper analyses extended cognition ideas in institutional, Austrian, and behavioural economics and identifies numerous overlapping approaches and complementary research areas. I argue that the economics of cognitive institutions is a promising field for all economic schools and propose a preliminary research agenda.
... On the mental shortcuts (heuristics) widely used in the field of cognitive psychology, the following principal sources may be consulted: Gilovich, Griffin and Kahneman, 2002; Gigerenzer, 2000, 2007, 2008; Gigerenzer, Todd and the ABC Research Group, 1999. ...
... For some basic sources in the field of cognitive psychology, see: Anderson, 1993, 2010; Ariely, 2008, 2012; Ayduk & Mischel, 2002; Baron, 2000; Bechtel, 2008; Bishop & Trout, 2004; Boden, 2006; Chalmers, 1996; Friedenberg & Silverman, 2005; Gilovich, Griffin & Kahneman, 2002; Gilovich, 1993; Gigerenzer, 2000, 2007, 2008; Johnson-Laird, 1988; Kahneman, 2011; Kahneman, Slovic & Tversky, 1982; Kahneman, Knetsch & Thaler, 1991; Kahneman & Tversky, 1984; Kahneman & Thaler, 2005; Koehler & Harvey. ...
Article
Full-text available
The emergence of the phenomenon of power intoxication depends largely on holding unlimited and usually uncontrolled (or uncontrollable) political power. If a political leader with a desire and craving for office, position, and authority gains long-lasting, permanent, absolute political power, his or her personality traits (narcissism, Machiavellianism, and psychopathy) surface and take effect, and a phenomenon called "power intoxication" appears. Especially when the political leader is deified by partisans and the nation, the hubristic leader may lose contact with reality entirely and turn into a toxic leader. "Toxic political leadership" means that an ambitious and hubristic politician in power, having succumbed to the intoxication of victory and of power, practices the art of heresthetics and uses every kind of Machiavellian tool and avenue of political manipulation recklessly and cruelly in order to make his or her absolute power permanent.
... On the one hand, dominant internalist views assume that rational decisions are the results of individual cognitive abilities and limitations. On the other hand, ecological views assign a decisive role in rational decision-making to environments (Gigerenzer, 2000) or situations (Popper, 1962 [1945]). Post-Northians occupy an 'intermediate' position between these poles. ...
... The FFH program (an acronym for 'fast-and-frugal heuristics'), led by Gerd Gigerenzer (2000, 2021), is directed against the internalism inherent in the HB&N program and its fixation on the strong cognitive limitations (irrationality) of individuals. FFH scholars share Simon's (1990) scissors-like view of bounded rationality that is shaped by two 'blades': mental and environmental parts of cognition. ...
Article
Full-text available
In the philosophy of mind and cognitive science, there is a pronounced paradigm shift associated with the transition from internalism to externalism. The externalist paradigm views cognitive processes as not isolated in the brain, but as interrelated with external artifacts and structures. The paper focuses on one of the leading externalist approaches: extended cognition. Despite the dominance of internalism in economics, in its main schools, there is an emerging trend toward extended cognition ideas. In my opinion, economists might develop the most advanced version of the extended cognition approach: socially extended cognition based on cognitive institutions. This paper analyzes extended cognition ideas in institutional, Austrian, and behavioral economics and identifies numerous overlapping approaches and complementary research areas. I argue that the economics of cognitive institutions is a promising field for all economic schools and propose a preliminary research agenda.
... In relation to the hot-hand phenomenon by Gilovich, Vallone and Tversky (1985), the concept of simple/fast and frugal heuristics by Gigerenzer (2000) laid a cornerstone for subsequent investigations in heuristic decision making in sports (Raab, 2012). In simple words, a heuristic is a simple rule of thumb, also described as "a strategy that ignores part of the information, with the goal of making decisions more quickly, frugally, and/or accurately than more complex methods" (Gigerenzer & Gaissmaier, 2011, p. 454). ...
... Empirical Findings on Economic Approach. The hot-hand belief is generally related to economic approaches in the understanding of human behavior; however, it is also closely linked to the simple/fast and frugal heuristics framework in sports (Gigerenzer, 2000; Raab, 2012). Beside the mentioned examinations in the rather isolated penalty-kick situations (Bar-Eli et al., 2007, 2009), the studies of Csapo investigated the hot-hand phenomenon sport-specifically in open-play situations in basketball. ...
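The search-stop-decide logic described in the excerpts above can be made concrete with a minimal sketch of one fast-and-frugal heuristic, take-the-best. The cue names, their validity order, and the player profiles below are invented for illustration and are not taken from the cited studies.
```python
# Minimal sketch of a fast-and-frugal heuristic: search cues in order of validity,
# stop at the first cue that discriminates, decide on that cue alone (take-the-best).
def take_the_best(option_a, option_b, cues_by_validity):
    """option_a/option_b: dicts mapping cue name -> 0/1; cues_by_validity: best cue first."""
    for cue in cues_by_validity:          # search rule: inspect one cue at a time
        a, b = option_a.get(cue, 0), option_b.get(cue, 0)
        if a != b:                        # stopping rule: first discriminating cue
            return "A" if a > b else "B"  # decision rule: go with that cue
    return "guess"                        # no cue discriminates

cues = ["recent form", "league level", "coach recommendation"]   # hypothetical cues
player_a = {"recent form": 1, "league level": 0, "coach recommendation": 1}
player_b = {"recent form": 1, "league level": 1, "coach recommendation": 0}
print(take_the_best(player_a, player_b, cues))  # -> "B" (decided by "league level")
```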
Thesis
Full-text available
Across high-performance sports, athletic attributes such as the physical, physiological, or anthropometrical ones are found to be performance-discriminating and -determining factors in sports. However, the determination of an expert athlete should consider a multidimensional view on an athlete's performance capabilities, taking perceptual-cognitive skills into account, these also being essential for high-level performances. Still, there is an ongoing debate in science about how best to set up performance environments that combine the perceptual-cognitive and motor skills of athletes to ultimately identify expertise in sports. The purpose of this thesis was to create and evaluate a multidimensional diagnostic tool that is able to capture perceptual-cognitive behavior and expertise with a representative task design and the involvement of complex, sport-specific motor responses. A domain-specific performer environment was created, with an ecological dynamics and cognitive approach, and the expert performance approach (Ericsson & Smith, 1991) as guiding frameworks from the literature. The designed representative, sensorimotor test consisted of varying attack actions presented on a life-size projection screen, and multiple, pre-specified defensive actions on a pressure contact plate system. First, the test was checked for reproducibility in test-retest sessions, by analyzing the agreement of the given motor defense responses of team-handball players within a temporal-occlusion test paradigm. Results indicated reliable test metrics, as moderate agreement of the motor responses with the majority of the attack situations was revealed. Furthermore, players also gave faster responses with more visual information in the attack sequences, supporting the practicability of the complex temporal-occlusion paradigm. Second, comparisons of response frequencies between elite and amateur team-handball players were intended to reveal differences in complex decision-making behavior. Contrasting the performances of both groups, elite players demonstrated a rather offensive-orientated response behavior significantly more often, whereas amateurs showed significant preferences for a rather defensive-orientated response behavior. The absence of decision-time differences suggests that decision quality might be of stronger relevance than presumed in heuristic decision-making processes. Third, with regard to embodied choices (Lepora & Pezzulo, 2015), further between-group comparisons were drawn with an in situ paradigm in a simple heuristic (Raab, 2012) context. Elite players were found to invest significantly more time to achieve higher decision accuracy, pointing to speed-accuracy trade-offs in expert heuristic decision making. The findings in this thesis demonstrate the methodological complexity of analyzing perceptual-cognitive performances with respect to current scientific theories and models when assessing expert anticipation and decision making. The overall results contribute to recent developments in psychology and cognition science in sports, while having several implications for theory and practice.
... Individual learning formalizes all human exploratory behaviors centered on trial-and-error, e.g., heuristic strategies and various reinforcement learning algorithms [33]. Given the social context we focus on (binary state space, dynamic and repeated interactions), simple trial-and-reflection may be more applicable than other, more complex learning mechanisms [34,35]. The detailed decision process is as follows: ...
Preprint
Full-text available
Cooperation on social networks is crucial for understanding human survival and development. Although network structure has been found to significantly influence cooperation, some human experiments indicate that it cannot fully explain the evolutionary patterns of cooperation. While evidence suggests that this gap arises from human exploration, our understanding of its impact mechanisms and characteristics, such as asymmetry, remains limited. Here, we seek to formalize human exploration as an individual learning process involving trial and reflection, and integrate social learning to examine how their interdependence shapes cooperation. We find that individual learning can alter the imitative tendency of social learning, while its cooperative tendency in turn relies on social learning and players' decision preferences. By analyzing a series of key experiments, we show that the above coupled dynamics can explain human behavior in game interactions. Furthermore, we find that individual learning can promote cooperation when its probability is negatively correlated with payoffs, a mechanism rooted in the psychological tendency to avoid trial-and-error when individuals are satisfied with their current payoffs. These results link long-unexplained asymmetric exploration with the cooperation-promoting ability of social networks, helping to bridge the gap between theoretical research and reality.
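As a hedged illustration of the mechanism highlighted in this abstract, the sketch below couples a simple donation-game payoff with an exploration probability that falls as payoff rises, so satisfied players rarely experiment. It is a toy model under stated assumptions (ring network, aspiration level, flip-the-action trials), not the authors' simulation, and it omits the social-learning component they analyze.
```python
# Toy model: trial-and-error exploration whose probability is negatively
# correlated with the player's current payoff. All parameters are assumptions.
import math
import random

def pd_payoff(my_move, their_move, b=1.5, c=1.0):
    # Donation-game payoffs: cooperating costs c and gives the partner b.
    return (b if their_move == "C" else 0.0) - (c if my_move == "C" else 0.0)

def explore_probability(payoff, aspiration=0.5, slope=2.0):
    # High exploration when payoff is below aspiration, low when satisfied.
    return 1.0 / (1.0 + math.exp(slope * (payoff - aspiration)))

random.seed(1)
strategy = {i: random.choice(["C", "D"]) for i in range(10)}  # ring of 10 players
for _ in range(200):
    payoffs = {i: sum(pd_payoff(strategy[i], strategy[j])
                      for j in ((i - 1) % 10, (i + 1) % 10))
               for i in strategy}
    for i in strategy:
        if random.random() < explore_probability(payoffs[i]):
            strategy[i] = "C" if strategy[i] == "D" else "D"  # trial: flip the action
print(sum(1 for s in strategy.values() if s == "C"), "cooperators out of 10")
```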
... The notion of ecological rationality was offered by Gerd Gigerenzer and colleagues as an alternative way of understanding heuristics (Gigerenzer and Todd, 1999a; Gigerenzer, 2000). Focusing on the interaction between cognition and the environment, ecological rationality explores the idea that particular heuristics might be adaptive solutions in particular environments. ...
Preprint
Full-text available
A new approach to understanding irrational behavior that provides a framework for deriving new models of human cognition. What does it mean to act rationally? Mathematicians, economists, and statisticians have argued that a rational actor chooses actions that maximize their expected utility. And yet people routinely act in ways that violate this prescription. Our limited time and computational resources mean that it is often unrealistic to consider all options in order to choose the one that has the greatest utility. This book suggests a different approach to understanding irrational behavior: resource-rational analysis. By reframing questions of rational action in terms of how we should make the best use of our limited resources, the book offers a new take on fundamental questions at the heart of cognitive psychology, behavioral economics, and the design of artificial intelligence systems. The book presents a formal framework for applying resource-rational analysis to understand and improve human behavior, a set of tools developed by the authors to make this easier, and examples of how they have used this approach to revisit classic questions about human cognition, pose new ones, and enhance human rationality. The book will be a valuable resource for psychologists, economists, and philosophers as well as neuroscientists studying human brains and minds and computer scientists working to reproduce such systems in machines.
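To make the contrast described in this blurb concrete, here is an illustrative sketch, not taken from the book, comparing a fully rational expected-utility choice over all options with a resource-bounded choice that evaluates only a small sampled subset. The actions, outcome probabilities, and evaluation budget are hypothetical.
```python
# Sketch: classical expected-utility maximization vs. a resource-bounded variant
# that spends a fixed evaluation budget. All numbers are illustrative assumptions.
import random

def expected_utility(action, outcomes):
    # outcomes[action]: list of (probability, utility) pairs for this action
    return sum(p * u for p, u in outcomes[action])

def fully_rational_choice(outcomes):
    return max(outcomes, key=lambda a: expected_utility(a, outcomes))

def resource_bounded_choice(outcomes, budget=2, seed=0):
    # Evaluate only `budget` randomly sampled actions and take the best of those.
    random.seed(seed)
    considered = random.sample(list(outcomes), k=min(budget, len(outcomes)))
    return max(considered, key=lambda a: expected_utility(a, outcomes))

outcomes = {
    "safe":  [(1.0, 1.0)],
    "risky": [(0.5, 3.0), (0.5, -1.0)],
    "wild":  [(0.1, 20.0), (0.9, -2.0)],
}
print(fully_rational_choice(outcomes))              # maximizes EU over all actions
print(resource_bounded_choice(outcomes, budget=2))  # best of a limited, cheaper search
```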
... But recent work in psychology has suggested a different, "ecological" conception of rationality, according to which what counts as rational behavior or rational thinking is context- as well as actor/thinker-dependent. Specifically, Gigerenzer (2000), Elqayam (2012), Schurz and Hertwig (2019), and others have argued that rather than following a small number of universally valid principles, rationality is a matter of being able to pick the right learning tools for each particular situation, where tools that may help one achieve one's cognitive goals in one situation may fail one in another, and where what are the right tools for one person may not be the right tools for another person. [12] Here, we want to argue that whether, and if so in what form, analogical reasoning is rational is context-dependent in precisely this way. ...
Article
Full-text available
Analogical reasoning is a form of non-deductive reasoning that gives special weight to similarity considerations. Here, we pursue an approach to formalizing this type of reasoning that was initiated by Carnap in posthumously published work. In it, Carnap abandoned his long-time project of trying to define inductive and analogical reasoning syntactically and introduced attribute spaces to model the meanings of predicates. While these spaces remain underdeveloped in Carnap’s late work, it is clear that what he envisioned is, or is close to, what are presently known as “conceptual spaces.” We use the conceptual spaces framework as it has been developed over the past two decades to make progress on formally representing analogical reasoning. This will also allow us to address the question of the normative status of analogical reasoning, which was raised by Carnap and others but which has hitherto remained unanswered. Finally, we work out some of the empirical content of our proposal and reanalyze a publicly available dataset to test it.
... Ambiguity leans towards the present due to the impact of mental models and heuristics. Social-psychological research has revealed that human attitudes and choices depend on simplified heuristics (e.g., the availability heuristic) [112-114]. These intuitive heuristics are presence-oriented and correspond with satisficing rather than optimizing strategies [27]. ...
Article
Full-text available
Background: Problems such as climate change, environmental pollution, nuclear disposal and unsustainable production and consumption share a common feature: they pose long-term challenges because of their complex nature, potentially severe consequences, and the demanding problem-solving paths. These challenges may have long-lasting impacts on both present and future generations and therefore need to be addressed through a long-term governance perspective, i.e., coherent and consistent policy-making across sectors, institutions, and temporal scales. Dealing with these challenges is a core task of policy-making in modern societies, which requires problem-solving skills and capabilities. In this context, we identify long-term governance traces in the literature, illustrate the case of the energy transition towards renewable energy systems as a long-term governance case, and elaborate on the scope and definition of long-term governance and its research. Main text: We elaborate an analytical framework for long-term governance (LTG), based on five building blocks: the ‘environment’, which details the policy-making arena; the ‘policy issues’, which elaborates on the problems to be dealt with by LTG; the ‘key challenges and driving force’, revealing LTG mechanisms; the ‘key strategies’, in which promising approaches for LTG are identified; and the ‘policy cycle’, where governance impacts on different policy phases are discussed. In essence, we understand long-term governance at its core as a reflexive policy-making process to address significant enduring and persistent problems within a strategy-based decision-making arena to best prepare for, navigate through, and experiment with a changing environment. Conclusions: The framework does not describe specific processes or individual cases in detail. Instead, it should be understood as an illustration of long-term governance characteristics at a more general level. Such a framework may help to structure the field of long-term policy-making, guide future research on conceptual, comparative, and empirical in-depth studies, and may provide orientation and action knowledge for making our governance system sustainable. Stimulating and broadening research on long-term issues seems indispensable, given the existence of several ‘grand challenges’ that require successful long-term governance.
... We can develop adaptive thinking by continuing to learn and by remaining open to new ideas. Gigerenzer (2000) explains that individuals and organizations can develop adaptive thinking skills in order to make good decisions in uncertain and changing situations. ...
Book
Full-text available
Industrial Management 5.0 is becoming increasingly popular. Industrial Management 5.0 prioritizes people and the environment, as fully elaborated in the processes and prospects of industrial digitalization. Industrial Management 5.0 can be described as a continuation of the Industry 4.0 design. Tasks and procedures left unresolved in Industry 4.0 are continued and completed by Industry 5.0 through the value co-creation of human dignity and environmental sustainability, resulting in a sustainable economy. Thus, Industrial Management 5.0 = Industry 4.0 + People + Environment.
... In AA the theses mentioned above are defended against the background of work on ecological rationality as found in particular in the psychological literature. While most philosophers hold on to the view that principles of rationality, whatever exactly they are, hold everywhere and always, an ecological approach to rationality has gained considerable popularity among cognitive psychologists over the last two decades (see, for example, Gigerenzer & Goldstein, 1996; Gigerenzer, 2000; Elqayam, 2011, 2012). According to this approach, we can answer questions about the rationality of a belief, or a change of belief, or an action only with respect to a specific person in a specific context. ...
Article
Full-text available
Victor Gijsbers, Fred Muller, and Eric Schliesser have engaged thoroughly with The Art of Abduction (AA) and have devoted an extensive review to it. The review contains many interesting observations as well as a fair amount of criticism. In this reply I first give a brief summary of the main theses of AA and then respond to the criticism.
... However, his calculations make no sense unless his data are independent and identically distributed. [25] Within environments where such premises hold, it may of course be adaptive for organisms to develop inductive propensities, whose scope would be more or less tied to the domain of the relevant material premises. Barkow et al. (1992) develops this theme with reference to the evolution of domain-specific mechanisms of learning and induction; Gigerenzer (2000) and Gigerenzer et al. (1999) consider proximate mechanisms and ecological aspects, and Holland et al. (1986) proposes a unified framework for modeling such inductive propensities in terms of generate-and-test processes. All of this, however, is more within the field of psychology than either statistics or philosophy, as (to paraphrase the philosopher Ian Hacking 2001) it does not so much solve the problem of induction as evade it. ...
Preprint
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework.
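The model-checking step emphasized in this abstract can be sketched, under simple assumptions, as a posterior predictive check for a normal model with known variance and a conjugate prior. The data, prior settings, and test statistic below are invented for illustration and are not the authors' example.
```python
# Hedged sketch of a posterior predictive check: simulate replicated data sets
# from the posterior and compare a test statistic with the observed one.
import numpy as np

rng = np.random.default_rng(42)
y = rng.normal(loc=2.0, scale=1.0, size=50)      # "observed" data; sd assumed known

# Conjugate posterior for the mean: prior N(0, 10^2), likelihood sd = 1.
prior_mu, prior_sd, sigma = 0.0, 10.0, 1.0
post_var = 1.0 / (1.0 / prior_sd**2 + len(y) / sigma**2)
post_mean = post_var * (prior_mu / prior_sd**2 + y.sum() / sigma**2)

# Draw posterior means, simulate replicated data, compare max(y_rep) with max(y).
draws = rng.normal(post_mean, np.sqrt(post_var), size=1000)
y_rep = rng.normal(draws[:, None], sigma, size=(1000, len(y)))
p_value = np.mean(y_rep.max(axis=1) >= y.max())
print(f"posterior predictive p-value for max(y): {p_value:.2f}")
```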
... We propose that inferences based on pseudocontingencies result in a bias in food environments, given the significant changes that have occurred in the food environment over recent decades relative to the food environment to which our ancestors were exposed. Literature has repeatedly stressed that the efficacy of a strategy that was previously adaptive can be diminished when the environment it was designed for changes (e.g., within the adaptive toolbox framework; Gigerenzer, 2002; Gigerenzer & Gaissmaier, 2011). The accessibility and affordability of highly processed, unhealthy foods has increased dramatically for much of the world's population, contributing significantly to the obesity epidemic (Baker & Friel, 2016; Swinburn et al., 2011). ...
... One such potential model (with rudimentary 'if/then' classifications like those employed in tree felling) has been termed Fast and Frugal Decision Trees (43). These heuristics are fast, with limited computation, and frugal, using only some of the perceived information, relying instead on rules for searching for information, stopping the search, and then making a decision (44,45,46). They are transparent, easy to teach and learn, and readily used by practitioners (40,42). ...
Technical Report
Full-text available
Skilled workers operating in uncertain environments develop a ‘sixth sense’ over time that affords them automatic responses, allowing attention resources to focus on processing unexpected events. This ability is called ‘embodied cognition’ – a framework that emphasises the significance of the worker’s physical body in cognitive processing. The idea is that the body’s interactions with the environment contribute to cognition, the mental action or process of acquiring knowledge and understanding through the senses. Scion’s Human Factors team takes a pragmatic approach to investigating decision making using this framework to see how it might influence the future of safe practice. Preliminary findings suggest that expert tree fallers have well-established proficiency enabling instantaneous decision making in volatile situations. To leverage this ability, the aim is to first map out the cognitive differences between experts and novices using physical and emotional measurements. Then, those somatic markers will be used to capture strategies processed automatically during expert decision making. Adaptive rules of thumb, ‘simple heuristics’, will be derived and harnessed to design an on-the-job learning approach. Ultimately, the methodology for studying embodied cognition in dynamic contexts could serve to amplify and extend human capability beyond safety critical tasks, changing the way workers interact with the operational forest environment.
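A minimal sketch of a fast-and-frugal decision tree of the 'if/then' kind described in the excerpt above, applied to a hypothetical go/no-go felling call. The cues, their ordering, and the exits are assumptions made for illustration, not rules from the cited report.
```python
# Fast-and-frugal decision tree sketch: each question can trigger an immediate
# exit; only if it does not do we read the next cue. Cues are hypothetical.
def fell_or_walk_away(wind_above_limit, visible_stem_damage, lean_blocks_escape_route):
    if wind_above_limit:
        return "walk away"        # first cue: unconditional stop
    if visible_stem_damage:
        return "walk away"        # second cue: stop on a visible danger sign
    if lean_blocks_escape_route:
        return "walk away"        # third cue: lean compromises the escape route
    return "fell"                 # all checks passed

print(fell_or_walk_away(False, False, False))  # -> "fell"
print(fell_or_walk_away(False, True, False))   # -> "walk away"
```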
... For canonical work, see Kahneman (1974, 1983), Kahneman (2011). [24] For canonical work, see Gigerenzer and Todd (2001); Gigerenzer (2002, 2008, 2010). [25] I regret not being able to give a more nuanced discussion to this comparison. I will however note one complication: one might wonder whether all biases studied by Kahneman and Tversky can be framed as solutions to underdetermination problems. ...
Article
Full-text available
The concept of bias is pervasive in both popular discourse and empirical theorizing within philosophy, cognitive science, and artificial intelligence. This widespread application threatens to render the concept too heterogeneous and unwieldy for systematic investigation. This article explores recent philosophical literature attempting to identify a single theoretical category—termed ‘bias’—that could be unified across different contexts. To achieve this aim, the article provides a comprehensive review of theories of bias that are significant in the fields of philosophy of mind, cognitive science, machine learning, and epistemology. It focuses on key examples such as perceptual bias, implicit bias, explicit bias, and algorithmic bias, scrutinizing their similarities and differences. Although these explorations may not conclusively establish the existence of a natural theoretical kind, pursuing the possibility offers valuable insights into how bias is conceptualized and deployed across diverse domains, thus deepening our understanding of its complexities across a wide range of cognitive and computational processes.
... People do not make optimal predictive judgments under uncertainty, but rather rely on heuristics and biases (Kahneman et al., 1982; Tversky & Kahneman, 1974, and much subsequent research). Gigerenzer (2000) refers to this as a cognitive illusion. Further discussion of calibration and biases can be found in Alexander (2013, and the papers reviewed there). ...
Article
Full-text available
How confident a student is about how they answer a question has important educational implications. Participants answered 10 mathematics questions and provided estimates of how likely it was that they got each individual item correct and of how many, in total, they answered correctly. They were overconfident in these metacognitive judgments. Some of the participants were asked to justify why their answers were either correct or incorrect prior to making these judgments. This lowered their confidence ratings. They were still overconfident, but less so than those in the control group. The instruction also affected the association between the confidence ratings and accuracy. No differences were observed between those asked to justify why their responses were correct versus those asked to justify why their responses were incorrect. Those asked to think about the accuracy of a response had lower confidence. This has important implications for understanding how we construct confidence judgments and, within education, how student confidence can be affected during assessments.
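The overconfidence measure implied by this abstract can be sketched with hypothetical numbers: mean item-level confidence minus the proportion of items answered correctly. The values below are invented, not the study's data.
```python
# Illustrative overconfidence calculation on made-up responses to 10 items.
confidence = [0.9, 0.8, 0.95, 0.7, 0.85, 0.6, 0.9, 0.75, 0.8, 0.7]  # per-item confidence
correct    = [1,   0,   1,    1,   0,    0,   1,   1,    0,   1  ]  # 1 = answered correctly

mean_confidence = sum(confidence) / len(confidence)
accuracy = sum(correct) / len(correct)
overconfidence = mean_confidence - accuracy   # positive values indicate overconfidence
print(f"mean confidence = {mean_confidence:.2f}, accuracy = {accuracy:.2f}, "
      f"overconfidence = {overconfidence:+.2f}")
```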
... This very primary foraging for what is external to the organism so that it may be internalized by the organism (this transaction of energy patterns from food to work to waste) is at the heart of inquiry. It forms the basis of what Gerd Gigerenzer rightly describes as the unconscious intelligence of our gut feelings (2007; see also Gigerenzer 2000, 2008; Elster 2000). We are patterns and thus bounded, interacting with other bounded patterns which we may incorporate within our bounds for purposes of viability and creativity. ...
Article
Full-text available
We articulate a conception of resilience via allostasis and the free energy principle to augment Nassim Nicholas Taleb’s conception of antifragility. Creative resilience is resilience 3.0, after robustness (1.0) and antifragility (2.0), because creative resilience is the deliberate effort to construct ecological niches toward a more caring and thus more viable world for more people – what Dewey proffered as the moral ideal of creative democracy. Viability is understood as the healthy tension between stability and precarity. Viability is related to regulatory mechanisms of homeostasis and allostasis. Along with discussion of the free energy principle, we situate these processes of the internal milieu within the external milieu. Ecological niche construction and ecological psychological affordances are related to human practices as means of generating culture. Culture is both a product and producer of the process of valuation, which originates in evolution but is further developed by the free energy principle and social allostasis. The collective languishing exacerbated by the Covid-19 pandemic can be ameliorated through creative resilience so conceived as social allostasis writ large.
... Ecological rationality, built upon the idea of Simonian bounded rationality, is equivalent to adaptive change in the Simonian sense, but focuses primarily on computability problems and on how agents use computationally cheap decision procedures instead of computationally costly ones, all other things held equal (Gigerenzer 2000), what theorists in this literature call "fast and frugal heuristics" (Gigerenzer and Todd 1999). Ecological rationality concerns itself with cataloguing environmental and choice contexts to create a language of cues and signals that compresses and reduces the complexity and combinatorial richness of the choice context. ...
Article
Full-text available
We develop a representation of creative evolution in economics based on the theory of the adjacent possible. We start by introducing an epistemological framework for economic theorizing that copes with unknowability and the unlistability of possibility spaces. From this framework, we discuss the use of knowledge in creatively evolving systems and derive four main results: that local knowledge is itself a mechanism of movement through the adjacent possible; that all action is entrepreneurial action; that causality is ambiguous; and that individuals can agree to disagree. We then apply these results to decision-making, innovation, and the emergence of institutions and commons in creatively evolving systems.
... A focus on developing what is desirable rather than eliminating what is undesirable corresponds, for example, with neuroscientific findings that it is much easier for a person to focus on something that is present than on something that is absent, and that any mental image (whether focused on the desirable or the undesirable) evokes a corresponding experiential state (Barrett, 2022; Gilbert, 2007). The importance of frugality combined with effectiveness is documented by research on fast and frugal heuristics (Gigerenzer, 2000) and by studies of complex systems, which show that complex systems cannot be fully understood, that deterministic causal relations cannot be established within them, and that their behavior cannot be fully predicted, yet they can nevertheless be meaningfully influenced through relatively simple processes (Capra & Luisi, 2016; Cilliers, 2002; Gribbin, 2005; Mitchell, 2011). Likewise, the focus on process, cooperation, working with changes (deviations), and improvisation combined with an individual approach finds support in cybernetics and theories of complex systems (de Shazer, 2017; B. Keeney, 1990, 2009; H. ...
Article
Full-text available
Couples psychotherapy represents a distinctive and developing area of psychotherapy. This text focuses on one approach to psychotherapeutic work with couples, namely the solution-focused approach. The aim of the text is to present the solution-focused approach and the possibilities for its practical application in the context of couples psychotherapy. The first part of the text presents the principles of the chosen approach and the basic methods that can be used in couples psychotherapy. The core of the text consists of two areas that are particularly important for working with couples using the solution-focused approach and that at the same time pose a certain challenge for psychotherapists using the solution-focused approach in couples therapy. The first of these areas is how to connect the partners' perspectives in the therapy session so that they can be heard, accepted, and further usefully developed in the conversation. The second area, to which this text devotes more detailed attention, is negotiating the preferred future (that is, the picture of the desired change) with the partners. The text describes ways of developing a shared picture of the desired change with the partners, as well as ways of working in situations where the partners' ideas about the preferred future differ or are even contradictory. The text is supplemented with examples from the author's practice and with illustrative examples.
... The psychologist Gerd Gigerenzer (2000) and his colleagues have for decades focused on the psychology of decision-making, above all in medical contexts. In numerous publications, he and his team give all the decision-makers involved, physicians as well as patients, a downright catastrophic assessment of their decision-making competence. ...
Preprint
Full-text available
Problem: For the creators and protagonists of the concept of evidence-based medicine, the latter is undoubtedly understood as a manifestation of rational decision-making, virtually the perfect embodiment of the idea of the Enlightenment. This turns out to be an illusion. Line of argument: Without ascribing a special role to medicine, what crystallizes here is a particularly clear expression of what Horkheimer and Adorno, in their studies on the "Dialectic of Enlightenment," call "instrumental reason," which, as merely technical-rational reason, reflects only on the means but no longer on the ends of actions. Alongside the increasing narrowing of the concept of reason to its technical conditionality, and reinforced most recently by the hype around machine learning methods bearing the derisive name "artificial intelligence," a detachment of the concept of reason from its humanistic origin is only a matter of time. Notwithstanding the society-wide significance of this problem, this polemic attempts to do justice to the particular features of the context of evidence-based medicine. It appears "self-evident" that the protagonists of evidence-based medicine have put together a construct of incontestability for themselves by conceiving, in a utilitarian manner, decisions based on supposedly highest evidence as an ethical imperative, whereby "highest evidence" is treated as given in the sense of a speech act but is communicated as binding in the sense of instrumental reason. The observed consequence is a monopoly position of evidence-based medicine, against which the medical-scientific community is no longer able to mount any effective resistance. Just as Horkheimer and Adorno assess the character of the observed tendency at the "large" scale of society, tyranny is also a natural characteristic at the "small" scale of the medical environment. Conclusion: An escape from this circle of reification is possible through reflection on what Enlightenment, rationalism, and thus scientific rigor really mean.
... Furthermore, many cognitive mechanisms that natural selection shapes will negatively influence standard scientific research, such as confirmation bias [1], essentialism [2], and teleological thinking [3]. Although this type of cognition is irrational with respect to scientific research, some scholars, for example Gerd Gigerenzer, think of it as ecological rationality because it is an adaptive toolbox by which human ancestors could adapt to the corresponding ecological environment [4]. If scientific cognition is distinct from ecological cognition, we cannot ground the cognitive origin of science in natural selection. ...
Preprint
Full-text available
Analyzing the origin and sustained development of science from the perspective of human cognitive evolution presents a puzzle, because the driving force behind cognitive evolution is natural selection, which favors biological traits that improve individual adaptation rather than traits that seek the truth. Some scholars argue that cultural evolution provides an explanation for the development (or progress) of science, wherein scientific beliefs can be transmitted intergenerationally within the scientific community, thus continually converging toward truth. However, this explanation has an ecological validity problem. Research on cultural learning strategies suggests that when choosing beliefs for imitation learning, people tend to consider contextual and content factors rather than the truth value of beliefs. On the other hand, because there is no universally accepted criterion for theory selection recognized by all scientists, the choice of a best theory always fails. I develop a minimal model of best-theory choice to defend the cultural-evolution approach to scientific progress. The criterion for best-theory choice marks the establishment of a scientific school and its line of demarcation from other schools. As a member of a scientific school, a scientist can select the best theory as his or her learning model according to the school's criterion. Hence, it is plausible for the approach to explain scientific progress. Finally, my model shows that a scientific school is a driver of local scientific progress and that competition between schools promotes global scientific progress; a school's size and the diversity of its scientific reasoning decide the outcome of the competition.
... Depending on whether one of the factors mentioned, or indeed an unmentioned factor, pushes itself into the foreground, intuition may no longer do justice to the overall situation and may lead to a wrong decision, or to ineffective or even harmful therapist behavior (Gigerenzer, 2000, 2007). ...
Article
Full-text available
Starting from the current discussion about the professional profile of psychological psychotherapists (Sachse, Fasbender & Hammelstein, 2014; Amrhein, 2014; Fliegel, 2014; Strauß & Nodop, 2014; Sulz, 2014a-d), the difference between psychotherapy and science is worked out. A discussion of science becomes unavoidable, in which the dominant status of RCT research is questioned (Henry, 1998; Kriz, 1996, 2000, 2007, 2010, 2014; Revenstorf, 2005, 2014; Sulz, 2014d). The need to open up to other research paradigms and research methods (such as field studies, qualitative research, and hermeneutics) is demonstrated. This leads to the conclusion that the best possible place for psychotherapy training is not to be found in university psychology, but where psychotherapy is practiced and where there are teachers (lecturers and supervisors) who are experienced psychotherapists (Sulz, 2014b,c; Sulz, Richter-Benedikt & Hebing, 2014; Sulz & Backmund-Abedinpour, 2014; Sulz & Hoenes, 2014). A university program beginning at age 18 is given no chance of providing the necessary framework for the development of a sufficiently mature therapist personality with the required competencies (Sulz & Sichort-Hebing, 2014). Personal self-experience and supervision are accorded great importance here (Hill, 2013).
... The pragmatics of science involves weeding out superfluous hypotheses, heuristic factors that speed up the search through alternatives and make this process finite, and the like (the works of G. Gigerenzer may serve as an example) [5], [6], [7]. ...
Article
Full-text available
The article deals with the problem of empirical confirmation of scientific hypotheses. The problem of confirmation remains one of the most debated topics in philosophy and methodology of sciences. It is concluded that the confirmation of hypotheses is not reducible to the sphere of logic; empirical confirmation of hypotheses is a complex process associated with the pragmatic parameters of scientific activity.
... People's general preference for intuitive processing has been associated with various psychological and behavioral phenomena, such as errors and biases in judgment and decision-making (Bakken et al., forthcoming; Gilovich et al., 2002; Kahneman, 2003; Mahoney et al., 2011; Shiloh et al., 2002), conspiracy beliefs (Barron et al., 2018), susceptibility to misinformation (Lazarević et al., 2021), prosociality and morality (Liang et al., 2021), stereotyping (Trent & King, 2013), ingroup bias (Kołeczek et al., 2022), and even criminal behavior (McClanahan et al., 2019). Intuition has also been hailed as an adaptive tool in settings that involve limited time and information (Bakken et al., forthcoming; Gigerenzer, 2000; Klein, 2015; Klein & Crandall, 1995). ...
Research
Full-text available
This paper critically re-examines the existence of rational decision-making within real-world and organizational contexts, moving beyond traditional normative and bounded rationality theories. Drawing from a social constructivist lens, it proposes that rationality does not merely pre-exist in nature but is constructed and performed through a triad of components: conventions, manufactured processes, and commodities. The study presents how tools, theoretical assumptions, and actor behavior work together to make rationality operational via performativity. By tracing how rational choice theory is enacted through educational norms, decision-making tools, and the consulting industry, the paper underscores that rationality persists not as an ideal, but as a socially sustained practice. This approach provides a compelling alternative to the dichotomy of normative vs. descriptive decision theories and opens new avenues for empirical inquiry into how rational behavior is constructed and institutionalized in organizations.
Article
Full-text available
Heuristics, characterized as concise cognitive shortcuts rooted in intuitive reasoning, are capable of facilitating swift judgments and cognitive efficiency, but also of introducing cognitive biases during decision-making. The judicial domain, renowned for its demanding decision-making processes, is an interesting field for studying heuristics. In this study, we developed a novel Judicial Heuristics Assessment Questionnaire (J-HAQ) and administered it to a sample of 52 judges (20 males, mean age = 45.50, SD = 8.10) on active duty in various courts across Greece. We also evaluated their analytical System 2 thinking skills using the Cognitive Reflection Test (CRT). This research pursued three objectives: (a) to explore the psychometric properties of the J-HAQ; (b) to investigate the correlation between judges' perceived use of heuristics/metacognitive awareness and their objective performance on reflective thinking; (c) to assess the correlation of self-reported usage of different heuristics and explore the influence of judges' demographics (educational level, gender, age, and years of experience) on the utilization of the reported heuristics in decision-making. Findings from a Principal Component Analysis of J-HAQ scores revealed four distinct factors (Availability, Confirmation Bias, Representativeness, and Anchoring) demonstrating sufficient reliability. We also report a significant correlation between CRT scores and reported use of the anchoring heuristic (ρ = 0.29, p = 0.04). Finally, we discovered two clusters defined by different awareness of the use of various heuristics, as well as a significant association of educational level with this usage. Despite the limitations of a relatively small sample size, these findings reveal a dynamic for further interesting results from research in this domain.
Chapter
This chapter explores the perspectives of various psychologists on significant global and psychological issues. Peter Suedfeld emphasizes the importance of rigorous scientific methodology in psychology, predicting a focus on positive psychology and interdisciplinary connections. Lewis P. Lipsitt expresses concern over overspecialization within psychology but notes increasing collaboration between basic and applied research. From the Global South, Isidore S. Obot highlights the challenges of poverty, ignorance, and disease, stressing the importance of public policy in addressing these issues. Other participants discuss topics such as human welfare, coping mechanisms, the psychological impact of technology, and the importance of understanding historical influence and leadership. Overall, the chapter underscores the need for a scientific approach to addressing complex, global challenges, with psychology playing a critical role in shaping future developments.
Article
Agent-based modeling (ABM) is a novel computational methodology for representing the behavior of individuals in order to study social phenomena. Its use is rapidly growing in many fields. We review ABM in economics and finance and highlight how it can be used to relax conventional assumptions in standard economic models. ABM has enriched our understanding of markets, industrial organization, labor, macro, development, public policy, and environmental economics. In financial markets, substantial accomplishments include understanding clustered volatility, market impact, systemic risk, and housing markets. We present a vision for how ABMs might be used in the future to build more realistic models of the economy and review some of the hurdles that must be overcome to achieve this. (JEL C63, D00, E00, G00)
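As a hedged illustration of the kind of model the review surveys, the sketch below implements a minimal heterogeneous-agent market with fundamentalist and chartist demand and a price that moves with excess demand. All parameter values are assumptions chosen for illustration; the sketch makes no claim to reproduce the specific findings listed in the abstract.
```python
# Minimal agent-based market sketch: fundamentalists trade toward a fundamental
# value, chartists follow the recent trend, and the price moves with excess demand.
import numpy as np

rng = np.random.default_rng(7)
n_fund, n_chart, steps = 50, 50, 500
fundamental_value = 100.0
prices = [100.0, 100.5]

for _ in range(steps):
    p_now, p_prev = prices[-1], prices[-2]
    fund_demand = n_fund * 0.05 * (fundamental_value - p_now)   # buy below value, sell above
    chart_demand = n_chart * 0.04 * (p_now - p_prev)            # extrapolate the last price change
    noise = rng.normal(0.0, 0.5)
    excess = fund_demand + chart_demand + noise
    prices.append(p_now + 0.01 * excess)                        # price impact of excess demand

returns = np.diff(np.log(prices))
print(f"return volatility: {returns.std():.4f}")
```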
Article
Full-text available
Interest in structural equation models has grown in behavioral science research. A researcher using structural equation models can construct a theory-based statistical model that links theoretical concepts lying beyond our direct observations to the data the researcher has collected, and test whether the model fits the data. In addition to such confirmatory analysis, the data can also be examined exploratively, for example by modeling the heterogeneity of the data through the search for latent groups hidden within it. The versatility of structural equation modeling is illustrated by the fact that confirmatory and exploratory analyses can be conducted in a variable-centered or person-centered manner, or by combining the two. As structural equation modeling has become more common, so has the need to understand how theoretical concepts are connected to the model and what philosophy-of-science assumptions this involves. The purpose of this article is to present a conceptual model with which a behavioral science researcher can examine the relationships between the theoretical concept (phenomenon) being modeled, the data, the statistical model, and the substantive theory. The conceptual model is grounded in scientific realism, according to which reality contains both directly observable phenomena and mind-independent phenomena beyond direct observation, which it is both possible and meaningful to study. Finally, we show how the methodological choices made when designing empirical research connect to our conceptual model. [Translated from Finnish.]
Article
Purpose - Individuals with ecological intelligence can meet their needs while causing less harm to the environment and can thus contribute to the sustainability of the ecosystem. Ecological intelligence plays an important role in sustainable consumption becoming a lifestyle. Accordingly, the aim of this study is to examine the relationship between ecological intelligence and sustainable consumption behavior. Method - A questionnaire was used as the data collection instrument. The research sample consists of 643 people, and convenience sampling was chosen as the sampling method. SPSS and SmartPLS software packages were used to analyze the data. Findings - The composite reliability values of the economic and social dimensions of ecological intelligence and of the environmental sensitivity, unnecessary purchasing, saving, and reusability dimensions of sustainable consumption range between 0.823 and 0.914, while Cronbach's alpha values range between 0.683 and 0.877. Discussion - According to the results, an increase in the social dimension of ecological intelligence positively affects the environmental sensitivity, reusability, and saving dimensions of sustainable consumption behavior, and reduces unnecessary purchasing behavior. These results are consistent with previous findings in the literature. No statistically significant relationship was found between the economic dimension of ecological intelligence and sustainable consumption behavior; this result differs from previous studies in the literature. [Translated from Turkish.]
Thesis
This research addresses interactions in urban environments—when defined as interaction types and used as a metric—as they provide an effective strategy for understanding the relationship between urban heterogeneity and how we are visually sustained through engagement with our surroundings. With most of us now living in cities, the importance of visual sustainability in urban design strategy should not be underestimated. Interaction type analysis is key to bridging the gap in knowledge which lies in the challenges posed by the levels of subjectivity inherent in how we see, what we see, and the difficulty in measuring visual sustainability. The aim of this study is to explore the philosophy behind how we are sustained by what we see and its relevance to urban design. By using a mixed methods approach, the research shows how a practical application of Bergson’s philosophy can be reconciled with urban design at a strategic level to establish an operational logic for understanding urban environments, one which does not require us to identify the meaning or even what it is people have looked at. The findings suggest that urban density plays less of a role than we might expect and what is more influential are the elements that hold people’s attention, in other words, levels of urban activity. The variables comparison points to the proposal that interaction types have a role to play in urban design strategy. To understand visual sustainability better we need to understand three things. Firstly, the role duration plays in the types of interaction we have with our surroundings. Secondly, how elements that we cannot see exist on every site, and are important in understanding not only existing conditions properly but the potential for development. Thirdly, that these elements that we cannot see are valid, real structures—as real as the physical structures which act as proxies for them. The main finding of this study is the suggestion that visual interaction types are the building blocks of visual sustainability when considered in the context of urban design strategy. What difference this makes depends on the level of analysis—whether student or practitioner, commercially oriented, in terms of spatial health and well-being, or at a more abstract level, in personal development and growth. But the overarching consideration is that interaction types are able to reveal where the real city lies and by real city is meant the city we pay attention to. The emphasis going forward must be on an effective implementation of urban design strategy by including interaction type because, as city dwellers, it is we who stand to benefit the most.
Article
Full-text available
In science and beyond, quantifications are omnipresent when it comes to justifying judgments. Which scientific author, hiring-committee member, or advisory board panelist has not been confronted with page-long publication manuals, assessment reports, or evaluation guidelines calling for p-values, citation rates, h-indices, or other numbers to judge the ‘quality’ of findings, applicants, or institutions? Yet, many of those of us relying on and calling for quantifications may not understand what information numbers can convey and what they cannot. Focusing on the uninformed usage of bibliometrics as a worrisome outgrowth of the increasing quantification of science, in this opinion essay we place the abuse of quantifications into historical contexts and trends. These are characterized by mistrust in human intuitive judgment, obsessions with control and accountability, and a bureaucratization of science. We call for bringing common sense back into scientific (bibliometric-based) judgment exercises. Despite all number crunching, many judgments—be it about empirical findings or research institutions—will neither be straightforward, clear, and unequivocal, nor can they be ‘validated’ or ‘objectified’ by external standards. We conclude that assessments in science ought to be understood as, and be made as, judgments under uncertainty.
Article
Full-text available
This paper presents the theoretical background and empirical findings from two longitudinal experimental research projects conducted by Toshio Yamagishi, which aim to elucidate the evolutionary, socio-cultural, and neuroscientific foundations of human prosociality as social niche constructors. These studies explore how social environments are shaped and sustained by the social behaviors of individuals and how these behaviors are underpinned by neurological and socio-cultural mechanisms.
Article
In abductive reasoning, scientific theories are evaluated on the basis of how well they would explain the available evidence. There are a number of subtly different accounts of this type of reasoning, most of which are inspired by the popular slogan 'Inference to the Best Explanation.' However, these accounts disagree about exactly how to spell out the slogan so as to avoid various problems for abductive reasoning. This Element aims, firstly, to give an opinionated overview both of the many accounts of abductive reasoning that have been proposed and the problems that have motivated them; and, secondly, to critically evaluate these accounts in a way that points toward a systematic view of the nature and purpose of abductive reasoning in science. This title is also available as Open Access on Cambridge Core.
Book
Full-text available
Modern institutional economics was created to study the institutions of pre-digital economies and is based on reductionist approaches. But digital capitalism is producing institutions of unprecedented complexity. This book argues, therefore, that not only the economic institutions themselves but also the theoretical foundations for studying those institutions must now be adapted to digital capitalism. The book focuses on the institutional complexity of digital capitalism, developing an interdisciplinary framework which brings together cutting-edge theoretical approaches from philosophy (first of all, object-oriented ontology), sociology (especially actor–network theory), evolutionary biology, and cognitive science. In particular, the book outlines a new approach to the study of institutional evolution, based on extended evolutionary synthesis – a new paradigm in evolutionary biology, which is now replacing neo-Darwinism. The book develops an enactivist notion of extended cognition and cognitive institutions, rejecting the individualistic and mechanistic understanding of economic rationality in digital environments. The author experiments with new philosophical approaches to investigate institutional complexity, for example, the ideas of the flat ontology and the assemblage theory. The flat ontology approach is applied to the study of human–robot institutions, as well as to thinking about post-anthropocentric institutional design. Assemblage thinking allows for a new (much less idealistic) look at blockchain and smart cities. Blockchain as digital institutional technology is considered in the book not from the viewpoint of minimizing transaction costs (as is customary in the modern institutional economics), but by using the theory of transaction value which focuses on improving the quality of digital transactions. The book includes a wide range of examples ranging from metaverses, cryptocurrencies, and big data to robot rules, smart contracts, and machine learning algorithms. Written for researchers in institutional economics and other social sciences, this interdisciplinary book is essential reading for anyone interested in the interplay of institutional and digital change.
Article
Full-text available
This interdisciplinary study, coupling philosophy of law with empirical cognitive science, presents preliminary insight into the role of emotion in criminalization decisions, for both laypeople and legal professionals. While the traditional approach in criminalization theory emphasizes the role of deliberative and reasoned argumentation, this study hypothesizes that affective and emotional processes (i.e., disgust, as indexed by a dispositional proneness to experience disgust) are also associated with the decision to criminalize behavior, in particular virtual child pornography. To test this empirically, an online study (N = 1402) was conducted in which laypeople and legal professionals provided criminalization ratings on four vignettes adapted from criminal law, in which harmfulness and disgustingness were varied orthogonally. They also completed the 25-item Disgust Scale-Revised (DS-R-NL). In line with the hypothesis, (a) the virtual child pornography vignette (characterized as low in harm, high in disgust) was criminalized more readily than the financial harm vignette (high in harm, low in disgust), and (b) disgust sensitivity was associated with the decision to criminalize behavior, especially virtual child pornography, among both lay participants and legal professionals. These findings suggest that emotion can be relevant in shaping criminalization decisions. Exploring this theoretically, the results could serve as a stepping stone towards a new perspective on criminalization, including a “criminalization bias”. Study limitations and implications for legal theory and policymaking are discussed.
Chapter
From the perspective of a behavioral economic theory of lobbying, representation of special interests starts earlier than in a rational choice theory of lobbying. It already plays a role when it comes to the formation of political preferences, and it starts, for example, when expectations are formed under uncertainty in the political debate. Advocacy can also take advantage of phenomena such as conflicting and ambiguous individual preferences. This chapter discusses the specifically behavioral economics view on politics and how phenomena such as expressive political behavior, framing, and the deliberate exploitation of heuristics and biases can provide levers for special interest advocacy. The theoretical considerations are illustrated by two case studies.
Preprint
Full-text available
Subconscious retro-prediction, in conjunction with brain-state update vetting cycles, is instrumental in the generation of higher-order conscious perceptions and in all abstract thought. Support for this hypothesis is provided by conducting a physiological re-evaluation of the self-referential statements in set theory and formal logic known as antinomies. The overall analysis concludes that the cyclical behavior exhibited by recursive enigmas such as "Russell's Paradox" and the "Barber Paradox" results from a holistic categorical information-processing malfunction and not purely from a logical failing due to the unrestricted use of the axiom of comprehension.
Chapter
In recent years we have faced huge uncertainty and unpredictability across the world: Covid-19, political turbulence, climate change and war in Europe, among many other events. Through a historical analysis of worldviews, Peter Haldén provides nuance to the common belief in an uncertain world by showing the predictable nature of modern society and arguing that human beings create predictability through norms, laws, trust and collaboration. Haldén shows that, since the Renaissance, two worldviews define Western civilization: first, that the world is knowable and governed by laws, regularities, mechanisms or plan, hence it is possible to control and the future is possible to foresee; second, that the world is governed by chance, impossible to predict and control and therefore shocks and surprises are inevitable. Worlds of Uncertainty argues that between these two extremes lie positions that recognize the principal unpredictability of the world but seek pragmatic ways of navigating through it.
Chapter
Sensemaking processes are regarded as a relevant conceptualization of how users interact with information visualizations. Nevertheless, there is little research about the specific sensemaking strategies users adopt when they work with visualizations. Psychological theories about human thinking and reasoning and theories from the area of graph comprehension are relevant approaches that should be taken into account when investigating sensemaking processes with information visualizations. In these areas, there is more detailed research about problem-solving strategies (e.g., in mathematical problem-solving) that could be relevant for information visualization. We provide an overview of interesting approaches and in which way they are relevant for interactions with visualizations. We describe an exploratory investigation with 18 computer science students performing a realistic task using a visual analytics system. The result of this investigation was a set of eleven sensemaking strategies. We discuss whether these strategies can be generalized across different visualizations and compare the results to results from other studies we have conducted in this area. We also present examples for recommendations based on such research.
Article
Full-text available
It is proposed that reasoning about social contracts, such as conditional promises and warnings, is under the control of a compound schema made of two pragmatic schemas (Cheng & Holyoak, 1985), expressing an obligation and a permission. Two experiments were run using thematic versions of the Wason selection task in which the rule and the core of the scenario were kept constant and the point of view of the actor (e.g. promisor or promisee) was varied. The results supported the predictions (including the occurrence of a correct pattern of response that consists of all four cards) and falsified predictions derived from Cosmides' (1989) theory of social exchange. The mental models theory and Evans' two-stage theory of reasoning are also discussed in the light of the present results.
Article
Full-text available
Proceedings of the Nineteenth Annual Meeting of the Berkeley Linguistics Society: General Session and Parasession on Semantic Typology and Semantic Universals (1993)
Article
Full-text available
True to their name, vampire bats consume from 50 to 100 percent of their body weight in blood every night. A bat who fails to feed will perish in two days - unless it can solicit food from a roostmate. The key to survival for these animals is an elaborate system of food sharing, which the author finds is based on the principle of reciprocal altruism.
Article
Full-text available
K. J. Gergen's (1982) argument that hypotheses in social psychology are not empirical propositions is critically examined and shown to be erroneous. Nevertheless, this article demonstrates that, without necessarily appearing obvious, some hypotheses can be derived from propositions that are like tautologies and that their confirmation as such is of little interest. An analysis of hypotheses in recent articles in the Journal of Experimental Social Psychology and the Journal of Personality and Social Psychology suggests that hypotheses derivable from propositions very much like tautologies may not be infrequent. Implications are considered for what kinds of social psychology experiments are of value to perform. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
Full-text available
Describes 3 experiments in which housewives (N = 142) estimated the probability (p) of the presence of a disease which had been indicated by diagnostic equipment. Although data given to Ss indicated p .5, Ss consistently estimated p .8. Successive trials altered the order of presentation of data and progressively reduced the data given. However, Ss always gave closely similar p values, accompanied by high confidence ratings. 2 hypotheses are examined to account for these findings. A 3rd experiment suggests the conclusion that the most important factor is that Ss import a rigid prior probability from their previous experience and ignore numerical data. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
Full-text available
Demonstrated, using an interference paradigm, that there is a place for a direct coding mechanism in a comprehensive theory of frequency coding. Ss were presented words whose frequency was judged later. Under one set of instructions, these words were coded in terms of numerical associates; under another set of instructions, the coding was governed by nonnumerical associates. The condition using numerical associates resulted in frequency estimations that were of lesser quality than those produced in the control condition. This effect, moreover, was a function of the encoding of the target words, not just their retrieval. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
Full-text available
Previous experiments on human inference found Ss to be conservative in their revision of probability estimates. It was hypothesized that such inferences are due to conservative expectations about the process of sampling data from target populations. Ss were male undergraduates. In Exp. I, estimated binomial sampling distributions became increasingly conservative, too flat, as the size of the sample increased and as binomial probabilities departed from .5. These effects are in the same direction as those previously reported for conservatism in revision of probability estimates. In Exp. II, Ss made inferences by revising probabilities in light of new data and also estimated binomial sampling distributions. Probability revisions were better predicted by estimated rather than theoretical binomial sampling distributions. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
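For readers who want to see the normative benchmark referred to above, the following sketch (with made-up parameters, not the experiment's stimuli) computes theoretical binomial sampling distributions and shows how the probability mass near the true proportion grows with sample size, which is exactly what "too flat" subjective distributions understate.

```python
from math import comb

# Theoretical binomial sampling distribution P(k successes in n draws | p),
# the normative benchmark against which "too flat" estimates are judged.
def binomial_pmf(n, p):
    return [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

p = 0.8  # illustrative success probability
for n in (10, 40):
    pmf = binomial_pmf(n, p)
    # probability that the sample proportion k/n falls within +/- 0.1 of p
    mass = sum(prob for k, prob in enumerate(pmf) if abs(k / n - p) <= 0.1)
    print(f"n={n:2d}: P(|k/n - {p}| <= 0.1) = {mass:.3f}")
# The mass near p increases with n; a flat subjective distribution misses this.
```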
Article
Full-text available
A statistical test leads to a Type I error whenever it leads to the rejection of a null hypothesis that is in fact true. The probability of making a Type I error can be characterized in the following 3 ways: the conditional prior probability, the overall prior probability, and the conditional posterior probability. In this article, we show (a) that the alpha level can be equated with the 1st of these and (b) that it provides an upper bound for the second but (c) that it does not provide an estimate of the third, although it is commonly assumed to do so. We trace the source of this erroneous assumption first to statistical texts used by psychologists, which are generally ambiguous about which of the 3 interpretations is intended at any point in their discussions of Type I errors and which typically confound the conditional prior and posterior probabilities. Underlying this, however, is a more general fallacy in reasoning about probabilities, and we suggest that this may be the result of erroneous inferences about probabilistic conditional statements. Finally, we consider the possibility of estimating the (posterior) probability of a Type I error in situations in which the null hypothesis is rejected and, hence, the proportion of statistically significant results that may be Type I errors. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
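A small worked example may help separate the probabilities the article distinguishes; the prior share of true nulls and the power used below are hypothetical values chosen only for illustration.

```python
# Illustrative numbers only: they show why alpha (the conditional prior
# probability of a Type I error) is not the posterior probability that a
# rejected null hypothesis is true -- the confusion the article analyzes.
alpha = 0.05   # P(reject | H0 true)
power = 0.80   # P(reject | H0 false), assumed for illustration
p_h0  = 0.50   # assumed prior share of true nulls (hypothetical)

p_reject = alpha * p_h0 + power * (1 - p_h0)
p_h0_given_reject = alpha * p_h0 / p_reject   # Bayes' theorem

print(f"alpha = {alpha:.2f}")
print(f"P(H0 true | rejection) = {p_h0_given_reject:.3f}")
# ~0.059 with these inputs; it can be far larger when true nulls are common
# or power is low, so alpha alone says little about this posterior.
```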
Article
Full-text available
Evolutionary approaches to judgment under uncertainty have led to new data showing that untutored subjects reliably produce judgments that conform to many principles of probability theory when (a) they are asked to compute a frequency instead of the probability of a single event and (b) the relevant information is expressed as frequencies. But are the frequency-computation systems implicated in these experiments better at operating over some kinds of input than others? Principles of object perception and principles of adaptive design led us to propose the individuation hypothesis: that these systems are designed to produce well-calibrated statistical inferences when they operate over representations of "whole" objects, events, and locations. In a series of experiments on Bayesian reasoning, we show that human performance can be systematically improved or degraded by varying whether a correct solution requires one to compute hit and false-alarm rates over "natural" units, such as whole objects, as opposed to inseparable aspects, views, and other parsings that violate evolved principles of object construal. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
Full-text available
Ten reasons are proposed to explain why Kelley’s attribution theory and related questionnaire research fail to account for the cognitive processes underlying ordinary explanations of behaviour. In this first paper, five salient conceptual problems of the ANOVA model are reviewed: the inadequate analysis of behaviours and their contexts, and the inferences to which they give rise; deployment of the vague and ambiguous internal-external distinction; the normative view of people as poor social scientists; the crude analogy between ordinary reasoning and statistical procedures; and, fundamentally, the naive conceptualization of cognitive processes. Schank’s computational theory of social inference is identified as a sophisticated process model which avoids the conceptual problems inherent in the ANOVA model.
Chapter
"Social cognition" is a catchphrase that has gained great importance in social psychology in recent years. Characterizing "social cognition" as "social psychology within the information-processing paradigm" already involves several assumptions that require explanation. It must be clarified what is to be understood by the term "paradigm" and what is meant by "information processing." [Translated from German.]
Article
Screening for anti-HIV 1 revealed 262 donors with reactive results that were reproducible from donation to donation, whereas their immunoblots (Western blot) were non-reactive or doubtful. This series was compared with 92 donors who had reactive results for anti-HIV 1+2 that were reproducible from donation to donation. The results of series A could not, as a rule, be confirmed by test system B. The identity of the detected (cross-reacting) antibodies is not known. Cross-reactions against other viral antigens, as well as disturbances of the test system by immune complexes and antinuclear factors, are suspected to be responsible for the appearance of these antibodies. A solution to these problems should be sought not only with regard to transfusion safety but also with regard to the rehabilitation and potential re-admission of barred donors.
Article
This paper attempts further to explicate and justify the belief, held by a number of critics of mainstream psychology, that much customary empirical research tells one little that could not have been known without it. Apart from questions of tautology or indeterminate relations to observation, many hypotheses are derivable from propositions that are unfalsifiable because they cannot be tested without relying on conceptualizations which imply the propositions themselves. Experiments that serve no purpose beyond the operationalization of such hypotheses are a misguided enterprise.
Article
This chapter introduces social judgment theory (SJT). It focuses upon the conceptual structure of the framework and traces its development from the roots in Brunswik's probabilistic functionalism to its present form. SJT is a general framework for the study of human judgment. It is a metatheory which gives direction to research on judgment. SJT is the result of a systematic application of Brunswik's probabilistic functionalism to the problem of human judgment in social situations. Brunswik's theory of perception is also called "cue theory." According to such a theory, a person does not have access to any direct information about the objects in the environment. Instead, perception is seen as an indirect process, mediated by a set of proximal cues. In accordance with this view, SJT defines judgment as a process which involves the integration of information from a set of cues into a judgment about some distal state of affairs. The lens model illustrates an important methodological principle in SJT: the Principle of Parallel Concepts. This principle states that the cognitive system and the task system must be described in terms of the same kinds of concepts.
Chapter
The fundamental problems for research in social judgment theory (SJT) are to understand and improve achievement and agreement. Descriptions of judgment must contribute to the solution of these problems. This chapter explores the learning from studies of policy capturing about those factors that affect achievement and agreement. The typical policy capturing study does not vary any characteristics of the task, but presents the same task in the same form to all subjects in the study. Policy capturing is used as a general term for studies that analyze judgments made on the basis of multidimensional stimuli by means of a linear model. Policy capturing implies that the subjects have some policy that can be captured, and that the conditions under which the subjects are asked to make their judgments are related to the conditions under which they normally do so. A task may be nonrepresentative in at least two different ways—namely, (1) with respect to the format of the information provided to the subjects, and (2) with respect to the formal statistical properties of the task.
Chapter
This chapter discusses various uses of connectives and examines whether such uses are content- and context-sensitive. It is clear that a monologic account of propositional thinking related to truth-functional logic, which has been the favored approach of psychologists, is insufficient if one considers the uses of the connectives in everyday speech. In addition, a dialogic—a rhetorical approach—that focuses on the speech act and considers what the speaker means by relevant interpersonal context is necessary. Such an account provides an analysis of the way people manage to go beyond what is explicitly said to understand the communications addressed to them. Apart from this, the conceptual apparatus for an account of indirect communication includes a theory of speech that relates with the principles of cooperative conversation, mutually shared background information, and the ability of the addressee to make inferences. The chapter constitutes a study of lexical semantics that demonstrates the contextual control of word use and understanding.
Article
The archaeology of the past two decades has become increasingly quantitative, computerized, statistical, and this is as it should be. All right-thinking archaeologists begin with samples and attempt to generalize about the populations from which their samples were drawn. Statistical theory has evolved to assist investigators in making just this important inferential step and archaeologists have increasingly turned to statistics to square their research with the canons of Science. But the statistical revolution in archaeology is not without its price. We must now face the fact that all applications of statistics to archaeology can no longer be applauded. The archaeological literature is badly polluted with misuses and outright abuses of statistical method and theory. This paper discusses some of these faulty applications and makes some recommendations which, if heeded, should improve the quality of quantitative methods in archaeology.
Article
The question of the generalizability of laboratory experiments to the "natural settings of ordinary people" was investigated in a case study on the frequency-validity relationship. Previously advocated by John Locke and David Hartley, this relationship states that the mere repetition of plausible but unfamiliar assertions increases the belief in the validity of the assertions, independent of their actual truth or falsity. The external validity of this relationship was tested for a random sample drawn from telephone listings of adults living in Schwabing, Munich. Subjects were tested in their homes rather than in a university laboratory. The increase in mean validity judgments by repetition, its independence from actual truth or falsity, as well as the absolute and relative size of the effect were found to be in excellent agreement with previous laboratory findings. The external validity of the frequency-validity relationship would therefore seem to be demonstrated. In addition, the relationship seems independent of the intersession intervals, the time intervals between the assertions, and the sex of the person making the assertions. This result is consistent with the hypothesis of "automatic" processing of frequency.
Article
The rationale of customary "null hypothesis testing" procedures of statistical inference is examined. This approach is not incorrect, but it is prone to misuse and misinterpretation, including neglect of "power" and inappropriate conclusions based on conventional significance levels. The estimation approach, which often seems preferable, is briefly described. The kind of reasoning involved in statistical inference is required whenever we wish to assess the evidence relevant for or against any general proposition, whether we make any formal computations or not, and whether or not we have observed all possible real instances of relevant evidence. Statistical inference is logically unproblematic if we interpret it as a way of assessing the evidence more clearly. But statistical results cannot be directly converted into probabilities of the truth of hypotheses. This requires additional assumptions about appropriate probabilities of the hypotheses prior to consideration of the research evidence.
Article
A discussion of the editorial policies and philosophy over the past 12 years of the author's editorship of the Journal of Experimental Psychology. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
In 1938, R. S. Woodworth published a seminal textbook, "Experimental Psychology," known popularly as the "Columbia Bible." This article examines the origins of the book in early mimeographed versions, the circumstances of its publication, and its reception by the academic community. In the 1938 version of the text, Woodworth narrowed the definition of an "experiment" to the active manipulation of an "independent variable," and he clearly excluded mental testing from experimental psychology. Woodworth's 1938 definition of experiment became nearly universal in psychology textbooks, and this change had important implications for psychology.
Article
Most of people's apparent strategies for covariation assessment and Bayesian inference can lead to errors. However, it is unclear how often and to what degree the strategies are inaccurate in natural contexts. Through Monte Carlo simulation, the respective normative and intuitive strategies for the two tasks were compared over many different situations. The results indicate that (a) under some general conditions, all the intuitive strategies perform much better than chance and many perform surprisingly well, and (b) some simple environmental variables have large effects on most of the intuitive strategies' accuracy, not just in terms of the number of errors, but also in terms of the kinds of errors (e.g., incorrectly accepting versus incorrectly rejecting a hypothesis). Furthermore, common to many of the intuitive strategies is a disregard for the strength of the alternative hypothesis. Thus, a key to better performance in both tasks lies in considering alternative hypotheses, although this does not necessarily imply using a normative strategy (i.e., calculating the φ coefficient or using Bayes' theorem). Some intuitive strategies take into account the alternative hypothesis and are accurate across environments. Because they are presumably simpler than normative strategies and are already part of people's repertoire, using these intuitive strategies may be the most efficient means of ensuring highly accurate judgment in these tasks.
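In the spirit of the simulations described above (though not a reproduction of them), the sketch below pits one hypothetical intuitive covariation rule against the sign of the normative φ coefficient over randomly generated 2×2 tables; the table-generation scheme and the particular heuristic are assumptions made only for illustration.

```python
import random

# Monte Carlo sketch: how often does a crude "cell A versus cell B" heuristic
# agree with the sign of the normative phi coefficient across random tables?
def phi(a, b, c, d):
    denom = ((a + b) * (c + d) * (a + c) * (b + d)) ** 0.5
    return 0.0 if denom == 0 else (a * d - b * c) / denom

def intuitive_rule(a, b, c, d):
    # Judge a positive relation iff cell A exceeds cell B (hypothetical rule).
    return a > b

rng = random.Random(0)
agree = trials = 0
for _ in range(10_000):
    a, b, c, d = (rng.randint(1, 50) for _ in range(4))  # random 2x2 table
    trials += 1
    agree += intuitive_rule(a, b, c, d) == (phi(a, b, c, d) > 0)

print(f"Heuristic agrees with the sign of phi in {agree / trials:.0%} of tables")
```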
Article
The English mathematician Charles Babbage, who during the first half of the nineteenth century invented the precursors of today's computers, was keenly interested in the economic issues of the Victorian era. His calculating engines were an application of contemporary theories on the division of labour and provided models for the rationalisation of production. Babbage's ideas contributed to the dehumanisation of labour but were also the source of major discoveries. The mathematician's history was closely linked to that of the industrial revolution, cradled in England, the 'workshop of the world'. This article recalls the effervescence of that period.
Article
Peter Wason invented the Selection Task in 1966. Thirty years and many, many experiments later, two results are evident for me. First, the view that sound reasoning can be reduced to propositional logic (or first order logic) is myopic. Human thought operates in more dimensions than entailment and contradiction (Strawson, 1952). We need to work out how the mind infers the meaning of a conditional statement from the content of the Ps and Qs and the social context in which the statement is uttered, rather than exclaiming “Cognitive illusion! Hurrah! Error!” whenever human reasoning cannot be reduced to propositional logic. Second, the hope that these “errors” or their flip-side, the “facilitation” of logical reasoning, would be the royal road to discovering the laws of human reasoning did not materialise. This hope was fueled by the (mislead
Article
Social cognition is established to serve as a new forum for the ever increasing numbers of investigations. This chapter examines the current theories of social cognition and the intellectual history out of which social cognition has arisen. It also reveals the pervasive and continuing influence of two generic traditions within psychology: associationism and constructionism. It is salutary in the development of social cognition to explore its intellectual history, its associationist, and its constructionist traditions to address the question of whether or not social cognition represents an important theoretical advance. In the chapter, the roots of associationism as well as its present application in social cognition research and the roots of constructionism and its present application in social cognition research are also explored. The central concerns of cognitive psychology are representation and processing. These issues require the building and testing of theoretical constructs and models. The formation or processing of representations is directly observed and introspection regarding cognitive processes is problematic. The chapter also focuses on those investigations of social cognition that explicitly apply associationist or constructionist theory.
Article
The empiricism of eighteenth-century experimental science meant that the development of scientific instruments influenced the formulation of new concepts; this was a two-way process, for new theory also affected instrument design. This relationship between concept and instrumentation will be examined by tracing the development of electrical instruments and theory during this period. The different functions fulfilled by these devices will also be discussed. Empiricism was especially important in such a new field of research as electricity, for it gave rise to phenomena that could not have been predicted by theory alone. However, the interpretation of these phenomena, and what the natural philosopher thought he observed, were often unconsciously determined by current ideas and attitudes; the interaction between instrumentally induced phenomena and observation was more complex than was realized at the time. The shortcomings of this empirical approach will be discussed; in the case of electricity these became increasingly apparent during the latter part of the century. The many discoveries had to be placed in a unifying framework before new advances could be made. Instruments, however, continued to play an important role in scientific progress, for they made visible what was hidden in nature.
Article
After extensive training, subjects had to assess subjective probability distributions for uncertain quantities. Some of these quantities served as posterior probabilities, others as likelihoods for which, by Bayes' theorem, posterior probabilities could be derived. Two different kinds of evaluation criteria were used: The first refers to the goodness of individual assessments, defined as the score of a proper scoring rule. The second criterion refers to the information inherent in the assessments. The question is whether posterior probabilities derived from assessed likelihoods would lead to better results than the direct estimation of posterior probabilities. With respect to the first criterion, evaluation of individual assessments, subjects were rather poor in their performance and did not improve over repeated sessions. Furthermore, there were no differences in performance for the different kinds of assessments made. But if the second criterion is used, posterior probabilities derived from assessed likelihoods turned out to be quite good. The direct assessment of posterior probabilities was always inferior. Implications for the training of probability assessors and the construction of information processing aids are discussed.
Article
In their commentary on Lopes (1991), Oaksford and Chater (1992) assume that (1) the psychological literature was once rife with presumptions of normative rationality; (2) bias research illustrates the bounded rationality viewpoint; and (3) people would not violate rational rules if they knew the correct rule and were relieved of the need for computation. Although these assumptions are shared by many, none stands up to scrutiny.
Article
"Five groups of 48 rats each were rewarded on the two sides of a choice situation different proportions of times… . A training of 24 trials failed to establish a discriminatory response in only one of the groups, for which the chances of reward on the two sides were 2/3 against 1/3 (group '67: 33'). In contrast to that, groups '100: 50,' '75: 25' and '50: 0' were significantly above the threshold of probability in an increasing order. For the last of these groups, 50: 0, the difference with control group '100: 0' which represented the traditional unambiguous type of training dropped below significance ('threshold of certainty'). Discrimination increases with the difference of the probabilities of success on the two sides, a further influence being superimposed due to the ratio of probabilities… . Additional experiments of the type 75: 25 and 100: 50 introducing special punishment for each non-rewarded choice… showed that the increase in discrimination goes with the increase of the ratio of the probability of 'emphasis' as given by punishment or by success, and not with the increase in the ratio of the probability of success, per se." (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
Explains the influence of the overconfidence phenomenon (P. Juslin, in press) and the hard–easy effect (S. Lichtenstein and B. Fischhoff, 1977) on an ecological approach to realism of confidence. In a replication of Juslin's findings, 20 undergraduates participated in a calibration study, and 20 undergraduates performed familiarity ratings of 240 almanac items to measure both their confidence and general knowledge, respectively. Results show that when objects of judgment are selected randomly from a natural environment, people are well-calibrated. When more and less difficult items are created by selecting items with more and less familiar contents, no hard–easy effect is observed and people are well-calibrated for hard and easy items. It is concluded that the hard–easy effect observed by Lichtenstein and Fischhoff is replicated by these results. (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
Describes the early history of the water-jar experiments and their variations, and the role that M. Wertheimer played in the history and in the formulation of preliminary working hypotheses. The water-jar experiments are presented as a case study to illustrate the fate of an investigation. Summaries are presented of surveys of textbook descriptions and explanations through the decades of Einstellung effects in learning by repetition, using mainly the water-jar problems. (0 ref) (German abstract) (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Article
The paper investigates, using studies of attention as exemplars, the modern truism that cognition was almost completely suppressed by Watsonian behaviourism and its offspring during the first half of this century. This is done by, first, examining the abstracting journals for the period 1910–1960, then by looking at relevant reviews over this period and, finally, by seeing if it is possible to detect conceptual links between old and new work on attention. The picture thus revealed provides no support for the triumph of behaviourism. The paper concludes by briefly discussing the source of cognitive psychologists' attitudes to the history of their subject.
Article
An inferential task was investigated in which the subjects had to select which of four cards they needed to inspect in order to determine whether a rule was true or false. In one condition crucial information was concealed on the other side of the cards, and in another condition it was on the same side of the cards, but covered by a mask. A previous experiment suggested that subjects sometimes confused the notion of ‘the other side of the card’. But no difference was found between these two conditions. Only two out of the 36 subjects initially made the correct selection. An attempt was made subsequently to enable the subjects to correct their errors by asking them to evaluate the cards in relation to the rule. When a conflict occurred between the selection of the cards and their evaluation, some insight was gained. In other cases these two processes passed one another by, in spite of the fact that this involved self-contradiction.
Article
1980 marks the 10th anniversary of the National Conference on the Use of On-Line Computers in Psychology. There have been dramatic changes in the applications of computers to psychology during that time, with even more dramatic developments in theory and methodology. Current advances in distributed processing and the rapid dissemination of inexpensive microprocessors portend even greater change over the next decade.
Article
To date, attempts to teach Bayesian inference to nonexperts have not met with much success. BasicBayes, the computerized tutor presented here, is an attempt to change this state of affairs. BasicBayes is based on a novel theoretical framework about Bayesian reasoning recently introduced by Gigerenzer and Hoffrage (1995). This framework focuses on the connection between “cognitive algorithms” and “information formats.” BasicBayes teaches people how to translate Bayesian text problems into frequency formats, which have been shown to entail computationally simpler cognitive algorithms than those entailed by probability formats. The components and mode of functioning of BasicBayes are described in detail. Empirical evidence demonstrates the effectiveness of BasicBayes in teaching people simple Bayesian inference. Because of its flexible system architecture, BasicBayes can also be used as a research tool.
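To illustrate the kind of translation the tutor teaches (using classic textbook numbers often cited in this literature, not output from BasicBayes itself), a frequency-format computation re-expresses a probability-format problem as expected counts in a reference population and then reads the posterior off the counts:

```python
# Hypothetical diagnostic-test numbers, re-expressed as natural frequencies.
base_rate = 0.01            # P(condition)
sensitivity = 0.80          # P(positive | condition)
false_positive_rate = 0.096 # P(positive | no condition)
reference_population = 1000

sick = round(reference_population * base_rate)                       # 10 people
sick_and_positive = round(sick * sensitivity)                        # 8 people
healthy_and_positive = round((reference_population - sick)
                             * false_positive_rate)                  # 95 people

posterior = sick_and_positive / (sick_and_positive + healthy_and_positive)
print(f"P(condition | positive) = {sick_and_positive}/"
      f"{sick_and_positive + healthy_and_positive} ≈ {posterior:.2f}")
# Counting whole cases makes the Bayesian answer (~0.08) easy to read off,
# which is the computational simplification frequency formats provide.
```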
Article
Computers and technology in psychology can be a cornucopia or a Pandora’s box. During the 20 years of its existence, the Society for Computers in Psychology has been an important focus for the appropriate and beneficial application of computing technology in psychology. Although the increase of computer use is unmistakable, cyclic trends in computer applications also can be identified and, together with current technological developments, lead to predictions, concerns, and challenges for the future.