The Logic of Scientific Discovery
Abstract
Described by the philosopher A.J. Ayer as a work of ‘great originality and power’, this book revolutionized contemporary thinking on science and knowledge. Ideas such as the now legendary doctrine of ‘falsificationism’ electrified the scientific community, influencing even working scientists, as well as post-war philosophy. This astonishing work ranks alongside The Open Society and Its Enemies as one of Popper’s most enduring books and contains insights and arguments that demand to be read to this day. © 1959, 1968, 1972, 1980 Karl Popper and 1999, 2002 The Estate of Karl Popper. All rights reserved.
... This combination of theories, aligned with the neoperspectivist paradigm, allowed an in-depth understanding of the complexities and nuances of using digital information and communication technologies (TDICs) for mental monitoring, while respecting the particularities of the older population 31,32. The review began with the definition of inclusion and exclusion criteria, with the inclusion criteria established as peer-reviewed articles, published in the last five years, and related to the use of digital technologies for the mental health of older adults 24,41. The exclusion criteria covered studies that lacked adequate methodological rigour or that did not address mental health specifically. ...
... The search was carried out in the Scopus, PubMed and Web of Science databases, using descriptors such as "mental health of older adults", "artificial intelligence in geriatrics", "digital monitoring", and "autonomy in older adults". Initially, 412 articles were found; after applying the inclusion and exclusion criteria, a final sample of 68 studies remained, which constituted the corpus of analysis for this review [29-44]. The choice of a narrative review was strategic, as it allowed a broader and more interpretive approach to the available evidence, enabling a critical and contextual analysis of the different dimensions of the use of TDICs in the mental health of older adults 25,33-35. ...
... The use of technologies such as virtual reality (VR) provides older adults with immersive and therapeutic experiences, stimulating cognition and reducing levels of loneliness, a critical factor in the mental health of this population [29-31]. By simulating familiar environments or offering new experiences, VR also helps to reduce the symptoms of emotional disorders [39-41]. Another important aspect is how these technologies are adapted to meet the specific needs of older adults, a population whose cognitive and physical limitations demand intuitive and user-friendly interfaces. The adaptation of technologies for older adults involves developing systems that are sensitive to common age-related limitations, such as reduced mobility and declines in vision and hearing 2,12-14. ...
Introduction: this study explores the impact of digital information and communication technologies (TDICs), such as artificial intelligence (AI) and virtual reality (VR), on monitoring and promoting the mental health of older adults. It is set against the growing importance of technologies for the care of the older population, given rising life expectancy and the prevalence of mental disorders in this age group. The problem lies in the lack of understanding of the effectiveness of these technologies for mental health, especially in promoting autonomy and emotional well-being. Objective: to understand and analyse how TDICs contribute to the prevention and promotion of the mental health of older adults in real time, facilitating continuous monitoring and personalised interventions. Methodology: the Giftedean neoperspectivist paradigm and the hypothetico-deductive method were adopted, supported by the theories of Active Ageing, Elderly Autonomy and Technological Adherence. A Narrative Bibliographic and Documentary Review was conducted in the Scopus, PubMed and Web of Science databases, resulting in 68 studies analysed. Results: the findings indicate that AI and VR can improve the well-being of older adults, especially with family support and user-friendly interfaces, although economic and psychosocial barriers still limit access to and acceptance of the technologies. The main limitations include reliance on secondary data and the lack of longitudinal studies. Conclusions: the research contributes to the theoretical and practical understanding of TDICs in geriatrics, with added value for digital inclusion, the promotion of public policies and well-being in old age.
... Though we, as authors of this paper, subscribe to the idea that the goal of science is to describe what is true, science, as a human endeavour, is an inherently social enterprise reliant upon communication between scientists. We do, however, believe that the methods of transparency, critical discourse, and learning from other researchers, described as intersubjectivity (Popper, 2002) or collective objectivity (Longino, 1990), are an effective way to strive towards truth. Thus, in this paper we will assume the level 2 goal of "facilitating communication". ...
... We will say that a concept C1 is more "precise" than an alternative concept C2 if the criteria for what counts as falling under C1 (or, equivalently, as being in its extension) are less likely to allow multiple interpretations than the corresponding criteria for C2. This is similar to Popper's use of precision (Popper, 2002). Note that precision is not a criterion for a scientific definition. ...
... One such complication comes from Sally Haslanger (Haslanger, 1999), who argues that we ought to define "knowledge", very much an epistemic concept, so that it helps increase social justice, a non-epistemic goal. Thus the aim of a scientist is to strive towards truth, while still acknowledging that when it comes to empirical matters, we will never know whether "what we believe is true" is true (Popper, 2002). For transparency, we take this to be our assumed overarching level 1 goal, hence utilising it as our example. ...
When defining concepts for scientific work, researchers consider various desiderata (desired characteristics) that are derived from higher-level meta-desiderata regarding accepted goals of scientific work and how to act towards achieving them. Although these desiderata are often implicit, they significantly impact researchers' processes of defining and selecting concepts, ultimately influencing research outcomes, communication, and knowledge production. We evaluate the selection and application of desiderata for defining cross-disciplinary concepts dependent upon higher-level meta-desiderata, working from the example that if one believes that science ought to strive to provide systematic knowledge about the world (level 1), and if one believes that ease of communication will positively affect the process of attaining that kind of knowledge (level 2), then one ought to strive for definitions of concepts that provide as many as possible of the features we argue for (level 3). Using psychological concepts as our examples, we propose a list of six desiderata: precision, specificity, theory-neutrality, explanation-neutrality, operational ease, and folk-level understanding. The goal is to facilitate researchers' consideration of appropriate desiderata and, where their meta-desiderata match the example provided, to promote the adoption and utilization of these desiderata in their work. Given the backdrop of conceptual change and conceptual creep, this aspect, though less discussed, plays a crucial role in shaping research and knowledge production.
... The aim of this approach may be heuristic (what do we learn about how the system works?) or prescriptive (how should we act, given the information at our disposal?). Thus, as Mouquet et al. (2015) and Maris et al. (2018) have pointed out, the predictions made in each of these approaches must be clearly distinguished. The former produces 'corroborative predictions' intended to test the validity of a theory (or at least of a hypothesis), in the sense that if the prediction diverges too widely from experience, the theory is falsified and must be amended or replaced (Popper, 1959). The latter, by contrast, produces predictions that are not necessarily meant to come true, since they concern a future that can still be changed by making the right decisions. ...
... The overall skill of the calibrated model must then be thoroughly assessed through a procedure generally referred to as "validation", in order to ensure that the model meets the expectations for which it was developed. The term "validation" can be somewhat misleading, and Konikow and Bredehoeft (1992) argued that models viewed as hypotheses cannot be "validated" but only "invalidated" in Popper's (1959) sense. However, as Rykiel (1996) stated, although validation cannot ensure that a model is a correct representation of the system (its conceptual validity or realism), it is still possible to "validate" models for pragmatic purposes by ensuring that their predictions are precise and accurate enough to make them suitable for their intended use (i.e. their operational validity). ...
Although central to fisheries management, the MSY concept has significant limitations when the complexity of the fisheries socio-ecosystem is taken into account. This thesis contributes to the debate on sustainable fisheries management reference points
through a dual modeling approach: (1) an analytical approach using a simplified model of a single population, to highlight the sensitivity of MSY to the species' life cycle when intra-annual time scales are taken into account; and (2) a complex modeling approach, to best integrate the multi-species and multi-fleet character of real fisheries. The thesis also further developed the ISIS-Fish model for the Bay of Biscay demersal fishery. This model was then used to highlight the conflicts generated by the search for MSY in this type of fishery, and to test how these could be mitigated or aggravated by possible management measures. In both approaches, particular attention was paid to how population stability and resilience were impacted by fishing. It is suggested that future management take greater account of this important dimension of fisheries sustainability.
... The aim is to select, from among the models that describe the data sufficiently well, the simplest one that can also be justified theoretically. This also involves a substantive interpretation: the simpler the model, the more likely it is to fit other datasets as well and to help explain the relationships between phenomena (Popper, 1992). The generalisability of research results across several datasets, i.e. their replicability, speaks to the reliability of the results. ...
... Testing the model is thus a hypothetico-deductive process (Tarka, 2018). If the model can be used to demonstrate the observed effects implied by the theory, such as its causal assumptions (Bollen & Pearl, 2013), then the theory on which the statistical model was based has withstood an attempt at falsification (Popper, 1992). It thereby gains one additional piece of evidence for its validity. ...
Interest in structural equation models has grown in behavioural science research. A researcher using structural equation models can construct a theory-based statistical model that links theoretical concepts lying beyond our direct observation to the data collected by the researcher, and then test whether the model fits the data. In addition to such confirmatory analysis, the data can also be examined exploratively; for example, the researcher can model the heterogeneity of the data by searching for latent groups within it. The versatility of structural equation modelling is illustrated by the fact that confirmatory and exploratory analyses can be conducted in a variable-centred or person-centred manner, or by combining the two. As structural equation modelling becomes more common, there is a growing need to understand how theoretical concepts are linked to the model and what philosophy-of-science assumptions this involves. The purpose of this article is to present a conceptual model with which a behavioural science researcher can examine the relationships between the theoretical concept (phenomenon) being modelled, the data, the statistical model, and the substantive theory. The conceptual model is grounded in scientific realism, according to which reality contains both directly observable phenomena and mind-independent phenomena beyond direct observation, which it is possible and meaningful to study. Finally, we show how the research decisions made when designing an empirical study connect to our conceptual model.
... Furthermore, certain points needed to be reworked so that they could be tested and 'falsified' (Bowlby, 1969). Bowlby believed that psychoanalysis should adapt itself to become a proper scientific discipline, as postulated by Karl Popper in The Logic of Scientific Discovery (Popper, 1935/1959). Bowlby (1969) also rejected ego psychology's explanation of the mother's connection with the baby as arising from feeding and the satisfaction of other needs. ...
Attachment theory, postulated by John Bowlby in collaboration with psychologists Mary Ainsworth and Harry Harlow, has been the subject of much discussion about its nature. Bowlby himself considered it a psychoanalytical theory, but his peers in psychoanalysis at the time rejected this idea and criticised his concepts, suggesting that they were not in alignment with the principles of psychoanalysis. At the same time, his collaborators Mary Ainsworth and Harry Harlow repeatedly questioned Bowlby's choice of psychoanalysis as a basis for attachment theory, suggesting that it might not be the most appropriate approach. Lately, attachment theory has been used in many psychology courses and articles without so much as a single mention of its psychoanalytical nature. This article presents a research proposal for an investigation of the nature of attachment theory at a conceptual level. It poses the question of whether the concepts used as a basis for attachment theory are consistent with psychoanalysis.
... At this stage, the theory is largely exploratory and in search of robust empirical validation. Popper (1934) adds that falsifiability, as the central criterion, becomes more concrete as testable hypotheses are formulated and subjected to empirical scrutiny. Lakatos (1970) stresses that the maturation of a theory requires a progressive research programme, with auxiliary hypotheses that support the theoretical core and allow continuous refinement. ...
... oriented towards empirical investigation, validating the integration of diverse theoretical paradigms within a coherent epistemological framework. 2.2 LOGICAL AXIS/PILLAR In the logical axis, the hypothetico-deductive method was employed at every stage of the research, structuring a systematic and rigorous scientific process. Initially, general and specific hypotheses were formulated from the theoretical review, covering concepts such as predictability and immutability (Gifted, 2015; 2016; Breviário, 2021; 2022a; 2022b; 2023a; 2023b; Breviário et al., 2024a; 2024c; 2024d; 2024e; 2024g; 2024h; 2024i; Oliveira, 2024; Pereira, 2021). These hypotheses were progressively refined on the basis of empirical data collected through validated questionnaires and predictive models, as described by Popper (1934). In the next stage, deductions were made to predict observable results that could corroborate or refute each hypothesis. ...
This research explores the Theory of Predestination, focusing on the concepts of the predictability and immutability of human life and their implications for social behaviour and cognitive processes. Situated at the intersections of theology, psychology and the social sciences, the problem investigated lies in the absence of robust empirical validation for the theory's theoretical assumptions. The main objective is to establish a scientific protocol, inspired by the guidelines of Chibeni and Moreira-Almeida for the exploration of the unknown, in order to empirically validate the Theory of Predestination, including its central concepts of the predictability and immutability of human life, and to explore its implications for social behaviour and cognitive processes. Adopting the Giftedean neoperspectivist paradigm, the research integrated theories such as those of Kuhn, Lakatos and Hempel, as well as innovative psychotherapeutic approaches. The hypothetico-deductive method guided the formulation and testing of hypotheses, while a Narrative Bibliographic and Documentary Review mapped relevant contributions in databases such as Scopus, PubMed and Web of Science. A total of 93 works were analysed, covering interdisciplinary literature and related empirical studies. The main findings include the identification of predictable behavioural patterns and insights into how past memories and future projections shape cognitive processes. However, gaps such as the absence of large-scale replication and of definitive empirical validation were identified. Methodological and theoretical limitations restrict the generality of the results, but the research made a significant contribution by proposing innovative protocols and generating new theoretical and empirical perspectives. The added value encompasses advances in the understanding of human behaviour, impact on interdisciplinary areas and relevance to philosophical and social debates.
... The outcome of our study provides some form of empirical evidence against the use of inductive reasoning in general and against its application during clinical evidence appraisal in particular. It appears to support the logical stance that inference from a limited number of observations, no matter how numerous, to universal (generalized) statements cannot be justified, because any conclusions drawn in this way may always turn out false [19]. ...
... However, even without considering its lack of logical soundness, any truth probability would technically require a ratio of the number of supporting observations to a reference sequence comprising the total number of observations not yet made. Such a reference sequence appears unknowable, or is at least difficult to determine [19], and, as our study shows, seems vulnerable to gross underestimation. This is further relevant in regard to the possible objection that errors should not be considered equal in their relevance and likelihood for clinical trials. ...
Background: To establish the likelihood that a body of evidence, inductively judged to be of 'low bias risk'/'high-quality' according to a limited set of appraisal criteria, is actually error-free.
Methods: A total of 45 simulation trials were generated and randomly assigned 0-5 errors out of a total of 65 error domains. The trials were then appraised for errors with a simulated appraisal tool consisting of five prespecified error domains. Trial appraisal yielded either true positive, true negative, false negative or false positive results. From these values, the negative likelihood ratio (-LR) with 95% confidence interval (CI) was computed. The -LR computation was repeated 25 times, each time with newly generated random values for all 45 trials. The individual results of all 25 runs were statistically pooled. The pooled -LR result with 95% CI was interpreted as indicating how likely a 'low bias risk'/'high-quality' rated body of evidence is to be actually error-free.
Results: The pooled -LR was 0.84 (95% CI: 0.80-0.88, I² = 0.0%). The result suggests that error-free evidence is only 1.2 times more likely to be rated 'low bias risk'/'high-quality' than evidence containing some form of error.
Conclusions: The likelihood that a 'low bias risk'/'high-quality' rated body of evidence is actually error-free is small, and inductive generalization from any limited, pre-specified set of appraisal criteria rarely justifies a high level of confidence that a 'low bias risk'/'high-quality' rating of clinical evidence reflects the true effect of a certain treatment without being affected by error.
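To make the arithmetic concrete, the sketch below (our illustration, not the study's actual code; all counts are hypothetical) computes a negative likelihood ratio from the four appraisal outcomes, where "condition present" means a simulated trial contains at least one error and a "negative" appraisal means it was rated 'low bias risk'/'high-quality':

```python
import math

def negative_likelihood_ratio(tp, fn, tn, fp):
    """-LR = P(rated 'low bias risk' | error present) / P(rated 'low bias risk' | error-free)
           = (1 - sensitivity) / specificity."""
    sensitivity = tp / (tp + fn)   # share of error-containing trials flagged by the tool
    specificity = tn / (tn + fp)   # share of error-free trials rated 'low bias risk'
    return (1 - sensitivity) / specificity

def lr_minus_with_ci(tp, fn, tn, fp, z=1.96):
    """Approximate 95% CI for -LR using the usual log-scale standard error."""
    lr = negative_likelihood_ratio(tp, fn, tn, fp)
    se_log = math.sqrt(1/fn - 1/(fn + tp) + 1/tn - 1/(tn + fp))
    return lr, lr * math.exp(-z * se_log), lr * math.exp(z * se_log)

# Hypothetical counts for one appraisal run over 45 simulated trials (they sum to 45):
tp, fn, tn, fp = 6, 30, 8, 1
lr, low, high = lr_minus_with_ci(tp, fn, tn, fp)
print(f"-LR = {lr:.2f} (95% CI {low:.2f} to {high:.2f})")
# A -LR close to 1 means the 'low bias risk' rating barely discriminates error-free
# from error-containing evidence; 1 / -LR gives how many times more likely
# error-free evidence is to receive that rating, e.g. 1 / 0.84 is roughly 1.2.
```

Pooling 25 such runs, as described in the Methods, would then combine the per-run log ratios and their standard errors in the usual meta-analytic way.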
... Generating a causal hypothesis is a fundamental step in the scientific endeavour [119]. It involves proposing potential explanations of why certain things happen. ...
... The development of plausible hypotheses to be tested via experiments was, and still is, the responsibility of the researcher. To conceive hypotheses, tools such as inductive reasoning (from specific observations to general conclusions) and abductive reasoning (inferring the most likely cause) [119] have traditionally been used. Researchers also imagine causal models to make sense of observational data, especially for unexpected findings [46]. ...
Towards the goal of understanding the causal structure underlying complex systems—such as the Earth, the climate, or the brain—integrating Large language models (LLMs) with data-driven and domain-expertise-driven approaches has the potential to become a game-changer, especially in data- and expertise-limited scenarios. Debates persist around LLMs’ causal reasoning capacities. However, rather than engaging in philosophical debates, we propose integrating LLMs into a scientific framework for causal hypothesis generation alongside expert knowledge and data. Our goals include formalizing LLMs as probabilistic imperfect experts, developing adaptive methods for causal hypothesis generation, and establishing universal benchmarks for comprehensive comparisons. Specifically, we introduce a spectrum of integration methods for experts, LLMs, and data-driven approaches. We review existing approaches for causal hypothesis generation and classify them within this spectrum. As an example, our hybrid (LLM + data) causal discovery algorithm illustrates ways for deeper integration. Characterizing imperfect experts along dimensions such as (1) reliability, (2) consistency, (3) uncertainty, and (4) content vs. reasoning is emphasized for developing adaptable methods. Lastly, we stress the importance of model-agnostic benchmarks.
... Lastly, the iterative or tentative nature aligns well with the inherent process of scientific discovery 49 . In the AI domain, the iterative approach to model development is often viewed as a strength because it enables starting with a modest proposal, gathering feedback and additional data, and then refining the model based on those inputs, continuing this constructive loop of enhancements. ...
Two decades of research on visual working memory have produced substantial yet fragmented knowledge. This study aims to integrate these findings into a cohesive framework. Drawing on a large-scale behavioral experiment involving 40 million responses to 10,000 color patterns, a quasi-comprehensive exploration model of visual working memory, termed QCE-VWM, is developed. Despite its significantly reduced complexity (57 parameters versus 30,796), QCE-VWM outperforms neural networks in data fitting. The model provides an integrative framework for understanding human visual working memory, incorporating a dozen mechanisms—some directly adopted from previous studies, some modified, and others newly identified. This work underscores the value of large-scale behavioral experiments in advancing comprehensive models of cognitive mechanisms.
... Modern science relies on trust, which we partially establish through experimental evidence. Scientists often propose logically coherent hypotheses and then verify them experimentally (Popper, 2005), alongside theoretically proven discoveries. ...
Experimental verification and falsification of scholarly work are part of the scientific method's core. To improve the Machine Learning (ML) community's ability to verify results from prior work, we argue for more robust software engineering. We estimate the adoption of common engineering best practices by examining repository links from all recently accepted International Conference on Machine Learning (ICML), International Conference on Learning Representations (ICLR) and Neural Information Processing Systems (NeurIPS) papers, as well as ICML papers over time. Based on the results, we recommend how we, as a community, can improve reproducibility in ML research.
... The rejection threshold is therefore set by whether the evidence can persist through replication, not by how easily it slots seamlessly into an already existing theory. Accordingly, Popper's theory of falsifiability [29] proposes that every theory is only one good piece of unfitting evidence away from revision, or potentially rejection. Perhaps the more important question, then, is whether these data meet the quality and rigour requirements to warrant expanding electrodynamic theory, for example, or engaging with some other set of burgeoning and plausible ideas. ...
In an earlier study, inductive pulse charging (IPC), using solenoid-generated high-voltage transients, also known as flyback or kickback pulses, was shown to induce energy gains in lead-acid and LiFePO4 batteries when using specific operational parameters, but with no clear indication as to the source of the additional energy. While there are presently no widely accepted theories or models regarding the energetic pathways and processes involved, it is proposed that there are only two viable possibilities for the source of the observed energy gains, as distinct from the actual mechanisms involved. The energy influx either derives from an internal response of the electrochemistry to high-voltage electrostatic pulses, whereby enthalpic energy is released from the electrochemistry and serves as a form of 'fuel', or it derives from the local environment by as yet unrecognised processes and pathways. Here the battery is considered to function as part of a thermodynamically open system in the presence of 'far from equilibrium' events, such as those triggered by high-voltage pulses. This follow-up study, undertaken again within the Open Science Framework (OSF), sets out to test the proposed hypothesis that internal enthalpy is the source of any pulse-induced energy influx by looking at evidence from three main areas: firstly, the effect of pulses on capacitors, these being devoid of any functional electrochemistry; secondly, thermodynamic analysis and bench testing of battery capacities in conjunction with a 'chemical deficit model'; and thirdly, records of battery pulse and cyclic histories, examined to identify any long-term effects on capacity. The results, in particular the correlation between predicted and measured battery capacities with both cell chemistries, together with their pulse histories, have clearly shown that the null hypothesis of an enthalpy source must be rejected in favour of the alternative, an external source, where the battery and the local environment comprise a thermodynamically open system. Consideration is also given to the possible implications of these findings for classical and quantum electrodynamic theory, and to how the integration of 'non-linear' and 'far from equilibrium' states might be seen as further evidence of the need for an extended and more complete model that includes interaction with the environment and otherwise anomalous phenomena.
... At first glance, this synchronous occurrence might appear to be more than mere coincidence, perhaps even suggesting telepathic communication. Yet the research method teaches us to consider the counterfactual: all those times you thought of your friend and the phone did not ring, a control condition indispensable for debunking the illusory cause-and-effect relationship (Mill, 1856; Popper, 2005). Sometimes the counterfactual scenarios necessary to test our hypotheses are not available. ...
Establishing a causal relationship requires not only the presence of a factor of interest but also the demonstration that the relationship is absent when the factor is absent. Such ideal conditions are rare, especially in observational studies in which creating control conditions is inherently difficult. The COVID-19 pandemic, with its unparalleled disruptions, offers a unique opportunity to isolate causal effects and test previously impossible hypotheses. Here, we examine the home advantage (HA) in sports—a phenomenon in which teams generally perform better in front of their home fans—and how the pandemic-induced absence of fans offered a fortunate yet systematic change to typical conditions, serving as a natural experiment. Using a structural equation modeling approach and building a mediation model encompassing all relevant HA factors, we quantified the reduction in HA and elucidated the specific mechanisms behind it. The theory behind HA and the availability of measures for each factor before and during COVID-19 lockdowns enabled us to estimate all postulated pathways within a natural experimental context. The robust statistical framework used in our study offers a foundational model for integrating naturally occurring events that serve as control conditions into the analysis of various real-life phenomena.
... Questions have come up about inductive inference and Popperian falsificationism (Popper, 1959). I had earlier been puzzled by the tension between falsificationism and statistical inference, that is, why statistical tests in psychology do not, in the spirit of falsificationism, try to refute the models' actual predictions but rather the hypotheses opposed to them (Holtz & Monnerjahn, 2017; Meehl, 1967, 1978). ...
... Philosophers of science have long argued that social science is an altogether different beast from the natural sciences and requires different methods of study (e.g., Dilthey, 1989; Gadamer, 2006; Habermas, 1988). Even some empiricists, such as von Hayek (1943, 1989) and Popper (1959, 1965), attacked the apparent "scientism" (the cosmetic misuse of scientific methods to feign legitimacy) of using natural scientific methodologies within the social sciences. Social science's envy of the natural sciences' success (Bygrave, 1989; Mirowski, 1992), critics argue, has misled it to adopt methodologies that are foundationally misaligned. ...
The replication crisis has cast social science’s epistemological foundations into question. So far, entrepreneurship scholars have responded by advocating more transparency in data collection and analysis, better empirical methods, and larger and more representative data. Here, we explore the possibility that the problem may be innate to empiricism itself within the social sciences, generally, and entrepreneurship theory, specifically. We review classical arguments and introduce new ones about how and why the weaknesses of empiricism—such as challenges of unobservability—are exacerbated in the study of human behavior, which weaknesses manifest centrally in entrepreneurship theory. These arguments suggest that social science as principally an empirical endeavor may be foolhardy, particularly in the highly agentic entrepreneurship discipline. Herein we propose a radical solution: a rationalist scientific paradigm, where phenomenological reasoning, rather than observation, is paramount. This proposal rests upon arguments that empiricism’s innate limitations can be overcome, albeit not entirely, by its rationalist counterpart. We can, we argue, develop robust scientific foundations—even laws as valid as those of the natural sciences—for entrepreneurship theory through a formal rationalist approach. These laws would necessarily be few but would serve as a much stronger foundation for entrepreneurship theory than the empirical contingencies that we observe. We conclude by illustrating what such a rationalist management program might look like for entrepreneurship scholars with Bylund’s entrepreneurial theory of the firm.
... The SM, as developed by pioneers like Descartes and Galilei and refined by Newton and Kant, has been the bedrock of modern scientific education (Li 1907). However, in the last century, philosophers of science like Karl Popper and Paul Feyerabend have questioned whether this method is inherent in nature or merely one of many possible approaches to understanding the world (Popper 1959, Feyerabend 1975). A detailed account of the different positions expressed about the limitations and universality of the SM is beyond the scope of this paper; it is a fact that the Western Educational System (WES), at all levels and in almost all disciplines, has been informed by the SM, thus forming and orienting the attitude of entire generations towards the issues addressable with scientific tools and away from those not directly addressed within its framework (Ravitch 2016). ...
The paper considers the adequacy of the current Western Educational System in preparing Sapiens for an impending phase transition, potentially as significant as the cognitive, agricultural, scientific, and industrial revolutions that have shaped human history. Drawing on Yuval Noah Harari's conceptual equation B × C × D = AHH (Biological knowledge × Computational power × Data availability = Ability to Hack Humans), the paper explores the rapid advancements in biotechnology, artificial intelligence, and data science that are challenging traditional educational paradigms. The research critically analyzes the limitations of the scientific method that underpins modern education, highlighting its tendency towards hyper-specialization, oversimplification, and mutually exclusive logic. The author argues that this approach, while historically successful, may be insufficient to address the complex, interconnected challenges of the 21st century and beyond. To address these limitations, a more holistic educational framework is proposed that integrates Eastern philosophical traditions, meditative practices, and systems thinking alongside rigorous scientific inquiry. This approach aims to cultivate a deeper understanding of the subjective "meaning" in human experience, complementing objective scientific knowledge.
... Despite their differences and complementarities, they unanimously defend the thesis that "science is not simply a method of knowledge or even a body of knowledge, but a sociocultural phenomenon", which undoubtedly includes a "third component." Holton put forward themata as evidence of this "third component", and his thematic analysis first presents itself as an irreconcilable response to the "non-validity" verdict formulated by neo-positivism and Popper (Popper, 2005a; Reichenbach, 1961) regarding the epistemological study of discovery. ...
This article examines the rich texture of Holton's themata. Holton argues that within established rational norms there is room for subjective elements, including scientific imagination. He posits that these influences, known as themata, not only fail to impede scientific progress but also serve as a conduit for new scientific discoveries. The paper aims to gain a comprehensive understanding of their impact on academics and scientific research by investigating their potential convergence or divergence with other cognitive frameworks in the philosophy of science, such as Kant's categories, Kuhn's paradigms and Lakatos' research programmes. By comparing themata to these well-established frameworks commonly used to unravel scientific knowledge and research, this study aims to clear up potential confusion and deepen our understanding of the often overlooked or underestimated influence exerted by themata in science. In a specific sense, this investigation highlights the vital role played by imagination and pre-existing thought structures in the formulation and advancement of scientific theories. Through this analysis, a comprehensive understanding of the essence of themata is provided, highlighting the importance of recognising and understanding themata as essential components of scientific research at certain points. As a result, this investigation reinforces the validity of the assertion, an ongoing subject of debate, that empirical data, mathematical expertise or logical reasoning alone cannot supplant these integral constituents.
... The data is gathered systematically, sometimes exhaustively, and is subsequently analysed to address a research question, thereby generating knowledge about the law and its actual functionality. This definition can be refined, particularly around the concepts of data-driven knowledge that is verifiable through observation or experiment (Bernard 1865), reproducible, and even falsifiable in the Popperian sense (Popper 1959). ...
Empirical Legal Research (ELR) can be defined as the systematic collection and analysis of data on legal phenomena, in a way that is verifiable, reproducible and falsifiable. Europe is a good case for studying the development of ELR, notably given the long tradition of unity in legal scholarship. Of course, this tradition is that of doctrinal scholarship but, at the turn of the 20th century, it was partially challenged by the emergence of new social sciences. Nowadays, ELR implicitly challenges the division of labour that resulted from this period by using social science methods to grasp legal objects. It thus challenges normativist dogma and goes beyond legal realism. If there were any doubt about that, ELR can be considered "legal" research, notably because, at least in some cases, its object is purely legal. However, this does not mean that ELR answers the same questions as doctrinal approaches. Although ELR seems to be developing in Europe, some countries are lagging behind, for example France, where interest in ELR has only been growing in recent years. At the European level, signs of growth are more tangible. This raises questions regarding the structuring of this field of research. This special issue aims to answer them.
... This form of knowledge relates to disciplines like engineering, medicine, and agriculture, focusing on empirical, observable phenomena. While temporal knowledge is adaptable and can evolve with new evidence or societal needs, it remains essential for societal well-being and human advancement (Popper, 2005; Sardar, 1985). ...
Islamic civilisation once led the world in developing worldly knowledge, producing significant advancements in medicine, mathematics, and astronomy. Scholars viewed these knowledge domains as complementary to religious knowledge, with the latter holding superior status. However, in recent times, some Muslim communities have distanced themselves from temporal knowledge, wrongly perceiving it as alien to Islam. Insurgent groups like Boko Harām in Northern Nigeria and neighbouring countries, including Niger, Chad, northern Cameroon, and Mali, have weaponised this misconception. Boko Harām’s ideology rejects worldly education, equating it with an anti-Islamic agenda, leading to violent campaigns against educational institutions. This article addresses the misunderstanding that beneficial worldly knowledge contradicts Islamic teachings, highlighting its importance as an Islamic obligation. It explores how this misconception has contributed to violence and stagnation, such as in Boko Harām's attacks, and analyses Islamic primary sources, the Qur'ān, the ḥadīth, and scholarly commentaries, to clarify the stance on acquiring such knowledge. The analysis includes an examination of key prophetic traditions that highlight the importance of pursuing worldly knowledge that contributes to societal welfare. The findings reveal that Islām not only permits but encourages the pursuit of beneficial worldly knowledge, which is considered farḍ kifāyah, a collective obligation upon the Muslim community. The article emphasises that education aligned with Islamic values is key to societal progress. It criticises extremist groups like Boko Harām for misinterpreting Islam and calls for a return to valuing both religious and beneficial worldly knowledge to advance Muslim communities.
... The tribes had a markedly supra-communal character but relied on clan tradition and customary law. They served as a mechanism for organising contacts, covering matters of barter, the demarcation of lands, the resolution of disputes, training, and alliance and marriage relations 50. A foraging (appropriating) economy and clashes with outsiders, which are regarded as a kind of hunting, demand charisma and certain physical qualities from the chief 51. In the case of strong spatial dispersion of resources 52, its economic units acted semi-autonomously and rallied together only in extraordinary circumstances. ...
This paper represents the beginning of a discussion about the reasons for the emergence and role of bureaucracy in human society.
... DQ undermines strict and pure accounts of scientific rationality. Most famously, DQ undercuts the logic of scientific justifications presented in Popper's falsificationism (Popper, 1959). According to Popper, a theory is scientific only if it is, in principle, possible to establish its falsity, i.e. it is not immune to possibly discrediting data. ...
This paper explores the Duhem-Quine (DQ) problem and its impact on economic methodology, focusing on how the reliance on auxiliary assumptions complicates the testing and validation of theories. The DQ problem shows that no hypothesis is tested in isolation, as it depends on additional assumptions and background knowledge, making it challenging to pinpoint where errors lie. This issue is particularly relevant in economics, where complex models and assumptions about human behavior play a significant role, and in finance, where the robustness of models is critical for decision-making under uncertainty. The paper highlights two key gaps: (i) the limited discussion of the DQ problem in economic methodology, and (ii) the lack of alternative approaches to ensure rational methods in light of DQ. To address these issues, it proposes a multi-criterial framework for evaluating theories, emphasizing consistency, diverse data, localized testing, comparing models, and varying assumptions systematically. Using examples like housing market models and the Ultimatum Game, the paper illustrates how addressing the DQ problem involves avoiding arbitrary changes to assumptions while adopting clear, rational strategies. By providing a stronger methodological foundation, this approach enhances the reliability of economic and financial theories, improving their influence on policy-making and practical applications.
... More specifically, our approach allows us to reveal for how many participants a media effects hypothesis is confirmed and for how many it is not. For Karl Popper (1959), the observation of a single case that did not conform to the hypothesis would be enough to falsify it. In our work, we argue that a media effects hypothesis is valid only if it applies to the vast majority of participants (i.e., >75%). ...
... First, the clear distinction between theory and experiment, according to which theory leads and experiments follow (cf. Popper, 1959), became blurred. This was not (only) done by an intricate philosophical argument but by analysing actual scientific practice (Hacking, 1983, 149ff.). ...
... For a concrete discussion at the foundational level, we shall focus on scientific inquiry, where science is simply meant to gain knowledge from experience or experiments (see, e.g., Newton, 1718; Martin and Liu, 2015a). As it is often the case that scientific inquiry is dynamic (Popper, 2005; Kuhn, 1970), a simple but proper statistical setting can be written as ...
Strong artificial intelligence (AI) is envisioned to possess general cognitive abilities and scientific creativity comparable to human intelligence, encompassing both knowledge acquisition and problem-solving. While remarkable progress has been made in weak AI, the realization of strong AI remains a topic of intense debate and critical examination. In this paper, we explore pivotal innovations in the history of astronomy and physics, focusing on the discovery of Neptune and the concept of scientific revolutions as perceived by philosophers of science. Building on these insights, we introduce a simple theoretical and statistical framework of weak beliefs, termed the Transformational Belief (TB) framework, designed as a foundation for modeling scientific creativity. Through selected illustrative examples in statistical science, we demonstrate the TB framework's potential as a promising foundation for understanding, analyzing, and even fostering creativity -- paving the way toward the development of strong AI. We conclude with reflections on future research directions and potential advancements.
... This distinction can explain how a new theory is formulated. Popper (1959) argued that the deductive approach should be given priority. ...
Theory building is part of the academic endeavor, more emphasized in some contexts than in others. Knowing how to build theory, by either establishing new theoretical approaches or adjusting and developing already known theories, is part of the researcher's competence profile. The concepts of induction and deduction often anchor and justify the theory-building process but cannot always explain how new ideas are created. This chapter discusses the concept of abduction to address commonly envisaged anomalies in the theory-building process. Abduction is best conceptualized as making guesses. Continually, in a theory-building process, researchers make assumptions when they undertake observations that in surprising ways depart from existing theory. Accordingly, abduction offers a more profound understanding of theory building. This chapter seeks to explain abduction, going beyond existing frameworks to embrace the systematic combining of theory and the empirical world, and arguing that abduction can help better comprehend how theory emerges in specific phases of theory testing, development, and creation. Some argue that these approaches are helpful in both realist and interpretive research and in understanding collaborative research design activities in the business field. 3.1 Introduction Developing new theoretical insight is an ongoing, infinite, and never-trivial scholarly process. This development includes inquiry into the process of theorizing, that is, theorizing about theorizing. This chapter aims to contribute to this meta-endeavor so that scholars may benefit in specific research by recognizing how their newly gained knowledge transforms into theories and why or why not. Scholars want to contribute theories with significant explanatory power. However, because
... The ideal conceptualisation of the scientific method involves several steps that are carried out in sequence and repeated as needed to ensure the validity and reliability of research results (Popper, K., 2005). ...
This book has been compiled in the hope that it can serve as a comprehensive, easy-to-understand, and applicable learning resource. Its contents cover the basic concepts of epidemiology, epidemiological study designs, the collection and analysis of epidemiological data, and the application of epidemiology in various health contexts.
This article provides a new explanation of the judicial decision-making process using the cognitive dissonance theory. It shows how the process of interpreting and applying the law is affected by the natural human need for consistency between what a person knows, believes, and does. Different authors suggest that judges' decisions are influenced by various factors, including law, personal morality, or rational self-interest. The article argues that none of these visions fully describe the judicial decision-making process and proposes a new approach based on the cognitive dissonance theory. Law, personal morality, and rational self-interest are cognitive elements that influence judicial decisions altogether. However, they are often in conflict and cause cognitive dissonance. Judges lean toward the decision that causes the cognitive dissonance that is the easiest to reduce, considering the cognitive elements’ resistance to change. In the process of interpretation, judges strive to reduce the cognitive dissonance that occurs due to their decisions. The reduction of cognitive dissonance is presented in this article as a “happy ending” of the judicial decision-making process.
This article puts forward a definition of being as determinateness, which includes several synonymous concepts. The article shows how this definition withstands attacks from the analytic tradition (the verifiability and falsifiability of the definition of being), how the definition is primordial (it cannot be built upon other definitions), how it is meaningful (precisely because of the existence of the aforementioned property, in contrast to serious attacks on the meaningfulness of the concept of being and on the existence of being in general throughout the history of philosophy) and, finally, how the definition encompasses everything that exists (without a transcendent 'remainder').
Dobzhansky argues in a specific way in favour of the theory of evolution, using an argumentative scheme that, in his view, allows one to demonstrate the superiority of naturalistic explanations over anti-naturalistic ones. However, using a research tool such as epistemic frameworks, it is relatively easy to show that the same scheme can be employed to demonstrate the superiority of anti-naturalistic explanations over naturalistic ones.
Reasoning from inconclusive evidence, or 'induction', is central to science and any applications we make of it. For that reason alone it demands the attention of philosophers of science. This element explores the prospects of using probability theory to provide an inductive logic: a framework for representing evidential support. Constraints on the ideal evaluation of hypotheses suggest that the overall standing of a hypothesis is represented by its probability in light of the total evidence, and incremental support, or confirmation, indicated by the hypothesis having a higher probability conditional on some evidence than it does unconditionally. This proposal is shown to have the capacity to reconstruct many canons of the scientific method and inductive inference. Along the way, significant objections are discussed, such as the challenge of inductive scepticism, and the objection that the probabilistic approach makes evidential support arbitrary.
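A compact restatement of the two evidential notions mentioned in this abstract, in standard probabilistic notation (a sketch of the usual Bayesian formulation, not necessarily the Element's own symbols; H is a hypothesis, E a piece of evidence, E_total the total evidence):

```latex
% Overall standing of H: its probability in light of the total evidence
P(H \mid E_{\text{total}})

% Incremental support (confirmation): E confirms H exactly when
P(H \mid E) > P(H)
% which, by Bayes' theorem (assuming P(E) > 0 and P(H) > 0), holds iff
P(E \mid H) > P(E)
```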
There have been significant advances in the science of meaning in life (MIL). Researchers have made empirical predictions about the antecedents and consequences of meaning and the best ways it can be enhanced. Yet, it is important that researchers in this area consider the auxiliary assumptions associated with their predictions. Auxiliary assumptions, which traverse the distance from nonobservational theoretical terms to observational terms at the level of the empirical hypotheses, have important implications for the appraisal of empirical victories and defeats. In this paper, we outline the importance of auxiliary assumptions in MIL research. To ensure the validity of findings associated with MIL, we hope this paper encourages researchers to pay close attention to the auxiliary assumptions associated with their predictions.
The role of artificial neural networks as a key tool of artificial intelligence is investigated, one that is of interest not only from a technical but also from a philosophical point of view. The evolution of the concepts of the "natural" and the "artificial" in philosophy from Aristotle to the present day is analysed, as well as the relationship between the human mind and the artificial systems it creates. The question of thinking and self-awareness is considered with respect to whether neural networks can imitate human intuitive and creative thinking. It is established that the boundary between the natural and the artificial is becoming increasingly blurred, which calls for a new philosophical understanding of the role of artificial intelligence in the modern world.
Purpose
Citations can be used in evaluative bibliometrics to measure the impact of papers. However, citation analysis can be extended by a multi-dimensional perspective on citation impact, which is intended to provide more specific information about the kind of impact received.
Design/methodology/approach
Bornmann, Wray, and Haunschild (2019) introduced citation concept analysis (CCA) for capturing the importance and usefulness certain concepts have in subsequent research. The method is based on the analysis of citances – the contexts of citations in citing papers. This study applies the method by investigating the impact of various concepts introduced in the oeuvre of the world-leading French sociologist Pierre Bourdieu.
Findings
We found that the most cited concepts are ‘social capital’ (with about 34% of the citances in the citing papers), ‘cultural capital’, and ‘habitus’ (both with about 24%). On the other hand, the concepts ‘doxa’ and ‘reflexivity’ score only about 1% each.
Research limitations
The formulation of search terms for identifying the concepts in the data and the citation context coverage are the most important limitations of the study.
Practical implications
The results of this explorative study reflect the historical development of Bourdieu’s thought and its interface with different fields of study.
Originality/value
The study demonstrates the high explanatory power of the CCA method.
The debate surrounding the topic of Artificial Intelligence (AI), and its different meanings, seems to be ever-growing. This paper aims to deconstruct the seemingly problematic nature of the AI debate, revealing layers of ambiguity and misperceptions that contribute to a pseudo-problematic narrative. Through a review of existing literature, ethical frameworks, and public discourse, this essay identifies key areas where misconceptions, hyperbole, and exaggerated fears have overshadowed the genuine concerns associated with AI development and deployment. To identify these issues, I propose three general criteria that are based on Popper's and Ayer's work and adjusted to my needs. The subsequent sections categorize AI issues into ontological, methodological, and logical-grammatical problems, aligning with Cackowski's typology. In addition, I introduce «» signs to distinguish behavioural descriptions from cognitive states, aiming to maintain clarity between external evidence and internal agent states. My conclusion is quite simple: the AI debate should be thoroughly revised, and we, as scholars, should define the concepts that lie at the foundation of AI by creating a universal terminology and agreeing upon it. This will give us the opportunity to conduct our debates reasonably and understandably for both scholars and the general public.
This chapter explores paradoxes in hypnotic experience, emphasizing that, by its very characteristics, trance is a paradoxical experience, being both multiple and unified. In this sense, De-sign is a way of leading people to persevere through paradoxes in order to help them toward a teleological process. The chapter approaches the relationships between hypnosis and De-sign through two main axes. On the one hand, the polarity of sanity vs. madness highlights how cultural technologies can sometimes propose significant alternatives to it. The case of the Hindu Guru Ramakrishna, who was considered mad for many years, and that of Erickson's schizophrenic patient illustrate this axis. On the other hand, the chapter discusses the polarity of determinism vs. freedom, or sleepwalking vs. being awake, mainly considering claims that hypnosis does not respect individual freedom. This axis is illustrated by a case study of a woman who apparently needed to be commanded in order to develop her hypnotherapeutic process. The case study emphasizes that the use of amnesia can also be a way of promoting individual choices during the therapeutic process.
Suppose your evil sibling travels back in time, intending to lethally poison your grandfather during his infancy. Determined to save grandpa, you grab two antidotes and follow your sibling through the wormhole. Under normal circumstances, each antidote has a 50% chance of curing a poisoning. Upon finding young grandpa, poisoned, you administer the first antidote. Alas, it has no effect. The second antidote is your last hope. You administer it—and success: the paleness vanishes from grandpa's face, he is healed. As you administered the first antidote, what was the chance that it would be effective? This essay offers a systematic account of this case, and others like it. The central question is this: Given a certain time travel structure, what are the chances? In particular, I'll develop a theory about the connection between these chances and the chances in ordinary, time‐travel‐free contexts. Central to the account is a Markov condition involving the boundaries of spacetime regions.
This paper puts forward a new formulation of the experimentalist challenge to the method of cases. Unlike previous attempts to articulate the challenge, the one proposed here is based on a clear characterization of the targeted philosophical methodology. The method of cases is explicated as a form of thought experimentation aimed at testing philosophical hypotheses with a distinctive modal force. Given this explication, the empirical evidence gathered by experimental philosophers concerning the instability of case judgments is shown to constitute a stumbling block for the validation of the method as a rational form of hypothesis testing. The experimental challenge is presented as the challenge to compellingly indicate how this stumbling block can be removed, and the method of cases validated as a form of thought experimentation suitable for rationally testing philosophical hypotheses. The challenge, as presented here, is shown to be free from the main criticisms leveled at previous versions of it and to be one that has not yet been met.
This study maps the evolution of research themes on datafication, analyzing trends, key authors, interdisciplinary collaborations, and emerging topics from 1994 to 2023. The analysis reveals a notable increase in publication volume, particularly from 2014 onwards, reflecting advancements in digital technologies and heightened interest in data-driven research. A significant surge occurred during the COVID-19 pandemic, with 26.10% of total publications in 2022 and 30.52% in 2023 alone. Thematic clusters identified through keyword mapping include Social Media and Privacy, Artificial Intelligence and Machine Learning, Human Dimensions, and Infrastructure and Trust, highlighting diverse research foci. Emerging discussions on data justice and inequality reflect growing attention to the ethical and socio-political implications of datafication. The study also examines the types of documents and subject areas, revealing the dominance of peer-reviewed journal articles (71.41%) and a strong representation of social sciences (46.93%), computer science (14.75%), and arts and humanities (11.57%). Interdisciplinary connections underscore the broad impact of datafication across technology, healthcare, education, and media studies. This research offers insights into the dynamic nature of datafication, pointing to the need for further interdisciplinary collaboration, especially in addressing societal and ethical concerns such as data governance and digital inequality. Future research directions should focus on the human dimensions of datafication, data literacy, and the development of robust data governance frameworks to mitigate potential inequalities and power imbalances in a rapidly data-driven world.
Hubert Schmidbaur has significantly influenced the field of gold chemistry. His work on preparing various aurocyclic digold compounds and studying their structures and reactivities has laid the foundation for unique applications in photophysics and homogeneous catalysis. The naming and characterization, both experimental and theoretical, of the aurophilicity phenomenon have led to numerous interdisciplinary applications. The emergence and development of dynamic gold chemistry in the excited state exemplify this impact. Preparative methodologies, characterization techniques, and qualitative bonding theories have been tested through the rational preparation of ligated, element-centred gold clusters. The potential of this fascinating class of compounds remains largely untapped.
The article examines the concept of utopia in its post-Marxist context. Since the 1970s, against the backdrop of the failures of May 68, the self-exposures of the USSR, and the decline of the workers’ movement, as well as in accordance with the immanent logic of the history of philosophy itself, the concept of utopia has been moving through new areas of meaning and has become intensely dialectical in two modes: temporal and ontological. The first transforms utopia from never-being into the “past”; the second produces two inversions, treating utopia 1) as dystopia, the other of utopia, which is declared to be its hidden truth; and 2), when it is fundamentally possible according to its own concept, as impossible, whereby utopia and its concept return to discourse as a kind of empty place around which modern pessimism circles, correctly believing that the future is unimaginable. The time of ends, from the end of the grand narratives of disappointed radicals (Lyotard) to the end of politics (see Rancière’s analysis), is nonetheless taken up by Marcuse, who suggests considering utopia as ahistorical. The author presents this strange ahistoricism, or even anti-historicism, as itself historical, relying on the conceptualized phenomenon of the desynchronization of the “base” (the development of productive forces to the degree necessary for social revolution) and the “superstructure”, which runs into a limit because it cannot represent the restrained base that has broken out of the formational “scientific” logic. When Marcuse writes that a utopia in the strict sense can now be called a project that violates the laws of nature, he means the “impossible” into which utopia turns after the catastrophes of the twentieth century, betraying the truth of its concept, which consists in the simple possibility of another world. More than fifty years after “The End of Utopia” and almost thirty years after the ontological turn in philosophy, we can say that utopia is still unimaginable: in the strict sense, it is now what violates the laws of logic. This thesis opens up the possibility of a new dialectic and of its alliance with transcendentalism, which the author understands as a critique of plastic reason in the spirit of Malabou, constructing time and again the assumption-concepts it needs, concepts that are “practically necessary” in Kant’s sense.
Introduction. Responsible Research and Innovation (RRI) is increasingly crucial for addressing societal challenges and promoting sustainable economic growth. While RRI principles have been institutionalized in Europe, gender equality (GE) within this framework remains underexplored. GE policy in the European Union (EU), rooted in gender mainstreaming since 1999, addresses socially constructed roles shaped by intersecting factors like race and class. Recognized as a human right and vital for sustainable development, GE enhances participation, eliminates barriers, and integrates gender perspectives into research. This study examines the integration of GE within the RRI framework to address this gap. Method. A two-level bibliometric analysis was conducted using Scopus and Web of Science (WoS) databases, focusing on English-language, open-access articles published between 1985 and 2024. Following the application of exclusion criteria, a total of 2134 documents were analyzed, comprising 2045 in Phase 1 and 89 in Phase 2. Analysis. The analysis revealed a significant underrepresentation of GE within the RRI discourse. Despite a growing interest in RRI, there is a lack of meta-analytical studies focusing on GE, with research predominantly addressing broader aspects of RRI. Results. From the 2134 documents analyzed, only 89 explicitly addressed GE within the RRI context. Co-occurrence networks identified four primary RRI clusters emphasizing sustainability, governance, education, and ethics. GE-related keywords formed smaller, peripheral clusters, indicating marginal representation. The findings underscore a lack of systemic integration of GE into the broader RRI framework. Conclusions. The study highlights the critical need to prioritize GE within the RRI agenda by embedding it across all its dimensions. Addressing this gap will enhance the inclusivity, societal relevance, and ethical alignment of RRI initiatives. Policymakers and institutional leaders must champion GE as a foundational element of RRI to advance sustainable and equitable innovation.
The Dunning-Kruger effect has been reported widely in the literature, in academic and other contexts. The effect itself consists of four dimensions: (a) a mistaken appraisal of one's own performance; (b) a miscalibration in appraising how one's performance compares with that of the rest of the population; (c) a mistaken appraisal of others' performance, based on a personal criterion that already lacks support; (d) the paradoxical effect whereby improving one's own performance through specialized feedback from an expert also improves (in terms of accuracy) both self-assessment and the assessment of others. Performance data were analyzed from three classes that shared the same syllabus in Educational Psychology across different degree programmes. The assessment instrument was analyzed and, based on the results, the actions students took in response to their results were examined. The conclusions are discussed precisely in light of the Dunning-Kruger effect and its ramifications for the millennial generation, analyzing the educational system, university policies, and the repercussions for society of not having truly well-trained professionals.
Economics has undergone increasing mathematical formalization, which has led many to regard it as an exact science. This view, although methodologically valuable in terms of logical consistency, raises fundamental questions about the nature of the entities and models it employs. To what extent do economic concepts reflect real phenomena, or are they merely abstract constructions? Can economics be considered a discipline that accurately describes aspects of reality? Addressing these questions requires a gnoseological approach, that is, an analysis of the foundations of economic knowledge and their ontological implications. This article aims to reflect on the scientific character of the economic discipline. To that end, it explores the three modes of existence described by Romero (2020), namely material, formal, and fictional, as a framework for classifying economic entities and analyzing mathematical models from a critical perspective. In particular, it discusses the implications of formal existence in economics and assesses the challenges the discipline faces in trying to reconcile formal coherence with empirical relevance.
This article introduces the use of AI-replicas as an alternative to traditional anonymisation methods in image-based qualitative research. It emphasises the ethical and practical dilemmas posed by current anonymisation methods, such as the distortion or loss of emotional and contextual information in images, and proposes the use of AI-replicas to preserve the integrity and authenticity of visual data while ensuring participant anonymity. The article outlines the technological foundations of generative artificial intelligence (AI) and the practical application of Stable Diffusion to generate AI-replicas for anonymisation and fictionalisation purposes. Furthermore, it discusses the potential biases present in generative AI and suggests ways to mitigate these biases through careful prompt engineering and participatory approaches. The introduced approach aims to enhance ethical practices in visual research by providing a method that ensures participant anonymity without compromising the data's qualitative richness and interpretative validity.
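To make the described workflow concrete, the following is a minimal sketch of how an AI-replica could be generated with Stable Diffusion via the open-source Hugging Face diffusers library; the checkpoint name, file paths, prompt, and strength value are illustrative assumptions, not the pipeline used in the article:

```python
# Illustrative sketch only: produce a "replica" of a source photograph in which
# the depicted person is no longer identifiable while scene and mood are kept.
# Assumes the Hugging Face diffusers library and a CUDA-capable GPU.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # assumed model checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# Hypothetical participant image, resized to the model's native resolution.
source = Image.open("participant_photo.jpg").convert("RGB").resize((512, 512))

# The prompt restates the scene and emotional context that should be preserved.
# "strength" controls how far the replica departs from the original pixels:
# higher values anonymise more strongly but preserve less of the composition.
replica = pipe(
    prompt="an elderly woman sitting at a kitchen table, warm afternoon light, documentary photograph",
    image=source,
    strength=0.65,
    guidance_scale=7.5,
).images[0]

replica.save("participant_photo_replica.png")
```

In practice, prompt wording and the degree of transformation would be negotiated with participants and checked against the representational biases the article warns about, iterating until the replica keeps the relevant contextual cues without reproducing identifying features.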
This indispensable collection provides extensive, yet accessible, coverage of conceptual and practical issues in research design in personality and social psychology. Using numerous examples and clear guidelines, especially for conducting complex statistical analysis, leading experts address specific methods and areas of research to capture a definitive overview of contemporary practice. Updated and expanded, this third edition engages with the most important methodological innovations over the past decade, offering a timely perspective on research practice in the field. To reflect such rapid advances, this volume includes commentary on particularly timely areas of development such as social neuroscience, mobile sensing methods, and innovative statistical applications. Seasoned and early-career researchers alike will find a range of tools, methods, and practices that will help improve their research and develop new conceptual and methodological possibilities. Supplementary online materials are available on Cambridge Core.
This chapter presents two main approaches to the “project” of research; hypothesis-based research, and the somewhat opposite hypothesis-free research. We present the benefits and drawbacks of hypothesis-based research, showing how it provides a clear direction for research, yet all too often limits creativity and exploration. Additionally, we will discuss how hypothesis-based research has been used in various fields, such as medicine or psychology. We then contrast hypothesis-based to hypothesis-free research, an approach which allows for more creativity and exploration, yet seems to be less structured, and somewhat less scientific. We finish the chapter with an attempt to combine the two ways of thinking, such as using a hypothesis-based approach to guide initial research and then allowing for hypothesis-free exploration. The opposite could be done as well, using a hypothesis-free exploration to generate hypotheses.
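As a rough illustration of the combination sketched above (an illustrative toy example, not code from the chapter), the snippet below first runs a single pre-specified confirmatory test on simulated data and then an exploratory, hypothesis-free screen whose hits are treated only as candidates for future confirmatory studies; all variable names, effect sizes, and thresholds are assumptions made for the example:

```python
# Toy illustration: one pre-specified confirmatory test, followed by an
# exploratory screen used only to generate new hypotheses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated study: 100 participants, a treatment indicator, ten measured outcomes.
treated = rng.integers(0, 2, size=100).astype(bool)
outcomes = rng.normal(size=(100, 10))
outcomes[treated, 0] += 0.8  # build a genuine effect into outcome 0 only

# 1) Hypothesis-based step: test the single pre-registered hypothesis
#    that treatment shifts outcome 0.
t_stat, p_val = stats.ttest_ind(outcomes[treated, 0], outcomes[~treated, 0])
print(f"Confirmatory test, outcome 0: t = {t_stat:.2f}, p = {p_val:.4f}")

# 2) Hypothesis-free step: screen the remaining outcomes for unexpected
#    group differences. Any "hit" is only a lead, not a finding.
for k in range(1, outcomes.shape[1]):
    _, p_k = stats.ttest_ind(outcomes[treated, k], outcomes[~treated, k])
    if p_k < 0.05:
        print(f"Exploratory lead: outcome {k} (uncorrected p = {p_k:.3f})")
```

Keeping the two steps separate, and labelling exploratory results as leads rather than findings, is one simple way to retain the structure of hypothesis testing without giving up exploration.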
In the late 1960s, as an alternative to quantitative geography, an interest in qualitative methods awakened among geographers seeking to understand human-place relations and experiences. In recent years, the interest of human geographers in particular in qualitative methods has grown more than ever, both to better understand the behaviour of individuals and societies and to view events holistically. This study discusses the place and importance of the qualitative research method within the discipline of geography. Today, human geographers generally turn to qualitative methods to understand individuals' interactions with, and experiences of, place. Drawing on qualitative methods, they can explain their studies with stronger interpretation. One aim of the study is to discuss how qualitative research, which has steadily increased its influence in the social sciences over the past 50 years, is positioned within human geography. In addition, the theoretical and methodological contributions of qualitative research to human geography are discussed. The study's conclusion shows that this type of research has become more visible than ever in human geography and that the method makes important contributions to the science of geography. Moreover, the number of studies contributing to qualitative research within the discipline of geography is not small.