Article

The Function of Measurement in Modern Physical Science

... In this mold, scientific activity is a rational activity, distinguished from other forms of thought by its characteristic of separating social and collective interests from the results obtained through scientific practice (Feyerabend, 1984). In this sense, the consideration of Kuhn (1961) is relevant. On the other hand, the adoption of truth as an essential value for scientific practice can lead to a dogmatic or ideological rigidity of science within society. Historically, both natural philosophy and modern science were regarded as activities of enlightenment against other, previously existing ideological types of explanation. ...
... These books and teaching materials take on the function of disseminating scientific knowledge, being used to bring people closer to the validated information at the frontier of knowledge, as well as to introduce students to a particular scientific field. These textbooks and educational materials present tables, graphs, and other visual elements as if they constituted relevant tests of the theory in question. In this sense, Kuhn (1961) points out that these elements normally agree with the theory in question, and their presence only reinforces the scientific community's belief in the validity of that theory. ...
... In scientific practice, this position is assumed in relation to the theories shared and accepted by the scientific community. Scientists take up the complex and meticulous position of the available base in accordance with the most recent theoretical advances, providing for the subsequent advances (KUHN, 1961). In fact, it is not possible to give an empirical justification for scientific assertions, which leads only to the position of tentatively preferring one theory over another. ...
Book
Full-text available
At all educational levels, science manuals and teaching materials present scientific theories and technical knowledge as information validated by science and held with a good degree of confidence within each scientific field, suggesting the most advanced stage of knowledge. In essence, science seeks true explanations of observable phenomena, but from a fallibilist perspective the scientific enterprise has no means of guaranteeing their truth, working only with approximately true theories. Scientific knowledge exists, but it is provisional. The problem emerges when textbooks and science manuals present these theories not as something provisionally accepted by the scientific community but as nearly unquestionable knowledge, whose dissemination adopts a dogmatic and doctrinaire posture. In the way education is instituted in our society, it emerges as an institutionalized instrument through which, from children to specialists, the memorization of knowledge is prioritized over freedom of thought and the stimulation of critical reasoning, through the use of these scientific materials and books. The conclusion of this work suggests that, since there is no way to guarantee the truth of a scientific theory, only its provisional character backed by good confidence in its current results, teaching materials, regardless of educational level, should present scientific theories from the fallibilist view of science, seeking to stimulate critical rationalism, freedom of thought, and creative power.
... This is what T. Kuhn defines as paradigm-governed research [10][11][12][13][14][15][16]. Typical works. ...
... We recall T. Kuhn and his conception [10][11][12][13][14][15][16]. His ideas concerning the solving of puzzle problems are important. ...
Preprint
Full-text available
The central concept of infectious ecology posits that an infectious disease is a manifestation of a phenomenon occurring at a distinct level of natural order, akin to a shadow. By examining only the infectious shadow, we are limited in our ability to gain substantial knowledge of the underlying process responsible for its creation; relying on it alone inevitably leads to total cognitive failure. Water is the primary habitat for the occurrence of pathogenic microorganisms. By applying the cognitive framework of infectious ecology, it is possible to develop a completely new classification system for freshwater, one that takes into consideration the perspective of microorganisms and the expression of pathogenicity. The scientific community's approach to infection ecology is characterized by a conspicuous lack of communication: there is little in-depth analysis and debate around the concept, and no reaction to criticism. What is the issue? An in-depth analysis is conducted of the reasoning of proponents of dogmatic epidemiology. Epidemiologists and microbiologists have long conducted extensive searches for reservoir species in their quest to understand underlying causes, yet they work with an exceptionally ambiguous concept. How is this possible in contemporary scientific research? Adherents of the prevailing paradigm fail to consider the constraints of their methodology; it is universally acknowledged as the sole and entirely conventional choice for scientific study, and all scientific alternatives are disregarded. An illustrative case for this study is the rationality of formulating and addressing the question of the many categories of reservoir species linked to the COVID-19 pandemic. There are two essentially distinct perspectives on pandemic processes. First, the pandemic process revolves around a singular focal point: infection dissemination originates at this point, at a specific moment it transitions into a pandemic, and various biological species have the potential to be infected. Second, the pandemic process lacks a singular focal point: we are dealing with a polycentric process. There is indeed a natural spread of illness among populations of specific species, but its impact is restricted. The aquatic environment is the most probable setting for pathogenicity to manifest; the reasons can be attributed to physical and chemical signals, which result in a temporary expression of the pathogenic properties of certain microorganisms. Within the framework of dogmatic epidemiology, there is no way to explain mass infectious processes; for this reason, expert time is spent on pseudo-problems. The reasons for such a long pandemic process as COVID-19 can be explained on the basis of the adaptation hypothesis, a system of terms and concepts used in infectious ecology. The adaptation hypothesis needs development, and it is necessary to carry out expeditionary and experimental work.
... Despite popular depictions of research as an objective pursuit of truth, we recognize research as social praxis: a process inherently shaped by contemporary norms and historical context (Hacking, 1983; Hochstein, 2019; Kuhn, 1961). Early in the development of a field, certain norms and design choices can help scope feasible research questions, providing an important starting point for researchers and scientists. ...
Preprint
Full-text available
Algorithmic fairness has emerged as a critical concern in artificial intelligence (AI) research. However, the development of fair AI systems is not an objective process. Fairness is an inherently subjective concept, shaped by the values, experiences, and identities of those involved in research and development. To better understand the norms and values embedded in current fairness research, we conduct a meta-analysis of algorithmic fairness papers from two leading conferences on AI fairness and ethics, AIES and FAccT, covering a final sample of 139 papers over the period from 2018 to 2022. Our investigation reveals two concerning trends: first, a US-centric perspective dominates throughout fairness research; and second, fairness studies exhibit a widespread reliance on binary codifications of human identity (e.g., "Black/White", "male/female"). These findings highlight how current research often overlooks the complexities of identity and lived experiences, ultimately failing to represent diverse global contexts when defining algorithmic bias and fairness. We discuss the limitations of these research design choices and offer recommendations for fostering more inclusive and representative approaches to fairness in AI systems, urging a paradigm shift that embraces nuanced, global understandings of human identity and values.
... Statistically, variables are either latent (problematic) or not (unproblematic), but some realists (particularly Sayer [53]) argue it is more of a spectrum. All measurement is theory-laden; that is, predicated upon theoretical assumptions made by researchers when designing their studies and choosing what and how to measure [57, 34]. For example, to count the bugs found by the BugFinder3000 we need a theory of what is and is not a bug. ...
Preprint
While the methodological rigor of computing research has improved considerably in the past two decades, quantitative software engineering research is hampered by immature measures and inattention to theory. Measurement, the principled assignment of numbers to phenomena, is intrinsically difficult because observation is predicated upon not only theoretical concepts but also the values and perspective of the researcher. Despite several previous attempts to raise awareness of more sophisticated approaches to measurement and of the importance of quantitatively assessing reliability and validity, measurement issues continue to be widely ignored. The reasons are unknown, but differences in typical engineering and computer science graduate training programs (compared to psychology and management, for example) may be involved. This chapter therefore reviews key concepts in the science of measurement and applies them to software engineering research. A series of exercises for applying important measurement concepts to the reader's research is included, and a sample dataset for the reader to try some of the statistical procedures mentioned is provided.
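The chapter's exercises and dataset are not reproduced here, but as a hedged illustration of the kind of reliability assessment the abstract alludes to, the sketch below computes Cronbach's alpha from a respondents-by-items score matrix. The function and the demo data are assumptions made for this example, not material from the chapter.

```python
# Minimal sketch: Cronbach's alpha for internal-consistency reliability.
# The data below are invented; only the formula is standard.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: (n_respondents, n_items) matrix of item scores."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of scale totals
    return (k / (k - 1)) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(30, 8)).astype(float)  # 30 people, 8 Likert items
print(f"alpha = {cronbach_alpha(demo):.2f}")
```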
... All this and much else is done during the establishment of a new paradigm. There is nothing new here [2][3][4][5][6][7][8]. ...
Preprint
Full-text available
A detailed critical analysis of the epidemiological standard for the study of pasteurellosis. The case of the death of Loxodonta africana in Zimbabwe and the mass death of S. t. tatarica in Asia are considered. The cognitive limitations of traditional epidemiology are shown. Epidemiologists shockingly ignore the standards of modern science for collecting data on water, soil, and vegetation, yet all this information is absolutely necessary for understanding cases of mass animal deaths from pasteurellosis. The failure to use digital mapping to collect and process data on infectious incidents is discussed in detail. The reason for ignoring the achievements of modern science is epidemiologists' drastically simplified understanding of nature and their neglect of the impact of the natural environment on the manifestations of pasteurellosis. In traditional epidemiology, analysis cannot go any further than the diagnosis of animal death; there is no explanation for the occurrence of mass pasteurellosis. The work is not merely critical: the cognitive standard of infectious ecology is examined in detail precisely for the case of mass death from pasteurellosis. In infectious ecology, the working hypothesis for the mass deaths of S. t. tatarica, S. t. tatarica mongolica, and Loxodonta africana from pasteurellosis is based on the periodic appearance of an invasive physico-chemical signal leading to hyper-virulence. The result is an infectious disease that is fatal. The direct impact of invasive physico-chemical signals on microorganisms with pathogenic properties, including Pasteurella multocida, causes infectious diseases. A detailed study of the problem from fundamentally new scientific positions is possible. A new standard for field and experimental work on pasteurellosis is introduced. Infectious ecology makes it possible to answer a large number of questions that were previously left unanswered, or not even formulated, in epidemiology and veterinary medicine. These questions concern the emergence of MASSIVE infectious processes. Keywords: S. t. tatarica, S. t. tatarica mongolica, Loxodonta africana, pasteurellosis, morphological hypothesis, Pasteurella multocida, emergence of massive infectious processes.
... The point is not whether the data fit the model perfectly; the point is whether the approximation can be made useful [16], [18]. Contrary to popular opinion, measurement is not performed for the purpose of discovering laws; rather, the ability to measure is a function of the laws already embedded within instruments [28]. ...
Article
Full-text available
Data from the Kering Group's 2018 Environmental Profit and Loss (EP&L) statement were examined for their capacity to meet the demand for meaningful and manageable sustainability metrics. Significant resources were invested in creating the data reported in this EP&L statement, as Kering's operations in 104 countries were evaluated in ways separable into almost 1,500 different indicators. The data system was not, however, designed as a measurement system. That is, it was not set up as specifically positing the possibility of estimating separable parameters for comparing company location performances across sustainability challenges. Of particular importance is the lack of information in the EP&L on the overall consistency of the data reported, on the uncertainties associated with the metrics given, and on the meaningfulness of comparisons across environmental impacts, processes, and materials. The results reported here showing far from perfect data consistency and large uncertainties comprise an effort at constructing meaningful measurements that offers important lessons for the redesign of the data and reporting system.
... 10 If it is maintained that properties of objects do not inherently have values, the representation may be chosen according to different criteria and is required to be at least consistent: if properties of objects are observed to be ordered then their assigned values should be ordered in turn, but any ordered set would be suitable to perform such a purely symbolic task, and so on. However, a stronger ontology invites interpretation of advancements in measurement-related knowledge and practices as an evolutionary process: at the beginning the available information could be so "meager and unsatisfactory" (quoting Lord Kelvin; see the related discussion in Kuhn, 1961) that the evaluation results are more or less everything that is known of the considered property, and therefore consistency in the representation is the only condition that can be sought. Such an approach could be later abandoned with the acquisition of more and better information, leading to corroboration of the hypothesis of the very existence of the property, up to the extreme position that the measurand has a knowledge-independent true value, to be estimated through measurement. ...
Chapter
Full-text available
This chapter aims to explore some key components of an ontology and an epistemology of properties. What is evaluated, and more specifically measured, are properties of objects, such as lengths of rigid bodies and reading comprehension abilities of individuals, and the results of evaluations, and thus of measurements, are values of properties. Hence, a study of the nature of properties and of our ways of securing knowledge of them is a pivotal component of measurement science. We start from the hypothesis that properties of objects are associated with modes of empirical interaction of the objects with their environment. Consistently with the model-dependent realism introduced in Chap. 4, the Basic Evaluation Equation, of which the relation measurand = measured value is a specific case, is interpreted as a claim of an actual referential equality, which conveys information on the measurand because the measurand and the measured value remain conceptually distinct entities.
... Capturing complex phenomena in a single (or a few) controlled quantifiable variable(s) can be misleading since this imposes certain constraints on results and may neglect important findings (Weber, 2004; Keiller, 2005). Cicourel (1964) and Kuhn (1961) argue that the weaknesses of positivism have paved the way for a new paradigm which suggests that "all knowledge is socially constructed and a product of particular historical context within which it is located" (Oliver, 1992, p. 106). Any social science research should endeavour to understand the meanings of phenomena, causes, effects and values developed within that social phenomenon. ...
Article
There is a germane relationship between qualitative and quantitative approaches to social science research, one empirically and theoretically demonstrated by poverty researchers. The study of poverty, as argued in this article, is a study of both numbers and contextualities. This article provides a general overview of qualitative and quantitative approaches to poverty studies and argues that only a combination of the two approaches, where necessary, would provide robust, rich and reliable data for researching issues of poverty. Hence, the contemporary drive towards a mixed methods approach in poverty research is not only welcome but certainly timely as well. Thus, understanding ontological and epistemological paradigms about social sciences is imperative in dousing such tensions.
... As a rule, scientists are engaged in routine research: that which, in T. Kuhn's terminology, is defined as "normal science" [6][7][8][9][10][11][12]. No one intends to change anything. ...
Preprint
Full-text available
We are dealing with a fundamental novelty. It is connected with the existence of an openly invasive concept focused on the purposeful destruction of the Westphalian order: the Russian concept of relativistic political geography. It cannot be defined as scientific knowledge, but it is an entirely sufficient scientific basis for aggression against other post-Soviet states. The authorship of the concept is collective; there are 5-7 experts who can be classified as authors. The place of origin of the concept is associated with a number of institutes of the Russian Academy of Sciences. As of the beginning of 2023, there is an expert community that works within the framework of this concept: about 50-60 people holding PhDs in various sciences, including at least 15 professors. They are engaged precisely in developing the concept and applying it in various directions. The structure of this expert community is not rigidly centralized and does not repeat the structure of the academic institutions of the Russian Federation; it resembles the structure of the Mexican drug cartel Calais. In Putin's Russia, there is a very specific version of the organization of science. This is something new, and the effectiveness of the novelty is still difficult to assess. The relativistic concept of political geography and some other concepts of this type are part of Russian propaganda. They are categorically not intended for explicit use; rather, such concepts are the basis for the activity of a huge number of propagandists of Putin's Russia. They underlie all geographical education in the Russian Federation (schools and universities) and form the basic geographical worldview of the Russian population. Before the war of 2022, the relativistic concept of political geography was difficult to discern for many reasons. Probably something similar took place in the case of Nazi science: totalitarian states that have decided to expand their territories of control have much in common in the organization of science. The war of the Russian Federation against Ukraine, launched in 2022, is one example of the consistent implementation of the concept of relativistic political geography. The shocking absurdity of this war becomes much clearer upon close examination of the relativistic conception of political geography, and the geography of this war also becomes much more understandable. It is necessary to change the terms in which what is happening is evaluated: we are dealing with a fundamentally new type of war, not with senseless and chaotic decisions. The 2022 war of the Russian Federation against Ukraine is not only, and maybe not so much, a war of the Russian Federation against one neighboring state. It is a new type of war based on a new philosophy, one focused on the progressive deterioration of the functioning of the Westphalian system of world order. To what extent will the new philosophy of war be given the opportunity to develop? This has long-term consequences. From my point of view, the response to the destructive activity of the Russian Federation, both scientific and practical, should be consistent and radical. I evaluate the development of such concepts as a war crime, as the substantiation of a new type of war.
... As has been established at least since the work of Kuhn [182,183] and Toulmin [323,324], methods always entail presuppositions that cannot be explicitly formulated and tested. The sense of method that focuses on following rules must necessarily always fall short in its efforts at explanatory power and transparency [137]. ...
Chapter
Full-text available
An historic shift in focus on the quality and person-centeredness of health care has occurred in the last two decades. Accounts of results produced from reinvigorated attention to the measurement, management, and improvement of the outcomes of health care show that much has been learned, and much remains to be done. This article proposes that causes of the failure to replicate in health care the benefits of “lean” methods lie in persistent inattention to measurement fundamentals. These fundamentals must extend beyond mathematical and technical issues to the social, economic, and political processes involved in constituting trustworthy performance measurement systems. Successful “lean” implementations will follow only when duly diligent investments in these fundamentals are undertaken. Absent those investments, average people will not be able to leverage brilliant processes to produce exceptional outcomes, and we will remain stuck with broken processes in which even brilliant people can produce only flawed results. The methodological shift in policy and practice prescribed by the authors of the chapters in this book moves away from prioritizing the objectivity of data in centrally planned and executed statistical modeling, and toward scientific models that prioritize the objectivity of substantive and invariant unit quantities. The chapters in this book describe scientific modeling’s bottom-up, emergent and evolving standards for mass customized comparability. Though the technical aspects of the scientific modeling perspective are well established in health care outcomes measurement, operationalization of the social, economic, and political aspects required for creating new degrees of trust in health care institutions remains at a nascent stage of development. Potentials for extending everyday thinking in new directions offer hope for achieving previously unattained levels of efficacy in health care improvement efforts.
... In the natural sciences, advanced measurement is derived from the scientific, theoretical understanding of the relevant variables and their relationships [26]. Direct reading of measurements from the instrument hides the substantive theory and design that manifests the property measured and controls properties that disturb it. ...
Chapter
Full-text available
The concept of measurement in which the magnitude of a property is quantified in a common unit relative to a specified origin is a deep abstraction. This chapter shows the application of measurement in a social science context where the motivation is transparency and equity rather than the advancement of scientific laws. However, to achieve these, the realization of measurement needs to be no less rigorous than it is in the advancement of scientific laws. Rasch measurement theory provides the basis for such rigor. The context in this chapter is competitive selection into universities in Western Australia based on a summary performance on a series of instruments which assess achievement in a range of discipline areas. Such selection tends to determine life opportunities; therefore to ensure consistency and fairness, performances on different instruments need to be transformed into measurements which are in the same, explicit unit relative to a specified origin. Because the illustrative context is complex, it is considered that the Rasch measurement theory applied in this chapter could be applied to a range of social contexts where assessments on different instruments need to be transformed to measurements in a common unit referenced to a common origin and where the focus is on making decisions at the person level.
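The abstract does not spell out the model, but as a brief sketch of the machinery it invokes: in the dichotomous Rasch model, person ability and item difficulty lie on one logit scale, and a subsequent linear rescaling can express measures from any calibrated instrument in a chosen common unit relative to a chosen origin (the symbols here are conventional, not quoted from the chapter):

$$\Pr(X_{ni}=1)=\frac{e^{\beta_n-\delta_i}}{1+e^{\beta_n-\delta_i}},\qquad \beta_n^{*}=\frac{\beta_n-o}{u},$$

where β_n is the ability of person n, δ_i the difficulty of item i, and o and u the agreed origin and unit of the common metric.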
Article
Full-text available
This paper challenges “biodiversity skepticism:” an inferential move that acknowledges the proliferation, heterogeneity, and lack of covariance of biodiversity measurements, and concludes that we should doubt the scientific validity of the biodiversity concept. As a way out of skepticism, philosophers have advocated for eliminating “biodiversity” from scientific inquiry, revising it, or deflating its meaning into a single measurable dimension. I present a counterargument to the inferential move of the skeptic by revealing how it stands on two unstated premises, namely a reflective view of measurements and the unidirectional dynamics between definitional and measurement practices, and corollary assumptions. These premises and assumptions are misaligned with a richer theoretical understanding of measurement and are sometimes inconsistent with how science operates. A more nuanced view of measurement could better explain measurement proliferation while being consistent with new ways in which the general biodiversity concept could be useful. To conclude, I urge philosophers of measurement and conceptual engineers to collaborate in tackling the interplay between conceptual change and measurement practices.
Article
Full-text available
Through interviews with workers at the Ljubljana District Court Condominium Department, the article presents their work and practical knowledge. It shows that practical knowledge includes not only embodied skills but also internalised cognitive schemata, manners of reasoning, and knowledge of explicit information. Furthermore, practical knowledge is not merely the knowledge of an individual. Through shared expressions, forms, models, conversations and meetings, it develops as a collective knowledge of a given workplace – a common culture of employees who under similar working conditions experience similar life situations. The first of two articles analyses the division of labour as well as cognitive schemata and manners of reasoning that employees master through their work. It begins by outlining the division of labour, management structure, organisation of work, different categories of law workers, their assignments, work quotas, and techniques of workplace monitoring. I focus on five main categories of employees: judges, judicial assistants, court clerks, typists, and registrar workers. The other two chapters analyse the forms of practical knowledge overlooked in theoretical approaches that have reduced practical knowledge to embodied knowledge and habitus. First, I present the cognitive schemata or types developed and internalised by different categories of employees, such as ‘common area’, ‘fictitious co-ownership’, etc. Second, I highlight that employees also master through their work explicit information and manners of reasoning, like the content of statutory provisions and connections between them, which they then effortlessly apply in their practice. As opposed to embodied knowledge, which cannot be made explicit, these forms of practical knowledge can be worded, codified, and learned through conversation or reading. They thus challenge the assumption of embodied knowledge as a model for all practical knowledge. In addition the schemata, information, and manners of reasoning, which employees internalise, vary according to the material conditions of their work, because division of labour leads them to experience different concrete situations in which they acquire and use their knowledge.
Book
Full-text available
[[COMPLETE SECOND ISSUE OF METASCIENCE]] This second issue of the journal Mεtascience continues the characterization of this new branch of knowledge that is metascience. If it is new, it is not in a radical sense, since Mario Bunge practiced it in an exemplary way, since logical positivists were accused of practicing only a mere metascience, since scientists have always practiced it implicitly, and since some philosophers no longer practice philosophy but rather metascience, yet without characterizing it or theorizing it, that is, without realizing that they have abandoned one general discourse for another. The novelty therefore lies in this awareness that a general discourse without philosophy is possible: a scientific general discourse. The twelve contributions gathered in this volume illustrate the metascientific approach to knowledge of the world as well as to knowledge of knowledge of the world, that is, science. And like Bunge's project, they are part of neither the analytical movement nor the continental movement in philosophy. We will read here studies about the Bungean system, some applications of Bungean thought, some metascientific contributions, and some reflections around metascience. Among metascientific disciplines, ontology occupies a prominent place in this issue of Mεtascience. Metascience differs from philosophy in its rejection of the fundamental philosophical distinction between appearance and reality; metascientific ontology therefore does not postulate the existence of any metaphysical reality. But metascientific ontology, no more than philosophical ontology, is a factual science: the first, because it studies scientific constructs and not concrete objects; the second, because it is interested in transcendent or metaphysical objects.
Chapter
Full-text available
Theory and experiment went hand in hand in the work of Lord Rayleigh, in which the quest for rigor was a ubiquitous theme. To Rayleigh’s mind, though, and in contrast to mathematicians, physicists could proceed in their investigations without seeking absolute rigor. In his experimental practice, pursuing rigor involved the application of control strategies, which pervaded his work at various levels. Moreover, experimental control had various aims, such as standardizing measurement units in determining the ohm and validating experimental results in the discovery of argon. In the former case, Rayleigh and his team varied the design of their apparatus to control the experimental conditions. Dealing with errors was the main aim of their control practices and lay at the heart of their methodology. In the latter case, control was present in every step of the discovery process: the detection of discrepancies between the densities of atmospheric and “chemical” nitrogen, the identification of argon as a constituent of the atmosphere, and the subsequent exploration of its properties. The aim of this paper is to investigate and contrast the strategies of control employed in those two cases and to clarify their various purposes.
Chapter
Full-text available
In this chapter, we consider the factors that propel or hinder collaboration in the context of interdisciplinary research. We build on the current literature regarding the practical and unique elements of collaboration in interdisciplinary settings by examining the experiences of researchers via large-scale survey results and in-depth interviews. The chapter provides considerations and suggestions regarding aspects of collaborative work in interdisciplinary settings, including team size, collaboration experience, team-member roles, unique skills, and the influence of disciplines.
Article
Full-text available
Slovenian epistemology is characterised by an idiosyncratic canon, based on three fundamental authors: Gaston Bachelard, Alexandre Koyré, and Thomas Kuhn. What binds this canon together is the attitude that the history of science should be viewed as a history of radical breaks or revolutions in scientific thought. The drawback of such an anthology of authors is not only that it is outdated, but that, from the position of this canon, it is difficult to discern the problems stemming from the approach to history of science it endorses. In order to highlight the blind spots of Slovenian epistemology, I examine a different interpretative trajectory of the notion of scientific revolution in the United Kingdom and the United States. The methodological program, which focuses on revolutionary breakthroughs in scientific ideas, is thus re-evaluated by considering its strategic role in the institutionalisation of the history of science as a discipline. The event that marked the translation of French epistemology into English history of science was the series of lectures delivered by the delegates from the Soviet Union at the Second International Congress of the History of Science and Technology in 1931. In post-war Anglo-American history of science, the concept of scientific revolution was adopted as a means of excluding Marxist studies of science – which had spread under the influence of the Soviet delegation – from the realm of acceptable discourse. I demonstrate how the Cold War historiography of science introduced a divide between the "internal" and "external" factors of scientific development, and their supposed synthesis in the work of Thomas Kuhn. Finally, I review the critiques that the sociology of scientific knowledge levelled at the story of the Scientific Revolution and point out the importance of these controversies for current epistemological research.
Chapter
Full-text available
This chapter aims to outline the technical and cultural contexts in which measurement systems, as presented in the previous chapter, are designed, set up, and operated. It first introduces the basic proposal that a measurement should produce as its result not only one or more values of the property under consideration but also some information on the quality of those values, and discusses the consequences in terms of measurement uncertainty. This proposal is then embedded in the broader context of metrological systems, which help justify the societal significance of measurement results via their traceability to conventionally defined measurement units, so that measurement results can be interpreted in the same way by different persons in different places and times. Finally, we consider the issue of what is measured, i.e., the property of a given object, or measurand, which must be somehow defined and identified. On this basis, the chapters that follow develop and bring further specificity to our analysis and proposals. As with the previous chapter, we believe that the contents of this chapter should be sufficiently uncontroversial to be read and accepted by most, if not all, researchers and practitioners.
Article
It is nothing new that most economists look down on the history of economic thought. Professional debates on the institutional role of the history of ideas are held almost every year; what is more, a 2001 conference documented the field's precarious state in North America and its even more endangered position in the United Kingdom and Australia (Weintraub 2002b). With the exception of Duke University, at no "elite" university in North America is there a regular course in the history of economic thought, or even a program (Gayer 2002).
Article
Full-text available
The study is a descriptive research design aimed at studying the pattern of the distribution of weight and height among the students of Demonstration Secondary School Eziagu (the secondary school arm of Federal College of Education Technical, Umunze). The students' weight and height provide the complete variables necessary for calculating Body Mass Index (BMI), also known as the 'Quetelet index', which reveals more about one's overall health risks. The students were aged ten (10) to twenty-two (22) years. All students of the school live in the school hostels and share common feeding, similar school-organized games, and some other lifestyle factors, with the exception of the fast or junk foods which students bring from their homes to supplement school foods. Eight hundred and thirteen (813) students, four hundred and forty-three (443) males and three hundred and seventy (370) females, were used; this figure comprised the total number of students available during data collection. Medical weighing scales with mechanical height rods (stadiometres) were used to measure the height and weight of students in one step. The results were analyzed using the mean, standard error of the mean, skewness, kurtosis, the independent-sample t-test, and the Kolmogorov-Smirnov Z-test. The analysis showed that (1) the mean height and weight of the students were 158.70 cm and 48.85 kg respectively; (2) the mean height and weight of males were 159.50 cm and 48.73 kg respectively; (3) the mean height and weight of females were 157.74 cm and 49.00 kg respectively; (4) males differed significantly from females in height but not in weight at the 0.05 level of significance; (5) the distribution of height did not differ significantly from normality (Kolmogorov-Smirnov Z = 1.13, p = 0.16 > 0.05); however, the distribution of weight differed significantly from normality (Kolmogorov-Smirnov Z = 1.85, p = 0.002 < 0.05).
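As a hedged illustration of the two computations the abstract reports, the sketch below evaluates the Quetelet index from weight and height and runs a Kolmogorov-Smirnov test of normality. The study's raw data are not reproduced, so the sample is simulated; only the formulas are standard.

```python
# Illustrative only: BMI formula plus a K-S normality test on simulated data.
import numpy as np
from scipy import stats

def bmi(weight_kg: float, height_cm: float) -> float:
    """Quetelet index: weight (kg) divided by squared height (m)."""
    return weight_kg / (height_cm / 100.0) ** 2

print(f"BMI at the reported sample means: {bmi(48.85, 158.70):.1f} kg/m^2")

# Hypothetical heights for 813 students (the study's raw data are unavailable).
rng = np.random.default_rng(1)
heights = rng.normal(loc=158.7, scale=8.0, size=813)
d, p = stats.kstest(heights, "norm", args=(heights.mean(), heights.std(ddof=1)))
z = np.sqrt(len(heights)) * d  # the sqrt(n)-scaled "Z" some packages report
print(f"K-S D = {d:.3f}, Z = {z:.2f}, p = {p:.3f}")
```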
Article
Hollywood applauds Glendon Schubert's script for a possible soap opera in his pre-view of Approaches to the Study of Political Science. Stereotypic characters, miscasting of actors, words put in the mouths of fictitious persons, and maudlin grief are all present in his poignant ad hominem remarks, while a task which I regard as important — the development of a science of politics — is curiously overlooked by Mr. Schubert as the main purpose of the volume. And, in a manner similar to David Easton's eloquent 1969 presidential address, it is important to tune out Mr. Schubert's solipsistic One Man's Family and to discuss instead the real reasons for all the current fuss about postbehavioral options. My own view is that political science has achieved considerable maturity as a discipline in recognizing a fundamental symbiosis between three facets of science as the 1970s begin. At one level, a scientist may seek to describe an individual case, to calibrate measuring instruments, and to engineer specific changes in the real world. At a second level, a scientist can search for relationships between two or more variables across several cases in order to state generalizations that will serve as guides to the future and to cases as yet unexamined. Yet myriad generalizations do not cumulatively add up to higher and higher levels of scientific achievement until we consider a third facet of science, wherein one seeks analytical explanations for empirical findings and smooths out the idiosyncrasies of particular research investigations into analytically parsimonious paradigms, models, and theories concerning how the world is put together. These three levels or types of science may be called clinical, empirical, and theoretical, respectively.
Article
Thomas Kuhn's The Structure of Scientific Revolutions offers an insightful and engaging theory of science that speaks to scholars across many disciplines. Though initially widely misunderstood, it had a profound impact on the way intellectuals and educated laypeople thought about science. K. Brad Wray traces the influences on Kuhn as he wrote Structure, including his 'Aristotle epiphany', his interactions, and his studies of the history of chemistry. Wray then considers the impact of Structure on the social sciences, on the history of science, and on the philosophy of science, where the problem of theory change has set the terms of contemporary realism/anti-realism debates. He examines Kuhn's frustrations with the Strong Programme sociologists' appropriations of his views, and debunks several popular claims about what influenced Kuhn as he wrote Structure. His book is a rich and comprehensive assessment of one of the most influential works in the modern sciences.
Chapter
Full-text available
We argue that a goal of measurement is general objectivity: point estimates of a person’s measure (height, temperature, and reader ability) should be independent of the instrument and independent of the sample in which the person happens to find herself. In contrast, Rasch’s concept of specific objectivity requires only differences (i.e., comparisons) between person measures to be independent of the instrument. We present a canonical case in which there is no overlap between instruments and persons: each person is measured by a unique instrument. We then show what is required to estimate measures in this degenerate case. The canonical case encourages a simplification and reconceptualization of validity and reliability. Not surprisingly, this reconceptualization looks a lot like the way physicists and chemometricians think about validity and measurement error. We animate this presentation with a technology that blurs the distinction between instruction, assessment, and generally objective measurement of reader ability. We encourage adaptation of this model to health outcomes measurement.
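To make the contrast concrete (the notation is conventional, not quoted from the chapter): under the Rasch model the log-odds of success are the difference between the person measure and the item calibration, so comparing two persons on any shared item cancels the item parameter; that is specific objectivity. General objectivity further requires that δ_i be fixed in advance, by theory or reference instrumentation, so that a single person's measure is point-estimable on a fixed scale:

$$\ln\frac{P_{ni}}{1-P_{ni}}=\beta_n-\delta_i,\qquad \ln\frac{P_{ni}}{1-P_{ni}}-\ln\frac{P_{mi}}{1-P_{mi}}=\beta_n-\beta_m.$$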
Chapter
Full-text available
In an argument whereby "… individual-centered statistical techniques require models in which each individual is characterized separately and from which, given adequate data, the individual parameters can be estimated".
Chapter
Full-text available
The field of career education measurement is in disarray. Evidence mounts that today’s career education instruments are verbal ability measures in disguise. A plethora of trait names such as career maturity, career development, career planning, career awareness, and career decision making have, in the last decade, appeared as labels to scales comprised of multiple choice items. Many of these scales appear to be measuring similar underlying traits and certainly the labels have a similar sound or “jingle” to them. Other scale names are attached to clusters of items that appear to measure different traits and at first glance appear deserving of their unique trait names, e.g., occupational information, resources for exploration, work conditions, personal economics. The items of these scales look different and the labels correspondingly are dissimilar or have a different “jangle” to them.
Chapter
Full-text available
Growth in reading ability varies across individuals in terms of starting points, velocities, and decelerations. Reading assessments vary in the texts they include, the questions asked about those texts, and in the way responses are scored. Complex conceptual and operational challenges must be addressed if we are to coherently assess reading ability, so that learning outcomes are comparable within students over time, across classrooms, and across formative, interim, and accountability assessments. A philosophical and historical context in which to situate the problems emerges via analogies from scientific, aesthetic, and democratic values. In a work now over 100 years old, Cook's study of the geometry of proportions in art, architecture, and nature focuses more on individual variation than on average general patterns. Cook anticipates the point made by Kuhn and Rasch that the goal of research is the discovery of anomalies—not the discovery of scientific laws. Bluecher extends Cook’s points by drawing an analogy between the beauty of individual variations in the Parthenon’s pillars and the democratic resilience of unique citizen soldiers in Pericles’ Athenian army. Lessons for how to approach reading measurement follow from the beauty and strength of stochastically integrated variations and uniformities in architectural, natural, and democratic principles.
Chapter
Full-text available
One must provide information about the conditions under which [the measurement outcome] would change or be different. It follows that the generalizations that figure in explanations [of measurement outcomes] must be change-relating… Both explainers [e.g., person parameters and item parameters] and what is explained [measurement outcomes] must be capable of change, and such changes must be connected in the right way (Woodward, 2003). Rasch’s unidimensional models for measurement tell us how to connect object measures, instrument calibrations, and measurement outcomes. Substantive theory tells us what interventions or changes to the instrument must offset a change to the measure for an object of measurement to hold the measurement outcome constant. Integrating a Rasch model with a substantive theory dictates the form and substance of permissible conjoint interventions. Rasch analysis absent construct theory and an associated specification equation is a black box in which understanding may be more illusory than not. The mere availability of numbers to analyze and statistics to report is often accepted as methodologically satisfactory in the social sciences, but falls far short of what is needed for a science.
Chapter
Full-text available
A metrological infrastructure for the social, behavioral, and economic sciences has foundational and transformative potentials relating to education, health care, human and natural resource management, organizational performance assessment, and the economy at large. The traceability of universally uniform metrics to reference standard metrics is a taken-for-granted essential component of the infrastructure of the natural sciences and engineering. Advanced measurement methods and models capable of supporting similar metrics, standards, and traceability for intangible forms of capital have been available for decades but have yet to be implemented in ways that take full advantage of their capacities. The economy, education, health care reform, and the environment are all now top national priorities. There is nothing more essential to succeeding in these efforts than the quality of the measures we develop and deploy. Even so, few, if any, of these efforts are taking systematic advantage of longstanding, proven measurement technologies that may be crucial to the scientific and economic successes we seek. Bringing these technologies to the attention of the academic and business communities for use, further testing, and development in new directions is an area of critical national need.
Chapter
Full-text available
There is nothing wrong with the NAEP reading exercises, the sampling design, or the NAEP Reading Proficiency Scale, these authors maintain. But adding a rich criterion-based frame of reference to the scale should yield an even more useful tool for shaping U.S. educational policy.
Chapter
Full-text available
Measurement plays a vital role in the creation of markets, one that hinges on efficiencies gained via universal availability of precise and accurate information on product quantity and quality. Fulfilling the potential of these ideals requires close attention to measurement and the role of technology in science and the economy. The practical value of a strong theory of instrument calibration and metrological traceability stems from the capacity to mediate relationships in ways that align, coordinate, and integrate different firms’ expectations, investments, and capital budgeting decisions over the long term. Improvements in the measurement of reading ability exhibit patterns analogous to Moore’s Law, which has guided expectations in the micro-processor industry for almost 50 years. The state of the art in reading measurement serves as a model for generalizing the mediating role of instruments in making markets for other forms of intangible assets. These remarks provide only a preliminary sketch of the kinds of information that are both available and needed for making more efficient markets for human, social, and natural capital. Nevertheless, these initial steps project new horizons in the arts and sciences of measuring and managing intangible assets.
Chapter
Full-text available
Implicit in the idea of measurement is the concept of objectivity. When we measure the temperature using a thermometer, we assume that the measurement we obtain is not dependent on the conditions of measurement, such as which thermometer we use. Any functioning thermometer should give us the same reading of, for example, 75 °F. If one thermometer measured 40°, another 250°, and a third 150°, then the lack of objectivity would invalidate the very idea of accurately measuring temperature.
Chapter
Full-text available
This paper presents and illustrates a novel methodology, construct-specification equations, for examining the construct validity of a psychological instrument. Whereas traditional approaches have focused on the study of between-person variation on the construct, the suggested methodology emphasizes study of the relationships between item characteristics and item scores. The major thesis of the construct-specification-equation approach is that until developers of a psychological instrument understand what item characteristics are determining the item difficulties, the understanding of what is being measured is unsatisfyingly primitive. This method is illustrated with data from the Knox Cube Test which purports to be a measure of visual attention and short-term memory.
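A minimal sketch of the idea follows: regress empirical item difficulties on theory-chosen item characteristics and check how well the theory reproduces them. The three features and all numbers are invented stand-ins, loosely inspired by descriptions of Knox Cube Test tap sequences, not the authors' actual specification equation.

```python
# Sketch of a construct-specification equation: theory-based prediction of
# Rasch item difficulties from item characteristics. All values invented.
import numpy as np

# Columns: number of taps, path distance, number of reversals (assumed features).
features = np.array([[2, 3, 0],
                     [3, 4, 1],
                     [4, 6, 1],
                     [4, 7, 2],
                     [5, 9, 3]], dtype=float)
empirical = np.array([-2.1, -0.8, 0.3, 1.0, 2.2])  # item difficulties (logits)

design = np.column_stack([np.ones(len(features)), features])  # add intercept
coef, *_ = np.linalg.lstsq(design, empirical, rcond=None)     # least squares
theoretical = design @ coef                                   # theory calibrations

r = np.corrcoef(theoretical, empirical)[0, 1]
print("specification-equation coefficients:", np.round(coef, 2))
print(f"theory-empirical correlation: r = {r:.3f}")
```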
Chapter
Full-text available
The last 50 years of human and social science measurement theory and practice have witnessed a steady retreat from physical science as the canonical model. Humphry (2011) unapologetically draws on metrology and physical science analogies to reformulate the relationship between discrimination and the unit. This brief note focuses on why this reformulation is important and on how these ideas can improve measurement theory and practice.
Chapter
Full-text available
Huge resources are invested in metrology and standards in the natural sciences, engineering, and across a wide range of commercial technologies. Significant positive returns of human, social, environmental, and economic value on these investments have been sustained for decades. Proven methods for calibrating test and survey instruments in linear units are readily available, as are data- and theory-based methods for equating those instruments to a shared unit. Using these methods, metrological traceability is obtained in a variety of commercially available elementary and secondary English and Spanish language reading education programs in the U.S., Canada, Mexico, and Australia. Given established historical patterns, widespread routine reproduction of predicted text-based and instructional effects expressed in a common language and shared frame of reference may lead to significant developments in theory and practice. Opportunities for systematic implementations of teacher-driven lean thinking and continuous quality improvement methods may be of particular interest and value.
Chapter
Full-text available
In his classic paper entitled "The Unreasonable Effectiveness of Mathematics in the Natural Sciences," Eugene Wigner addresses the question of why the language of mathematics should prove so remarkably effective in the physical [natural] sciences. He marvels that "the enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and that there is no rational explanation for it." We have been similarly struck by the outsized benefits that theory-based instrument calibrations confer on the natural sciences, in contrast with the almost universal practice in the social sciences of using data to calibrate instrumentation.
Chapter
Full-text available
Several concepts from Georg Rasch's last papers are discussed. The key one is comparison because Rasch considered the method of comparison fundamental to science. From the role of comparison stems scientific inference made operational by a properly developed frame of reference producing specific objectivity. The exact specifications Rasch outlined for making comparisons are explicated from quotes, and the role of causality derived from making comparisons is also examined. Understanding causality has implications for what can and cannot be produced via Rasch measurement. His simple examples were instructive, but the implications are far reaching upon first establishing the key role of comparison.
Chapter
Full-text available
The purpose of this paper is to review some assumptions underlying the use of norm-referenced tests in educational evaluations and to provide a prospectus for research on these assumptions as well as other questions related to norm-referenced tests. Specifically, the assumptions which will be examined are (1) expressing treatment effects in a standard score metric permits aggregation of effects across grades, (2) commonly used standardized tests are sufficiently comparable to permit aggregation of results across tests, and (3) the summer loss observed in Title I projects is due to an actual loss in achievement skills and knowledge. We wish to emphasize at the outset that our intent in this paper is to raise questions and not to present a coherent set of answers.
Chapter
Rasch’s unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows which interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than real. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that testing the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity, or a conjoint intervention on both simultaneously, to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader-text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained.
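A minimal sketch of the trade-off test described above, in Python; the Rasch form is standard, but the function names and numerical values are illustrative assumptions, not the authors' code:

```python
import math

def p_success(reader_ability: float, text_complexity: float) -> float:
    """Dichotomous Rasch probability of success: depends only on the
    difference (reader - text), expressed in logits."""
    z = reader_ability - text_complexity
    return math.exp(z) / (1.0 + math.exp(z))

def expected_count(reader_ability: float, text_complexity: float, n_items: int) -> float:
    """Expected count correct on an n-item reading instrument."""
    return n_items * p_success(reader_ability, text_complexity)

# Trade-off: raising text complexity and reader ability by the same amount
# should leave the predicted count correct unchanged.
base = expected_count(reader_ability=1.2, text_complexity=0.5, n_items=40)
shifted = expected_count(reader_ability=2.0, text_complexity=1.3, n_items=40)
assert abs(base - shifted) < 1e-9  # conjoint manipulation leaves the outcome invariant
```

The causal claim is then tested empirically: intervene on text complexity (or reader ability) by a known amount and check whether observed counts track the model's predicted invariance.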
Chapter
Does the reader comprehend the text because the reader is able or because the text is easy? Localizing the cause of comprehension in either the reader or the text is fraught with contradictions. A proposed solution uses a Rasch equation to model comprehension as the difference between a reader measure and a text measure. Computing such a difference requires that reader and text be measured on a common scale. Thus, the puzzle is solved by positing a single continuum along which texts and readers can be conjointly ordered. A reader’s comprehension of a text is a function of the difference between reader ability and text readability. This solution forces recognition that generalizations about reader performance can be text-independent (reader ability) or text-dependent (comprehension). The article explores how reader ability and text readability can be measured on a single continuum, and the implications that this formulation holds for reading theory, the teaching of reading, and the testing of reading.
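In symbols (notation illustrative), with reader measure b and text measure d located on the same logit scale, the expected comprehension rate depends only on their difference:

```latex
\text{comprehension rate} \;=\; \frac{e^{\,b - d}}{1 + e^{\,b - d}}
```

Under this bare form a reader exactly matched to a text (b = d) is predicted to succeed at a 50% rate; operational frameworks typically rescale the difference so that a matched reader achieves a chosen target rate.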
Chapter
This paper describes Mapping Variables, the principal technique for planning and constructing a test or rating instrument. A variable map is also useful for interpreting results. Brief reference is made to the history of mapping and its rise to importance in psychometrics. Several maps are given to show the importance and value of mapping a variable by person and item data. The need for critical appraisal of maps is also stressed.
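A toy rendering of a variable map (often called a Wright map), assuming person and item measures have already been estimated in logits; the data and layout are invented for illustration:

```python
# Persons on the left, items on the right, both on a shared logit scale.
persons = {"P1": -1.2, "P2": 0.3, "P3": 1.8}
items = {"easy_item": -1.0, "mid_item": 0.5, "hard_item": 2.0}

for lo in range(2, -3, -1):          # logit bins, printed high to low
    hi = lo + 1
    ps = " ".join(name for name, m in persons.items() if lo <= m < hi)
    its = " ".join(name for name, m in items.items() if lo <= m < hi)
    print(f"{lo:+d} to {hi:+d} | {ps:<8} | {its}")
```

Reading down the two columns shows at a glance whether the items cover the region where the persons are located, which is exactly the kind of critical appraisal the paper stresses.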
Chapter
A construct theory is the story we tell about what it means to move up and down the scale for a variable of interest (e.g., temperature, reading ability, short-term memory). Why is it, for example, that items are ordered as they are on the item map? The story evolves as knowledge regarding the construct increases. We call both the process and the product of this evolutionary unfolding "construct definition" (Stenner et al., Journal of Educational Measurement 20:305–316, 1983). Advanced stages of construct definition are characterized by calibration equations (or specification equations) that operationalize and formalize a construct theory. These equations make point predictions about item behavior or item ensemble distributions. The more closely theoretical calibrations coincide with empirical item difficulties, the more useful the construct theory and the more interesting the story. Twenty-five years of experience in developing the Lexile Framework for Reading enable us to distinguish five stages of thinking, each characterized by an increasingly sophisticated use of substantive theory. Evidence that a construct theory and its associated technologies have reached a given stage can be found in the artifacts, instruments, and social networks realized at each level.
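A schematic of a specification equation at the calibration stage described above; the text features, coefficients, and data are hypothetical stand-ins, not the Lexile coefficients:

```python
import statistics

def theoretical_difficulty(log_word_freq: float, sentence_length: float) -> float:
    # Illustrative linear form: rarer words and longer sentences -> harder items.
    return -0.9 * log_word_freq + 0.07 * sentence_length

items = [
    # (mean log word frequency, mean sentence length, empirical difficulty in logits)
    (3.5, 12.0, -1.9),
    (2.8, 18.0, -1.0),
    (2.0, 25.0, 0.1),
    (1.4, 31.0, 1.0),
]

theory = [theoretical_difficulty(f, s) for f, s, _ in items]
empirical = [d for _, _, d in items]
r = statistics.correlation(theory, empirical)  # requires Python 3.10+
print(f"theory-empirical agreement: r = {r:.3f}")
```

The closer r approaches 1 across large item banks, the further the construct definition has advanced from story to formalized theory.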
Chapter
The process of ascribing meaning to scores produced by a measurement procedure is generally recognized as the most important task in developing an educational or psychological measure, be it an achievement test, interest inventory, or personality scale. This process, which is commonly referred to as construct validation (Cronbach, 1971; Cronbach & Meehl, 1955; ETS, 1979; Messick, 1975, 1980), involves a family of methods and procedures for assessing the degree to which a test measures a trait or theoretical construct.
Chapter
Teachers make use of these two premises to match readers to text. Knowing a lot about text is helpful because “text matters” (Hiebert, 1998). But ordering or leveling text is only half the equation. We must also assess the level of the readers. These two activities are necessary so that the right books can be matched to the right reader at the right time. When teachers achieve this match intuitively, they are rewarded with students choosing to read more.
Chapter
The International Vocabulary of Metrology (VIM) and the Guide to the Expression of Uncertainty in Measurement (GUM) shift the terms and concepts of measurement information quality away from an Error Approach toward a model-based Uncertainty Approach. An analogous shift has taken place in psychometrics, with the decreasing use of True Score Theory and increasing attention to probabilistic models for unidimensional measurement. These corresponding shifts emerge from shared roots in cognitive processes common across the sciences, and they point toward new opportunities for an art and science of living complex adaptive systems. The psychology of model-based reasoning sets the stage not just for a new consensus on measurement and uncertainty, and not just for a new valuation of the scientific status of psychology and the social sciences, but for an appreciation of how to harness the energy of self-organizing processes in ways that harmonize human relationships.
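The computational core of the GUM's Uncertainty Approach can be summarized in two standard formulas (standard GUM material, stated here for orientation rather than quoted from the chapter): for a measurand y = f(x_1, ..., x_N) with uncorrelated inputs, to first order,

```latex
u_c(y) = \sqrt{\sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)},
\qquad U = k \, u_c(y)
```

and the result is reported as y ± U with a stated coverage factor k (commonly k = 2), rather than as a true value contaminated by error.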