Science topic

History and Philosophy of Science - Science topic

Explore the latest questions and answers in History and Philosophy of Science, and find History and Philosophy of Science experts.
Questions related to History and Philosophy of Science
  • asked a question related to History and Philosophy of Science
Question
15 answers
We are looking for volunteer translators who could translate 19th century German texts to English.
- Do you read 19th century Fraktur German?
- Your task would be translating 1,000 to 16,000 word texts from German to English.
If you are interested please send me a message.
Relevant answer
Answer
Yes. Please keep me updated on the project.
  • asked a question related to History and Philosophy of Science
Question
19 answers
One of the central themes in the philosophy of formal sciences (or mathematics) is the debate between realism (sometimes misnamed Platonism) and nominalism (also called "anti-realism"), which has different versions.
In my opinion, what is decisive in this regard is the position adopted on the question of whether objects postulated by the theories of the formal sciences (such as the arithmetic of natural numbers) have some mode of existence independently of the language that we humans use to refer to them; that is, independently of linguistic representations and theories. The affirmative answer assumes that things like numbers or the golden ratio are genuine discoveries, while the negative one understands that numbers are not discoveries but human inventions, they are not entities but mere referents of a language whose postulation has been useful for various purposes.
However, I cannot see how an anti-realist or nominalist position can respond to these two realist arguments in the philosophy of mathematics. First, if numbers have no existence independently of language, how can one explain the metaphysical difference, which we call numerical, at a time before the existence of humans, when at t0 there was in a certain space-time region what we call two dinosaurs and then at t1 what we call three dinosaurs? That seems to be a real metaphysical difference in the sense in which we use the word "numerical", and it does not even require human language, which suggests that number, quantity, etc. are included in the very idea of an individual entity.
Secondly, if the so-called golden ratio (also represented as the golden number and related to the Fibonacci sequence) is a human invention, how can it be explained that this ratio appears in various manifestations of nature, such as the shells of certain mollusks, the florets of sunflowers, waves, the structure of galaxies, the spiral of DNA, etc.? That seems to be a discovery and not an invention, a genuine mathematical discovery. And if it is, it seems to be something like a universal of which those examples are particular cases, perhaps in a Platonic-like sense, which suggests that mathematical entities express characteristics of the spatio-temporal world. However, this form of mathematical realism does not seem compatible with the version that maintains that the entities mathematical theories talk about exist outside of spacetime. That is to say, if mathematical objects bear to physical and natural objects the relationship that the golden ratio bears to the examples mentioned, then it seems that there must be a true geometry and that, ultimately, mathematical entities are not as far outside space-time as has been suggested. After all, not everything that exists in spacetime has to be material, as the social sciences, which refer to norms, values, and attitudes, well know. (I apologize for using a translator. Thank you.)
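As a side note on the mathematics invoked here, the link between the Fibonacci sequence and the golden ratio φ = (1 + √5)/2 can be checked numerically; a minimal sketch in Python:

```python
# Ratios of consecutive Fibonacci numbers converge to the golden ratio
# phi = (1 + sqrt(5)) / 2 ~ 1.6180339887...
import math

phi = (1 + math.sqrt(5)) / 2

a, b = 1, 1
for _ in range(40):          # 40 steps is far more than enough for double precision
    a, b = b, a + b

ratio = b / a                # F(n+1) / F(n)
print(ratio, phi)            # both ~ 1.618033988749895
```

Whether this convergence counts as a discovery about the world or a feature of our notation is, of course, exactly the question under discussion.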
Relevant answer
Answer
Indeed, that is a possibility. Perhaps what we call numbers are labels in a language, a kind of name that does not really name anything literally beyond human language and representations, or a way of referring to the systems, scales, etc. of which they are a part, mere nodes of a conceptual structure. Some authors have argued that numbers are only signs, signs that are part of representational and notational systems that have proven to be effective, useful instruments to be applied to parts of reality, and which are improved and refined over time. However, I believe it is necessary to take into account the fact that not every system, model, or scale works, and this perhaps reveals that there are structural characteristics of the reality to which they are applied that impose limits, that constrain what can work and what cannot; and this perhaps means that, although they do not literally describe abstract entities (numbers or geometric figures, for example) as we imagine them, mathematical systems and theories somehow express that which lies beyond the representations themselves. You can't use just any geometry to build a house or to explain why Mercury "wobbles" when it's at perihelion, and that suggests that mathematical systems, mathematized theories, and models are human creations but cannot be totally arbitrary, so that, even in a metaphorical or indirect way, it should not be ruled out that they represent structural characteristics of the world to which they are applied, a world that lies beyond human constructs.
We must not forget that we humans perceive in three dimensions, that our hearing is poorer than a dog's, that we believe colors are in things, and that, to an important extent, we elaborate our theories and build our image of the world accordingly ("man is the measure of all things," said Protagoras). But there seems to be more and more evidence that at least the macroscopic physical world is not three-dimensional, so we may never really know what lies beyond us and our representations; and to that mystery we must add that of why some mathematical models and theories work and others do not. Greetings.
  • asked a question related to History and Philosophy of Science
Question
3 answers
In quantum mechanics we have learned the famous uncertainty principle, which is probably the most important result in this branch.
We have also learned that in general relativity space and time are bound together as spacetime.
The measurement problem in QM stems from the uncertainty principle, and vice versa. Why is nothing analogous present in GR, even in a different form?
Thanks
Relevant answer
Answer
The answer of Mark Kristian van der Pals is, in my opinion, correct (Planck's constant and Heisenberg's uncertainty principle are like two sides of the same coin).
The quantization of energy is not in line with the theoretical properties of the electromagnetic field (the electric field and the corresponding magnetic field), because the electric field proves to be a topological field, responsible for the wave-like nature of phenomena in the microcosm, while the magnetic field is a pure vector field. The topological deformation of a field structure is only possible within a continuum.
The consequence is that the quantum of energy and Heisenberg’s uncertainty principle are not basic properties of space itself, but induced properties (caused by the basic properties).
With kind regards, Sydney
  • asked a question related to History and Philosophy of Science
Question
5 answers
Is there a historical map of academic disciplines? What is the trend in how academic disciplines change (in number, nature, and label)?
I would be thankful if someone could recommend an article, book, handbook, or report about the historical map of disciplines and the history of academic disciplines.
Relevant answer
Answer
Hello,
Yes, a historical mapping of academic disciplines now exists: the "Interactive Historical Atlas of the Disciplines". This website, recently launched at the University of Geneva, is available in open access here:
It is an interactive atlas containing a collection of more than 200 disciplinary maps ("classifications of the sciences" or "knowledge maps") from Antiquity to our time, with thousands of historical definitions of academic disciplines extracted from the sources. Moreover, it includes several analysis tools (a timeline, statistical tools able to chart the evolution of a discipline, an iconographic database, advanced search filters). For instance, it is possible to display chronologically a list of historical definitions of an academic discipline in order to study the evolution of its identity over time. The aim of this project is to map the evolution of disciplinary borders throughout the centuries and to reconstruct the genealogy of the sciences.
As for a "typology" of the various types of disciplinary systems (namely, the different taxonomic systems underlying these historical "classifications of the sciences"), you could find some insights in my recent paper:
Sandoz, Raphaël (2021), "Thematic Reclassifications and Emerging Sciences", Journal for General Philosophy of Science 52(1), pp. 63–85, available here: https://doi.org/10.1007/s10838-020-09526-2.
Best regards
  • asked a question related to History and Philosophy of Science
Question
29 answers
What kinds of scientific research dominate the field of the philosophy of science and research?
Please provide your suggestions for a question, problem, or research thesis within the issues of the philosophy of science and research.
Please reply.
I invite you to the discussion
Thank you very much
Best wishes
Relevant answer
Answer
Problem of what counts as a good scientific explanation... Salmon, W. C. (1984). Scientific explanation and the causal structure of the world. Princeton University Press.
  • asked a question related to History and Philosophy of Science
Question
219 answers
1) There is a tradition in the philosophy of mathematics starting in the late 19th century and culminating in the crisis of foundations at the beginning of the 20th century. Names here are Zermelo, Frege, Whitehead and Russell, Cantor, Brouwer, Hilbert, Gödel, Cavaillès, and some more. At that time mathematics was already focused on itself, separated from general rationalist philosophy and epistemology, from a philosophy of the cosmos and the spirit.
2) Stepping backwards in time we have the great “rationalist” philosophers of the 17th, 18th, 19th century: Descartes, Leibniz, Malebranche, Spinoza, Hegel proposing a global view of the universe in which the subject, trying to understand his situation, is immersed.
3) Still making a big step backwards in time, we have the philosophers of the late antiquity and the beginning of our era (Greek philosophy, Neoplatonist schools, oriental philosophies). These should not be left out from our considerations.
4) Returning to the late 20th century, we see the foundation, inside mathematics, of category theory (Eilenberg, Lawvere, Grothendieck, Mac Lane, …), which is in some sense a transversal theory within mathematics. Among its basic principles are the notions of object, arrow, and functor, on which are then founded adjunctions, (co-)limits, monads, and more evolved concepts.
Do you think these principles have their signification a) for science b) the rationalist philosophies we described before, and ultimately c) for more general philosophies of the cosmos?
Examples: the existence of an adjunction between two functors could have a meaning in physics, for example. The existence of a natural-numbers object, known from topos theory, could have philosophical consequences (cf. Immanuel Kant, Antinomien der reinen Vernunft).
Relevant answer
Answer
There is a view that if mathematical categories are kinds of mathematical structure, then what is important mathematically are the functors from one category to another, because they provide a neat way of discovering a new property in one category by translating proofs from another category. This is a way of formalising reasoning by "analogy". Personally, I find reasoning about categories as abstract algebras difficult and unintuitive, and find it much easier to look at a concrete realisation of a category than to consider a category with a list of pre-defined desirable properties; but I recognise that this is a matter of learning preference.
  • asked a question related to History and Philosophy of Science
Question
35 answers
Is "Quantization of Time" theory possible ?
According to science, time is a physical parameter, but according to philosophy it is an illusion. How can we define time? Can we quantize illusions?
Relevant answer
Answer
What about George Musser's understanding? I will not go into detail, as it would take too long.
When physicists applied a procedure called canonical quantization, which had worked for the theory of electromagnetism, to the theory of relativity, it produced the Wheeler-DeWitt equation, which has no time variable. This equation predicts that the universe is frozen.
Some think that this challenges the observer principle. Two observers will have different perceptions of spacetime, perceived geometrically, concerning who is moving and which forces are acting. Normally, logically, whatever the shapes, they should be physically equivalent.
This then involves substantivalism, that is, space and time existing independently of stars, etc. Or are time and space artificial ways of talking about relations, i.e. relationism (the illusion theory)? Apparently, Einstein gave thought to an independent spacetime, which I and others here have also promulgated. Einstein rejected this because, like quantum theory, it contains an element of randomness; an independent spacetime does not fit the deterministic approach that Einstein approved of, so he abandoned it.
So relationism sees spacetime, but really time, as an illusion.
George Musser: Spooky Action at a Distance (2015).
  • asked a question related to History and Philosophy of Science
Question
164 answers
Hawking's Legacy
Black hole thermodynamics and the Zeroth Law [1,2].
(a) black hole temperature: T_H = hc³/16π²GkM
The LHS is intensive but the RHS is not intensive; therefore a violation of thermodynamics [1,2].
(b) black hole entropy: S = πkc³A/2hG
The LHS is extensive but the RHS is neither intensive nor extensive; therefore a violation of thermodynamics [1,2].
(c) Black holes do not exist [1-3].
Hawking leaves nothing of value to science.
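For scale, whatever one makes of the intensive/extensive objection, the quoted temperature expression can be evaluated numerically; a minimal sketch in Python (CODATA constants, a solar mass assumed for M):

```python
# Evaluate the quoted Hawking temperature for a solar-mass black hole.
# Written with Planck's constant h, as in the text:
#   T_H = h c^3 / (16 pi^2 G k M)   (equivalent to hbar c^3 / (8 pi G k M))
import math

h = 6.62607015e-34      # Planck constant, J s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
k = 1.380649e-23        # Boltzmann constant, J/K
M = 1.989e30            # solar mass, kg

T_H = h * c**3 / (16 * math.pi**2 * G * k * M)
print(f"T_H ~ {T_H:.2e} K")   # ~ 6e-8 K, far below the CMB temperature
```

The tiny value explains why Hawking radiation from astrophysical black holes has never been observed directly, whatever one's position on the theory.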
REFERENCES
[1] Robitaille, P.-M., Hawking Radiation: A Violation of the Zeroth Law of Thermodynamics, American Physical Society (ABSTRACT), March, 2018, http://meetings.aps.org/Meeting/NES18/Session/D01.3
[2] Robitaille, P.-M., Hawking Radiation: A Violation of the Zeroth Law of Thermodynamics, American Physical Society (SLIDE PRESENTATION), March, 2018, http://vixra.org/pdf/1803.0264v1.pdf
[3] Crothers, S.J., A Critical Analysis of LIGO's Recent Detection of Gravitational Waves Caused by Merging Black Holes, Hadronic Journal, n.3, Vol. 39, 2016, pp.271-302, http://vixra.org/pdf/1603.0127v5.pdf
Relevant answer
Answer
Well, to put it on a more concrete foundation, here's my view on his scientific achievement, not exhaustive, as I don't think I am entitled to judge on Hawking's scientific legacy.
His works on black hole theory date from about 50 years ago, and I would consider the singularity theorems he proved together with Roger Penrose quite the highlight of his scientific career. In a nutshell, what they say is that black hole formation takes place under very general conditions in spacetime and is a necessary consequence of general relativity, and does not require very special, e.g. highly symmetric, conditions.
With his work on Hawking radiation from the mid-70s he applied semiclassical analysis to general relativity, which paved the way to a more thorough treatment of quantum field theory on curved spacetime.
Although his scientific highlights may date back to the 60s and 70s, I would nevertheless stress that his legacy surely comprises all that he did as an ambassador for science. He surely was someone who gave inspiration to at least a complete generation of scientists, his publicity starting to spread with the little book he wrote at the end of the 80s: "A Brief History of Time". I would never underestimate the importance of lighthouse figures like him in this regard, even though his hard-core scientific prime had by then already passed.
  • asked a question related to History and Philosophy of Science
Question
27 answers
I have posted a comment on André Orléan's "open letter" to the French Minister of Education (see the first answer below, from me). The letter and the background comments explain what is happening in France in the field of economics education. In the comment, I mentioned what had happened in Japan. An e-mail I received this morning tells me that a similar dispute is being repeated at University College London.
At the bottom of all these arguments lies the problem of how to interpret the status of neoclassical economics. Neoclassical economics now occupies a mainstream position and is trying to monopolize economics education and academic posts, whereas various heterodox economists are resisting this current, claiming the necessity of pluralism in economics education and research.
I have mentioned the cases of three countries. There must be many similar stories in almost all countries. It would be wonderful if we could learn what is happening elsewhere. So my question is:
What is happening in your country?
Relevant answer
Answer
The following is a comment I have added to André Orléan's newly uploaded article: Madame Najat VALLAUD-BELKACEM, Souhaitez-vous vraiment la fin du pluralisme en économie ? (Madame Najat Vallaud-Belkacem, do you really want the end of pluralism in economics?)
********************************************
This is an important letter. If this is an open letter, I wonder why this letter is closed to a third party. At least, André Orléan should indicate where we can get this important petition. I have read this Open Letter at
This is a petition from André Orléan, as President of the AFEP (Association Française d'Économie Politique), to the French Minister of Education, Mme Najat Vallaud-Belkacem, asking her to preserve (or create) pluralism in economics education and research. Almost all countries are facing a strong movement that claims to "modernize", "standardize", or "make more efficient" economics education, and thus to establish a more unified and uniform economics in education and research.
The origin of this movement may be traced back to various reasons. Each nation must have its own history.
In the case of Japan, a controversy started when the Japan Science Council established the Working Group of the Economics Section for Preparing the Standard References for Economics Education at the end of 2012. A first draft of the Standard References was made public in June 2013 and ignited a feverish argument among economists. The first draft was written entirely from the neoclassical point of view. It cited Lionel Robbins's famous definition of economics: "Economics is the science which studies human behaviour as a relationship between ends and scarce means which have alternative uses." (Robbins, 1932, An Essay on the Nature and Significance of Economic Science, p. 15).
Many academic associations, including the Japan Association of Political Economy (JAPE), the Japan Association for Evolutionary Economics (JAFEE), the Japan Association for Economics Education (JAEE), the Socio-Economic History Society Japan (SEHSJ), the Japanese Society for the History of Economic Thought (JSHET), and 7 other associations, expressed their concerns and objected to the first draft. An open symposium to discuss the question was organized in December 2014. The final version of the Standard References was decided and made public in August 2015. In my understanding, the working group members conceded much, and its expressions became more ambiguous and obscure, but what they aimed at did not change much. A book, The Future of Economics and Economics Education (in Japanese), was published in April 2015, containing a summary of the controversy and papers presenting various opinions from the opposing side. I also contributed a chapter. The main point of dispute was how to evaluate plurality in economics education.
The case of France has a totally different history. Already in 2000, when the French Ministry of Education tried to set up a more uniform system of economics education in universities, a strong movement of economics students erupted, and many students asked for a more diversified education. Aided by researchers, they formed a movement they named Post-Autistic Economics. It was later reorganized as Real-World Economics and is now one of the active centers of heterodox economics. In 2009, a group of social scientists, including not only economists but also sociologists, organized the AFEP. This group asked the Ministry of Education to admit a new section, named Economics and Society, which was to be the 78th section of the National University Committee (Conseil National des Universités). To understand the whole history it is necessary to know the French university system, which is highly centralized and very different from that of countries where university autonomy is much more established.
See for the background a document by AFEP and an article in a newspaper Libération:
Some contextual and background information relevant to understanding the issue
Battle for Influence among French Economists (in French) by Frantz Durupt, February 1, 2015.
The Ministry was at one point inclined to approve the creation of the new section. At that time, Jean Tirole, the winner of the 2014 Nobel prize in economics, tried to hinder the creation of the new section in a letter to the Minister, arguing that the selection of university researchers and professors should be made according to international standards.
See Jean Tirole's letter to the Minister
It took the form of a thank-you letter for the Minister's presence at the Nobel Prize award ceremony. Tirole expressed his concern with regard to the creation of the new section that the AFEP had worked toward for years. Tirole argued that the community of economic scientists should have a world standard based on international reputation. He opposed the French government admitting two different communities within the same discipline called economics. He even denounced AFEP economists as obscurantists. It was an explicit objection to the creation of the new section and to the pluralist strategy in economics. André Orléan's paper is one of the responses to Tirole's intervention.
I am not sure if it is good to create a new section, Economics and Society, within or alongside economics, because it may help mainstream economics keep its status quo. But it seems many French economists think this is the only way to rescue heterodox economics from the actual dominance of the mainstream. The AFEP cites a statistical finding: only 6 of the 120 professors appointed between 2005 and 2011 were affiliated with minority schools of thought. The crisis of economics is much more acute in France than in many other countries.
As I have put it above, the academic situation of economics differs greatly from country to country, but in essence we face a common situation. What is the best way to design economics education in undergraduate and graduate courses? This is a problem all economists must think about. Supporters of pluralism and heterodox economics should explain why their claim is justified and necessary, rather than a mere expression of their desire to keep their posts. Those economists who believe in the future of mainstream economics should show that, despite all the condemnations of their economics, it is not only sane and right but also the unique way to the future development of economics, and that pluralism is only an unnecessary waste of intellectual resources.
This is not only an academic dispute inside the ivory tower but an actual problem that will influence the future of our economy, because economic policy is strongly influenced by the state of economic science and thought. In this sense, this is a problem that all policy makers, and even ordinary people in the street, should be concerned with.
To think about this question requires a deep understanding of economic science and the real history of the development of economics. On this point, I want to point to @David Ellerman's paper: Parallel Experimentation.
The main subject of this paper is not economics, but I believe it gives us a necessary framework for considering how a science like economics evolves. It gives us persuasive reasoning for why we need pluralism in a complex science such as economics.
*******************************************
  • asked a question related to History and Philosophy of Science
Question
65 answers
Why or why not?
Some philosophers maintain that science is morally neutral, while other philosophers maintain that science produces morality.
Relevant answer
Answer
Absolutely! I just finished opining that an occasional glass of wine is actually beneficial, as opposed to drinking alcohol being considered "a sin." And the reason for making such a bold statement is scientific evidence. As in the link attached here.
I would argue that nothing that provides health benefits can be considered "immoral." Abuse of anything, on the other hand, is detrimental to health, so a culture may be justified in making such activity "immoral" in their code of ethics.
Plenty of references in the New Testament of "non-sinful" wine consumption, and science can explain why. And no, gimme a break, that wasn't grape juice!
  • asked a question related to History and Philosophy of Science
Question
6 answers
Or at least use the sentence waves above waves. If you can provide the source that would be great.
Relevant answer
Answer
1) Not exactly an internal wave, but the two-directional current system in the Bosphorus has been known for centuries. Surface waters flow from the Black Sea to the Sea of Marmara, and bottom (much more saline) waters flow from the Marmara to the Black Sea at the same time. Fishermen wanting a "free ride" to the Black Sea against the surface current would lower their nets to catch the lower flow.
2) Fridtjof Nansen had a research ship named the "Fram", which may be the one that Dennis Mazur is referring to above.
  • asked a question related to History and Philosophy of Science
Question
23 answers
Einstein’s geometrodynamics considers a 4-D spacetime geometry whose curvature is governed by mass. But the FLRW universe considers a 3-D space of curvature k (positive, zero, or negative) with time as an orthogonal coordinate. Hence it seems that standard cosmology, based on the FLRW spacetime, has strayed from the stated essence of general relativity.
Relevant answer
Answer
Of course not, and the statement made about the FLRW metric is incorrect: it describes a Lorentzian manifold, not just a three-dimensional manifold. And the spacetime geometry isn't defined just by the spatial integral of the time-time component of the energy-momentum tensor, but by all of its components.
The FLRW metric is written using a particular choice of coordinates, that's all. What is of relevance isn't the metric, which transforms in a particular way under general coordinate transformations, but quantities that are invariant under such transformations; it can be shown that such quantities exist. All this is well known, described in all textbooks and courses on general relativity, and doesn't have anything to do with the physical applications of the FLRW metric as a particular solution of Einstein's equations. It's the other way around: since it can be shown that it is, indeed, a well-defined solution of these equations, i.e. that it satisfies the equations of motion and the constraints, it makes sense to study its implications for physics, in this case cosmology. So it would be useful to study a textbook on general relativity.
And the cosmology in question is classical, not quantum.
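For reference, the line element of the FLRW metric in comoving coordinates makes this point explicit: it is a metric on a four-dimensional Lorentzian manifold, with the 3-space of curvature k appearing only as the spatial slice at fixed t (standard form, c kept explicit):

```latex
ds^{2} = -c^{2}\,dt^{2} + a(t)^{2}\left[\frac{dr^{2}}{1-kr^{2}} + r^{2}\left(d\theta^{2} + \sin^{2}\theta\,d\phi^{2}\right)\right]
```

The scale factor a(t) couples the spatial part to time, so the geometry is genuinely four-dimensional, not a 3-D space with time bolted on.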
  • asked a question related to History and Philosophy of Science
Question
2 answers
Between the end of the 19th century and the beginning of the 20th, there was a French teacher in Macau who was teaching art at the Academy of Natural Science (格致书院). Does anybody know who it was?
Thanks a lot!
Relevant answer
Answer
No record, it seems. Only a Finnish-American actress named Mai La appears in the references.
  • asked a question related to History and Philosophy of Science
Question
63 answers
Schrödinger self adjoint operator H is crucial for the current quantum model of the hydrogen atom. It essentially specifies the stationary states and energies. Then there is Schrödinger unitary evolution equation that tells how states change with time. In this evolution equation the same operator H appears. Thus, H provides the "motionless" states, H gives the energies of these motionless states, and H is inserted in a unitary law of movement.
But this unitary evolution fails to explain or predict the physical transitions that occur between stationary states. Therefore, to fill the gap, the probabilistic interpretation of states was introduced. We then have two very different evolution laws. One is the deterministic unitary equation, and the other consists of random jumps between stationary states. The jumps openly violate the unitary evolution, and the unitary evolution does not allow the jumps. But both are simultaneously accepted by Quantism, creating a most uncomfortable state of affairs.
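The incompatibility described here can be illustrated numerically: under the unitary evolution U(t) = exp(-iHt), a stationary state only acquires a phase, so its occupation probability never changes and no jump can occur. A minimal sketch in Python (an arbitrary two-level Hermitian H assumed, ħ = 1):

```python
# Under U(t) = exp(-iHt), an eigenstate of H picks up only a phase:
# its occupation probabilities are constant in time, so strictly
# unitary dynamics never produces a "jump" between levels.
import numpy as np

H = np.array([[1.0, 0.3],
              [0.3, 2.0]])          # arbitrary Hermitian Hamiltonian (hbar = 1)

energies, vecs = np.linalg.eigh(H)
psi0 = vecs[:, 0]                   # start in an eigenstate (the "ground state")

def evolve(psi, t):
    """Apply U(t) = exp(-iHt) via the spectral decomposition of H."""
    coeffs = vecs.conj().T @ psi            # expand in the eigenbasis
    return vecs @ (np.exp(-1j * energies * t) * coeffs)

for t in (0.0, 1.0, 5.0, 25.0):
    p = abs(np.vdot(psi0, evolve(psi0, t)))**2   # survival probability
    print(f"t = {t:5.1f}:  P(stay in initial state) = {p:.12f}")  # always 1
```

Any transition between levels therefore has to be put in by hand, which is exactly the gap the probabilistic interpretation was introduced to fill.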
And what if the quantum evolution equation is plainly wrong? Perhaps there are alternative manners to use H.
Imagine a model, or theory, where the stationary states and energies remain the very ones specified by H, but with a continuous evolution different from the unitary one, in which an initial stationary state evolves deterministically into a final stationary state, with energy being continuously absorbed and radiated between the stationary energy levels. In this natural theory there is no use, nor need, for a probabilistic interpretation. The natural model for hydrogen, comprising a space of states, an energy observable, and an evolution equation, is explained in
My question is: with this natural theory of atoms already elaborated, what are the chances of its acceptance by mainstream physics?
Professional scientists, in particular physicists and chemists, are well versed in the history of science, and modern communication hastens the diffusion of knowledge. Nevertheless, important scientific changes seem to require a lengthy process, including the disappearance of most leaders, as Max Planck noted: "They are not convinced, they die."
Scientists seem particularly conservative and incapable of admitting that their viewpoints are mistaken, as was the case time ago with flat Earth, Geocentrism, phlogiston, and other scientific misconceptions.
Relevant answer
Answer
Hello Enders
You state that "According to Schrödinger 1926, there are no quantum jumps." Please allow me the following comments.
A set of articles by various authors are collected in a book edited by Wolfgang Pauli
Pauli, W. (ed.) - Niels Bohr and the Development of Physics. Pergamon Press, London. 1955.
Among the articles there is one by Werner Heisenberg
The Development of the Interpretation of the Quantum Theory
The following lines can be found in the article (page 14 of the book)
At the invitation of Bohr, Schrodinger visited Copenhagen in September, 1926, to lecture on wave mechanics. Long discussions, lasting several days, then took place concerning the foundations of quantum theory, in which Schrodinger was able to give a convincing picture of the new simple ideas of wave mechanics, while Bohr explained to him that not even Planck's Law could be understood without the quantum jumps. Schrodinger finally exclaimed in despair:
"If we are going to stick to this damned quantum-jumping [verdammte Quantenspringerei], then I regret that I ever had anything to do with quantum theory,"
to which Bohr replied:
"But the rest of us are thankful that you did, because you have contributed so much to the clarification of the quantum theory."
Maybe the above paragraph is the ultimate source of your statement.
The displeasure shown by Schrödinger has a different interpretation. It may mean that he understood quantum jumps, that he had a clear picture of the reach of the Schrödinger time-dependent equation (STDE), and in particular that the STDE contradicted quantum jumps. He therefore knew that something very fundamental was missing from his elegant STDE. Nowhere did he say anything equivalent to "quantum jumps do not exist". He was annoyed at having to accept the existence and crucial phenomenological role of quantum jumps in the description of the basic atomic phenomena of absorption and radiation.
If you have a different historical source to justify your interpretation, please share the reference with us, as it would be extremely interesting.
With most cordial regards,
Daniel Crespin
  • asked a question related to History and Philosophy of Science
Question
25 answers
[I had heard of the Know-Nothing Party, but apparently the internet tells me that this was a disclaimer used by members of what became the American Party, which was anti-immigration in the mid-nineteenth century. That is another area of discussion, though its proponents today may often fall into the category discussed here as well; still, it is a bit out of scope for this discussion.]
For historians and other history buffs out there, and those interested in current events: what do you see as the path that has been taken to arrive at the popular anti-intellectual, anti-science views in politics? The rejection by some members of the US House of Representatives of corrections to (US) census undercounts (the rejection of sampling statistics) comes to mind, in addition to the usual comments on climate change.
And are there any similar anti-intellectualism movements to be found in history anywhere in the world, including ancient history, which anyone would care to share?   Can you draw any parallels? 
Reasoned comments and historical evidence are requested.  I do not intend to make further comments but instead wish to hear what applicable history lessons you may find interesting regarding this topic.
Thank you. 
Relevant answer
Answer
Such is the cost of development in highly industrialized countries: the lowering of the average level of public education and ethical standards, a consumerist and selfish lifestyle, the alienation of individuals and social groups, public hypocrisy, the callousness of society, "the rat race", and the powerful role of money in politics and everyday life.
  • asked a question related to History and Philosophy of Science
Question
126 answers
Does experimental science have limits as a discipline? Much current knowledge is not the consequence of repeatable experiments. Regarding the sources of science, are they limited to experimentation? Can other disciplines, such as history, unique experiences, philosophy, etc., be more important for man?
Relevant answer
Answer
The theory of general relativity was inspired by pure imagination (influenced by philosophy), followed by mathematical formulation and then experimental validation.
  • asked a question related to History and Philosophy of Science
Question
6 answers
In my studies many years ago, I came across the very influential thinker Alexander Bain. Most of his ideas are obsolete today, I know, but he was still an extremely influential person. I skimmed through his autobiography once, but I could not find any study of him by a modern scholar that would place him in historical perspective. I thought this was odd, considering who he was.
Does anyone know if there are any standard works on Bain? Nothing turned up on Amazon.
Relevant answer
Answer
Bain is a central figure in the following work, but I think it focuses on his work and not his life:
Rylance, Rick. Victorian Psychology and British Culture, 1850-1880. Oxford; New York: Oxford University Press, 2000.
  • asked a question related to History and Philosophy of Science
Question
16 answers
Language, as an expression of the various 'knowledge' is subject to continuous transformations. I’d like to focus in particular on one of them in the field of scientific research.
As science cannot critically verify its own assumptions, it falls to history, epistemology, philosophy and the analysis of language to deepen the horizons of pre-understanding of each scientific proposition. In particular, this is the understanding of a reality based on the assumptions and tradition of antecedent interpretations, which precedes the direct experience of reality itself.
Popper was very attentive to the instrumental aspect of science (and therefore also to language), interested not in things in themselves but in their aspects verifiable through measurement. He therefore warned against interpreting theories as descriptions, beyond the use of their results in practical applications. He recalled that, as "knowledge", science is nothing but a set of conjectures, highly informative guesses about the world, which, although not verifiable (i.e., such that it is possible to demonstrate their truth), can be subjected to strict critical tests.
This is evident from various texts; Popper emphasized these ideas in The Logic of Scientific Discovery: "Science is not a system of certain, well-established assertions, nor is it a system that progresses steadily towards a definitive state. Our science is not knowledge (episteme): it can never claim to have reached the truth, not even a substitute for the truth, such as probability...."
We do not know; we can only guess. Our attempts at conjecture are guided by the unscientific, metaphysical belief in laws, in the regularities that we can uncover, discover.
This kind of approach is not exempt from ethical questions, because the operation has fluid boundaries. These borders can be crossed, leading to the possibility of manipulation and abuse of power against the very identity and autonomy of the persons involved.
Like Bacon, we could describe our contemporary science, the method of reasoning that men today routinely apply to nature, as consisting of hasty and premature anticipations, and of prejudices. But, once advanced, none of our anticipations is dogmatically upheld. Our method of research is not to defend them in order to prove how right we were; on the contrary, we try to overthrow them, using all the tools of our logical, mathematical and technical 'baggage'.
Hence the maximal caution: "The old scientific ideal of episteme, of absolutely certain and demonstrable knowledge, has proved to be an idol.
The demand for scientific objectivity makes it inevitable that every assertion of science remains tentative for ever. The wrong view of science betrays itself in the craving to be right; for it is not the possession of knowledge, of irrefutable truth, that makes the man of science, but his persistent and recklessly critical quest for truth."
[In this regard I consulted the following texts: H. R. Schlette, Philosophie, Theologie, Ideologies. Erläuterung der Differenzen, Cologne, 1968 (Italian transl c / o Morcelliana, Brescia, 1970, pp. 56, 78); G. Gismondi, The critique of ideology in the science foundation's speech, in "Relata Technica", 4 (1972), 145-156; Id., Criticism and ethics in scientific research, Marietti, Torino, 1978].
Hermeneutics, applied to language, to human action and to ethics, then allows us to articulate text and action. An action may be narrated because it is human life itself that deserves to be narrated; it presents possible narrative paths that the individual highlights while excluding others. Story and action also confirm the intersubjective dimension of human beings. Narration fully presents the three moments of ethical reflection: describing, telling and prescribing.
Relevant answer
Answer
I just put my book up on Amazon: Give Space My Love: An Intellectual Odyssey with Dr. Stephen Hawking. The brief book description is below.
If any of you would like a complimentary copy of the book, just send me an email with your physical address: bristol at isepp.org
Per your starting question, it focuses on science and how we talk about science. My background is philosophy of science: Popper, Lakatos and Feyerabend. The last was my honors advisor at Berkeley.
The central narrative tension uses Dewey's distinction between the Spectator and the Participant representations of inquiry (and of the place of inquiry in the universe). Quantum mechanics and relativity both force us toward a Participant framework. I argue that there were two paths to complementarity (and to the limits of the scientific research program) in the 20th century: one lies in the new physics, and the other comes from Popper's question about the falsifiability of all meaningful theories.
Personally, I have transitioned from philosophy of science to philosophy of engineering, 'a new name for an old way of thinking' (viz. James's remark about pragmatism). I have a couple of Linus Pauling Memorial Lectures on YouTube if you are curious about where all this goes beyond the book.
Freewill and the Engineering Worldview
Bristol May 3rd, 2013  http://youtu.be/kZjJukntqHM
Life Ascendant: A Post-Darwinian Worldview
Bristol May 21st  2014  http://youtu.be/i2mwhk-6a3A
What is Engineering? What is the Value Context of Engineering?
Bristol July 30th 2015 (China) https://www.youtube.com/watch?v=vc1lI8Ox7qM
BOOK DESCRIPTION:
Who is the real Dr. Stephen Hawking? Is he a detached Spectator seeking a mathematical description of a deterministic, objective reality – ‘out there’? Or is he an embodied Participant in the universe seeking to bring about a more desirable future? The timeline of the book is a four-city lecture tour the author organized for Hawking in the early 1990s (Portland, Eugene, Seattle and Vancouver BC). Hawking’s powerful meetings with students with disabilities, officially collateral events, were remarkable. However, the greater significance of these ‘stories of the road’ is better appreciated in the context of the central narrative question of the book: the nature of the universe and our place/role in it.
The author, a philosopher of science (Berkeley, London), engages Hawking, his graduate assistants and eventually his nurses in what starts as a critical review of the ‘new physics’ of Einstein, Bohr and Heisenberg. The question of the limits of classical science expands to questions of the limits of all supposedly objectivist, ‘one right answer’ ideologies – in biological, socio-economic, and political realms. Is everyone ‘really’ selfish? Is the world objectively competitive or cooperative? In a parallel critical review of the ‘new philosophy of science’ the contributions of the author’s mentors, Feyerabend, Lakatos, Kuhn and Popper mark a parallel path to complementarity, undermining the Spectator representation of detached ‘objective’ inquiry.
Through his personal interactions Hawking reveals himself as a Participant, concerned with ‘how we should live’. He steers us toward a more desirable, moral future.
The new post-scientific Participant understanding of the universe requires a paradigm shift to a More General Theory that can both explain the successes of science and yet understand them in a new way, one in which our embodied Participant inquiry re-unifies the sciences and the humanities.
  • asked a question related to History and Philosophy of Science
Question
4 answers
In his 1963 book Little Science, Big Science, Derek de Solla Price shows science as a whole to have been growing exponentially for 400 years. He hypothesizes this to be the first part of a logistic curve. If his predictions were right, the growth of science should have started to decline by now. Are there recent measurements that can be compared to his 1963 estimates? And... was he right?
Relevant answer
Answer
As differentiation is vital for growth, even in a complex system such as the global scientific web, there is evidence of convergence spanning the period 1993-2008, which seems to suggest that stagnation is underway, globally and in some science portfolios, if we assume a logistic growth function.
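Price's hypothesis is easy to state quantitatively: the early portion of a logistic curve is practically indistinguishable from an exponential, and the absolute growth rate peaks at the inflection point and declines thereafter. A minimal Python sketch (with purely illustrative parameters, not Price's fitted values):

```python
import math

def logistic(t, K=1.0, r=0.05, t0=0.0):
    """Logistic curve: carrying capacity K, growth rate r, inflection at t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

# Illustrative parameters only (not Price's fitted values).
K, r, t0 = 1.0, 0.05, 0.0

# 1) Well before the inflection, the logistic is almost identical to pure
#    exponential growth K * exp(r * (t - t0)) -- so 400 years of apparently
#    exponential growth are consistent with the logistic hypothesis.
t = t0 - 100.0
pure_exp = K * math.exp(r * (t - t0))
rel_err = abs(logistic(t, K, r, t0) - pure_exp) / pure_exp
print(f"relative error of the exponential approximation: {rel_err:.4f}")

# 2) The absolute growth rate r*K*s*(1-s), with s the saturation fraction,
#    peaks at the inflection point and declines afterwards -- the predicted
#    slowdown of science.
def growth_rate(t):
    s = logistic(t, K, r, t0) / K
    return r * K * s * (1.0 - s)

print(growth_rate(t0) > growth_rate(t0 + 50.0))  # True: growth slows past t0
```

On this reading, testing Price's prediction reduces to asking whether the measured growth rate of the scientific literature has already passed such an inflection point.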
  • asked a question related to History and Philosophy of Science
Question
75 answers
Through many discussions on ResearchGate, I came to recognize that the majority of economists are still deeply influenced by Friedmanian methodology. One piece of evidence is that they take little care over the economic consistency and relevance of their models. They spend enormous time and effort on "empirical studies" and discuss the results, but they rarely question whether the basic theory on which their model rests is sensible. This ubiquitous tendency has grave effects in economics: neglect of theory and indulgence in empirics. I wonder why people do not discuss this state of economics. Economic science should restore a more suitable balance between theory and empirics.
It is clear that we should distinguish two levels of Friedmanian methodology.
 (1) Friedman's methodology and thought that is written in the texts, more specifically in his article The Methodology of Positive Economics (Chapter 7 of Essays in positive economics, 1953).
(2) The methodology that is believed to be Friedman's thought.
Apparently, (2) is much more important for this question. I have seen dozens of papers that examine Friedmanian methodology based on his text. Many of them find that the widely spread understanding does not correctly reflect Friedman's original message. They may be right, but what matters is the widely spread belief held in the name of Milton Friedman.
Relevant answer
Dear Shiozawa sensei and ResearchGate community,
I could not agree more with you when you state that all data-first theorists like Hoover, Hendry, Juselius, Johansen and Spanos are deeply influenced by F53. In the end, all of them follow a Marshallian approach. According to the four aspects of scientific research, they start from (3) and end up in (1). Regarding (3), it is necessary to recall that data-first theorists do not transform or curate data, since these are "market processes" and, according to Hendry (2011), are subject to three kinds of unpredictability: intrinsic, instance and extrinsic. In other words, they "let the data speak for themselves".
However, I don't think the vast majority of economists are influenced by F53 positivism or Popperian falsificationism in the strict sense, inasmuch as RBC and DSGE models (the most widespread models in economics), whose predictive power is not good, have not been ruled out. Professor Mário Amorim Lopes's explanation of the Popperian epistemological approach to the social sciences was really clear and compelling. For instance, these models were not able to predict the 2007/08 financial crisis and did not survive falsification, yet they are still used by the vast majority of central banks in several countries. Kirman (2010) stated that "The Economic Crisis is a Crisis for Economic Theory".
Now the question is: are DSGE models the best theory available? Are there other theories able to predict economic crises? Kirman (2010) supports the idea that Shiozawa sensei stands for (so do I): viewing an economy as a complex adaptive system, a set of interdependent elements (agents) organized in networks (without central control) which produce emergent aggregates and have the properties of adaptation and self-organization. In that sense, to move beyond DSGE scenarios with representative agents, rational expectations, Walrasian law (market clearing) and stochastic trends, it is necessary to build models that explain and predict economies with contagion, interaction, interdependence, networks and trust.
So far, we have established that it is necessary to construct models which treat economic crises as inherent to the evolution of the complex system. But can we identify the evolution of the system? This responsibility lies with two different hypotheses: i) former economic theories that have been ignored, like the Financial Instability Hypothesis of Hyman Minsky; and ii) approaches from other disciplines, such as econophysics (see Jovanovic and Schinkus, 2013; Rickles, 2008; and Sornette and Zhou, 2007).
Allow me to discuss some ideas on econophysics (I am deeply interested in this field). First of all, it is necessary to recall that financial market data present certain stylized facts: i) fat-tailed distributions (instance unpredictability, Hendry (2013); Taleb's Black Swan); ii) volatility; iii) autocorrelations (memory); iv) leptokurtosis; and v) clustering. Given these facts, the normal distribution, martingales and random walks, which are the workhorses of Fama's Efficient Market Hypothesis and therefore of DSGE models, do not shed light on financial market data. Econophysics, on the other hand, puts forward the use of truncated Levy-Pareto distributions, which address all the stylized facts stated above. These distributions are bell-shaped like Gaussian distributions, but unlike them they assign greater probability to events in the center and the tails of the distribution (economic crises) (Jovanovic and Schinkus, 2013).
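To make the tail contrast concrete, here is a small Python sketch. It uses the standard Cauchy distribution as the fat-tailed example, since it is the one Levy-stable law (alpha = 1) with an elementary closed-form tail; this is an illustration of the general point, not the truncated Levy-Pareto machinery of the cited papers:

```python
import math

def normal_tail(x):
    """P(X > x) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def cauchy_tail(x):
    """P(X > x) for a standard Cauchy, the alpha = 1 Levy-stable law."""
    return 0.5 - math.atan(x) / math.pi

# A "4-sigma-like" event is vanishingly rare under the Gaussian but routine
# under the fat-tailed law -- the statistical signature of crises.
for x in (2.0, 4.0, 6.0):
    print(f"x = {x}: normal {normal_tail(x):.2e}   cauchy {cauchy_tail(x):.2e}")
```

At x = 4 the Gaussian tail is of order 3e-5 while the Cauchy tail is close to 0.08: the fat-tailed law makes "extreme" events thousands of times more probable, which is exactly why Gaussian-based models understate crisis risk.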
Given that econophysics views economies as complex adaptive systems and provides a good explanation of economic crises, why are DSGE models still used? I think the answer lies in interests (Professor Karlsson emphasized this above) and in the arrogance of most orthodox economists. They are reluctant to rule out DSGE models and to accept developments coming from disciplines outside economics. I agree with Moisés Naím when he states that "while there may be budding intentions to appeal to other disciplines in order to enrich their theories (especially psychology and neuroscience), the reality is that economists almost exclusively study—and cite—each other". (http://www.theatlantic.com/business/archive/2015/04/economists-still-think-economics-is-the-best/390063/)
To sum up, I think neither Friedmanian positivism nor Popperian falsificationism is followed in the strict sense by the vast majority of economists. The current bulk of models do not care about predictions; they just follow the "discipline of equilibrium" (representative agents, Walrasian law and rational expectations). I used the example of economic crises and financial markets inasmuch as they are the most important falsification of DSGE models, but there are other falsifications in other fields, like economic growth and development (my dissertation addresses this, but unfortunately it is in Spanish and I have not translated it into English yet; my apologies).
Thanks a lot for sharing your valuable concepts on these related topics.
Édgar    
REFERENCES
1. Hendry, D. (2011). "Unpredictability in Economic Analysis, Econometric Modelling and Forecasting", Economics Series Working Papers 551, University of Oxford, Department of Economics.
2. Jovanovic, F. and Schinkus, C. (2013a). Towards a transdisciplinary econophysics. Journal of Economic Methodology, 20, 164-183.
3. Jovanovic, F. and Schinkus, C. (2013b). Econophysics: A new challenge for financial economics? Cambridge University Press, 319-352.
4. Kirman, A. (2010). The economic crisis is a crisis for economic theory. CESifo Economic Studies, 56, 483-535.
5. Rickles, D. (2008). Econophysics and the complexity of financial markets. Handbook of the Philosophy of Science, Volume 10, 133-152.
6. Sornette, D. and Zhou, W. (2007). Self-organizing Ising model of financial markets. The European Physical Journal B, 55(2), 175-181.
  • asked a question related to History and Philosophy of Science
Question
7 answers
Verificationism (according to Wikipedia) is an epistemological and philosophical position that takes a criterion of verification to be necessary and sufficient for the acceptance or validation of a hypothesis, a theory, or a single statement or proposition. Essentially, verificationism says that a statement added to a scientific theory which cannot be verified is not necessarily false, but basically meaningless, because it cannot be demonstrated against the empirical evidence of the facts. There could in fact be multiple internally logical statements for the explanation or interpretation of a certain phenomenon, of which, in principle, only one is by definition true.
Nonsense does not mean false; rather, the truth value of such a statement cannot be decided, and so the proposition can have no claim to be cognitive or foundational in scientific theory. A proposition is defined as any statement that may be assigned a truth value (in classical logic, true or false). A statement to which it is not possible to attribute such a value is therefore devoid of verifiability and so, for this kind of epistemology, devoid of sense, and is finally to be eliminated as mere opinion or metaphysical proposition. Verificationism is usually associated with the logical positivism of the Vienna Circle, and in particular with one of its greatest exponents, Moritz Schlick, whose basic theses can be summarized as follows:
The propositions with sense are those that can be verified empirically.
Science, through the scientific method, is the cognitive activity par excellence, since it bases the truth of its propositions on this verificationist criterion.
The propositions of metaphysics are meaningless, as they are based on illusory and unverifiable concepts. The propositions of metaphysics, says Carnap, express at most feelings or needs.
The valid propositions are, as the English empiricist Hume had claimed, the analytical ones, which express relationships between ideas (like mathematical propositions), and the propositions that express facts (such as the propositions of physics). Mathematics, like logic, expresses nothing about the world; it need not be empirically verifiable, but must serve to connect verifiable, meaningful propositions with one another, giving them the generality that contingent propositions lack.
• The purpose of philosophy is to carry out a critique of knowledge in order to eliminate all nonsensical propositions that claim to be cognitive. The philosopher must be able to perform on language both a semantic analysis (the relationship between reality and language) and a syntactic analysis (the relations among signs as they are linked together).
Verificationism has as its structural basis the search for a connection between statements and experience, that is, the sensations that give those statements meaning. This connection is called verification.
The epistemological attitude that gives rise to verificationism can be found throughout the history of philosophy and science: from Greek philosophy, through Thomas Aquinas and William of Ockham, to English empiricism, positivism, and the empirio-criticism of Avenarius and Mach.
According to English empiricism (whose leading exponents may be considered Locke, Berkeley and Hume), the only source of knowledge is experience.
As Berkeley says, "the objects of human knowledge are either ideas actually imprinted on the senses or ideas formed with the help of memory and imagination, compounding or dividing those perceived by the senses." So there is no way of formulating sentences or judgments other than from the data of experience, and the only way to verify their truth value is, again, experience. Judgments based on data that cannot be verified through experience therefore make no sense and are to be rejected as unscientific.
A position that takes the consequences of empiricism seriously is Hume's version. Considering that only experience can provide the truth value of a proposition, he rejects all propositions that claim universal validity. A law becomes true only if verified; but once it has been verified through experience, nothing can guarantee that the experience will recur whenever similar conditions present themselves. The verification of an empirical proposition is always contingent, never necessary. It is therefore difficult for Hume to give a definitive foundation to science in the traditional sense, i.e., as a body of knowledge that is certain and necessary.
The sciences, says the positivist Comte, must seek the immutable laws of nature, laws that hold regardless of any contingent experience that displays them to the senses and that must be fulfilled whenever the law so provides.
Some positivists (those holding the 'strong' principle of verification) note, however, that the principle of verifiability makes some metaphysical judgments significant, such as "The soul is immortal." Indeed, there is a method of verification: simply "wait a while and die". To prevent statements of this type from being endowed with sense, a stronger version of the principle of verifiability was elaborated. This states that a judgment has meaning only if it can be shown definitively true or false; i.e., there must be an experience that can show this truth value.
This version is called strong because it excludes any knowledge that is not empirical or logical, and therefore excludes giving sense to any expression that is not the result of empirical knowledge or of logical deduction from empirical propositions. This version of verificationism was criticized by some less radical positivists, such as Neurath and Carnap, for the simple reason that, if verification is necessary to give sense to a proposition, then even the principle of verifiability itself must be verified, and this is not possible.
Numerous propositions of common use, whose meaning seems clear from the terms we use, are unverifiable, such as statements about the past or the future: for example, "Churchill sneezed 47 times in 1949" or "Tomorrow it will rain." These propositions can, in principle, be verified, since a method for their verification can be provided; for the 'weak version' of the principle of verifiability they are thus endowed with meaning, but for the 'strong version' they are mere nonsense.
Assertions about the Absolute, and metaphysical assertions in general, are to be rejected, at least as propositions to which it is possible to apply the positive verificationist method, even though this does not exclude their existence: trying to deny a metaphysical proposition makes as much sense as trying to prove it. Metaphysical propositions are therefore set aside, unrebutted.
Comte rejects so-called absolute empiricism, which states that any proposition not established by the facts is to be rejected as senseless and therefore cannot be taken as a scientific proposition.
Special mention must be made of mathematics: for Comte, it is not a science but a language, and therefore the basis of every positive science. Mathematics, like logic (as the logical empiricists would say), has the purpose of showing the connections between propositions in order to preserve their truth value, not to produce new values. The propositions of mathematics are a priori truths; as such, they cannot be verified and therefore say nothing about the world, but tell us how the world must be spoken of after we have experienced it.
Perhaps the best-known critique of the principle of verifiability is provided by Popper. Though its main critic, he never abandons the convictions of the positivist programme and the idea that science has a rational and deductive structure, though describable in ways other than those contemplated by Schlick. In particular, the principle of verification, in both its weak and strong versions, is abolished and replaced by that of falsifiability. This principle is in fact an admission of the impossibility of science arriving at statements that can claim to be verified as they stand, and also a condemnation of the principle of induction insofar as it claims to provide a basis for the formulation of necessary laws. Popper says that billions of checks are not enough to determine whether a given theory is certain; a single falsification is enough to show that it is not true. Carnap's criterion of testability becomes the possibility of a statement being subjected to falsification, and the structure of science, as Hume had already indicated, is such that it does not confirm hypotheses; at most, it falsifies them. The experiments to which the laws of science are subjected are useful when they try to falsify those laws, not when they try to verify them.
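The logical asymmetry Popper appeals to can be shown in a toy Python fragment (the swan example is the textbook illustration; the data here are of course made up): no finite number of confirming instances proves a universal hypothesis, while a single counterexample refutes it.

```python
def is_white(swan):
    return swan == "white"

# A million confirming instances: the universal claim "all swans are white"
# still is not verified, since the next observation may refute it.
observations = ["white"] * 1_000_000
print(all(is_white(s) for s in observations))   # True, yet nothing is proven

# One black swan suffices to falsify the hypothesis conclusively.
observations.append("black")
print(all(is_white(s) for s in observations))   # False: falsified
```

The asymmetry is between `all(...)` over an open-ended domain, which no finite sample can secure, and a single failing case, which settles the matter.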
Criticisms burying verificationism came from so-called post-positivist epistemology, whose leading exponents are Kuhn, Lakatos and Feyerabend. In varying degrees, all three claim that a fact cannot be verified because bare facts do not even exist; facts can only be represented within a theory already considered scientific. Therefore, there is no distinction between observational terms and theoretical terms, and even the concepts considered basic to science do not possess the same meaning when conceived within two different theories (think, for example, of the concept of mass for Newton and for Einstein). According to post-positivism, even science itself is not purely empirical, because its data are not empirically verifiable without theory, and there is no criterion of significance; that is, it is not possible to separate a scientific statement from one concerning other human activities.
Finally, we follow the position of Professor Franco Giudice, according to whom, in "Testability and Meaning" (1936-1937), Rudolf Carnap recognizes that absolute verification in science is almost impossible. The criterion of significance must therefore change: the principle of verification must be replaced with the concept of confirmation; a proposition is significant if, and only if, it is confirmable. The criterion of verifiability of propositions consists only of gradually increasing confirmations. Thus, the acceptance or rejection of a proposition depends on the conventional decision to consider a given degree of confirmation as sufficient or insufficient. On the earlier view, by contrast, the meaning of a proposition was determined by the conditions of its verification (the verification principle): a proposition is significant if, and only if, there is an empirical method for deciding whether it is true or false; if no such method is given, it is an insignificant pseudo-proposition.
Relevant answer
Answer
What you are referring to when saying "There could in fact be multiple statements inherently logical for the explanation / interpretation of a certain phenomenon" is what philosophers of science call 'the underdetermination of theory by data'.  That is, the same phenomenon can be explained by multiple theories, which are often incompatible with each other.  this underdetermination increases when our theories postulate entities or processes that are unobservable.  Thus, given the nature of the scientific method and especially the inherent problem of underdetermination, no philosopher of science (not even the positivists) ever endorsed verificationism. This is because, whether one is dealing with universal laws or statistical laws, the verification of a hypothesis would require that all possible cases covered by that hypothesis be tested and that each of these tests confirm the hypothesis.  But it is impossible to test all possible cases, thus the best we can have is a high degree of confirmation.  One must recall that, for the logical positivists, the only real statements that could be verified were either analytic statements, whose truth could be established a priori via a simple analysis of the relation between subject and predicate, and observation statements, whose truth could be established by comparing the statement with a direct observation. But, scientific laws and hypotheses do not meet either of these criteria because they explain phenomena by reference to theoretical entities (which are unobservable by definition) and because they cover all possible cases of the phenomena in question (which cannot be observed by definition).  In The Philosophical Foundations of Physics, Rudolf Carnap, himself one of the great proponents of logical positivism, argues precisely this point by stating "At no point is it possible to arrive at complete verification of a law.  
In fact, we should not speak of 'verification' at all - if by the word we mean definitive establishment of truth - but only of confirmation." 
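Carnap's picture of gradually increasing confirmation can be illustrated with a toy Bayesian sketch (my own illustration, not Carnap's formalism; the likelihood values below are arbitrary assumptions chosen for the example). Each confirming observation raises the probability of the hypothesis, yet no finite number of confirmations ever reaches certainty:

```python
def confirm(prior, p_e_given_h=0.9, p_e_given_not_h=0.3):
    """One Bayesian update on a confirming observation E.

    p_e_given_h and p_e_given_not_h are illustrative likelihoods,
    not values drawn from Carnap's work.
    """
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

p = 0.5  # start undecided about hypothesis H
for n in range(1, 11):
    p = confirm(p)
    print(f"after {n:2d} confirmations: P(H|E) = {p:.6f}")

# The degree of confirmation climbs toward 1 but never reaches it:
# whether, say, p = 0.9999 counts as "sufficient" is, as Carnap says,
# a conventional decision, not a verification.
```

The point of the sketch is exactly the one made above: the best any finite run of tests can deliver is a high degree of confirmation, never complete verification.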
  • asked a question related to History and Philosophy of Science
Question
71 answers
Should hypotheses always be based on a theory? I will provide an example here without variable names. I am reading a paper where the authors argue that X (an action) should be related to Y (an emotion). In order to support this argument the authors suggest that when individuals engage in X, they are more likely to feel a sense of absorption and thus they should experience Y. There is no theory here to support the relationship between X and Y. They are also not proposing absorption as the mediator. They are just using this variable to explain why X should lead to Y. Would this argument be stronger if I used a theory to support the relationship between X and Y? Can someone refer me to a research paper that emphasizes the need for theory driven hypotheses? Thanks!
Relevant answer
Answer
A hypothesis is a tentative proposition, or posit, based on insufficient knowledge to be sure that it is factual. A hypothesis is proposed for testing.
If much testing affirms the correctness of a hypothesis, and it is generally accepted, it can then become accepted as a theory. However, theories can still be challenged, and they may be modified or even discarded altogether if much contrary knowledge is acquired and presented.
If a theory is rock-solid and apparently beyond any dispute, it can be accepted as a law. There are laws in physics, for example. However, laws are very scarce, or non-existent, in other disciplines such as biology.
Paradigms are also interesting, if you are keen. They are, very roughly, generally accepted principles within which research is conducted, but they may be overthrown and replaced by a new paradigm during a scientific revolution.
Note that in non-scientific language, in common speech, even an idea or a train of thought may commonly be referred to as a 'theory', the word 'hypothesis' is not generally known or used, and 'law' usually refers only to the legal system.
I hope this helps Alex,
Regards,
Keith
  • asked a question related to History and Philosophy of Science
Question
94 answers
I am quite surprised that everybody says Galileo was the first to describe the relativity of motion scientifically, contrary to the fact that at least Copernicus did it earlier, and in quite explicit form:
"Every observed change of place is caused by a motion of either the observed
object or the observer or, of course, by an unequal displacement of each. For when things move with equal speed in the same direction, the motion is not perceived, as between the observed object and the observer."
NICHOLAS COPERNICUS OF TORUŃ, THE REVOLUTIONS OF THE HEAVENLY SPHERES, 1543.
I am also surprised from time to time by statements that it was Galileo who proposed the heliocentric system.
It's an interesting aspect of the distortion of historical facts. Any thoughts or other examples of similar injustice? Why does it take place?
Relevant answer
Answer
Apart from the issues of the relativity of motion and the heliocentric picture of what we now call the solar system, you ask why injustices of this kind take place. That's not really a philosophical question, but my unphilosophical answer is that the professionals and academics who pontificate en passant upon the history of science have usually not done their homework, nor do they care to do it, preferring to pass on whatever gossip or confabulation fits their rhetorical contrivance of the moment. It's even worse than you might think. If you have written serious scientific review articles, you will find that the "findings" of what are considered to be important scientific papers are regularly mis-described in "the literature". Authors who refer to other authors very often also don't care to do their homework. Academia has become pretty shoddy—much of what's produced is bad journalism, and there's not much lower than that! But ignorant comments on the history of science are particularly ubiquitous because so many contemporary scientists (and philosophers) think of earlier science as just a rambling, obsolete course of misguided and inept fumbling that is of no real interest — now that we know the truth! — although frequently tempted by the urge to identify among their forbears some "good guys" or "bad guys" for rhetorical purposes. As the French would say, "It's not serious".
  • asked a question related to History and Philosophy of Science
Question
277 answers
This refers to the recent experiments of Radin et al :
1) D. Radin, L. Michel, K. Galdamez, P. Wendland, R Rickenbach and A. Delorme
Physics Essays, 25, 2, 157 (2012).
2)  D. Radin, L. Michel, J. Johnston and A. Delorme, Physics Essays, 26, 4, 553 (2013).
These experiments show that observers can affect the outcome of a double-slit experiment, as evidenced by a definite change in the interference pattern.
It requires urgent attention from the scientific community, especially Physicists.
If these observed effects are real, then we must have a scientific theory that can account for them.
Relevant answer
Answer
@Rajat.
I don't know to which point of the debate the queer thoughts of your first paragraph are meant to be a contribution.
For science it is not a problem to be faced with empirical facts for which an explanation is presently out of reach. During most of the nineteenth century, astronomers knew that no chemical reaction could deliver the radiative power that was observed to originate in stars. One had to wait for the discovery of atomic nuclei, and a preliminary understanding of their internal workings, before the radiation of stars could be explained. If there are clear facts, science will find an explanation, perhaps only after a few hundred years.
Pseudoscience is characterized by the absence of clear facts. Pseudoscientists have enough knowledge of science to impress uncritical people, but are unable to arrange experiments and observations in a manner such that repetition by independent groups reproduces the original findings.
  • asked a question related to History and Philosophy of Science
Question
22 answers
I'm interested in comparing Indigenous research methods with other ancient cultures. Indigenous research methods are relatively well documented for Australian Aboriginals, New Zealand Maori and North American Indians. I was hoping to locate examples of other non-Western (non-Eurocentric) research methods used by cultures, such as China, Africa, South America, India etc. For example, what methodology did the Chinese use to develop their knowledge of Chinese medicine? I realise these methods may not have been documented or may be in a non-English language. Any leads would be helpful at this stage.
Relevant answer
Though I am not a specialist on ancient science, as Egyptologist I can recommend some references for medicine and other fields, as, for instance, J. F. Nunn, 'Ancient Egyptian Medicine', where you can easily find the medical procedures and knowledge of ancient Egyptians. You can also find some remarks in:
-N. Baum, "L'organisation du règne végétal dans l'Égypte ancienne...", in: S. Aufrère (ed.), 'Encyclopédie religieuse de l'univers végétal de l'Égypte ancienne I', Montpellier, pp. 421-443, 1999.
-N. Beaux, 'Le cabinet de curiosités de Thoutmosis III. Plantes et animaux du 'jardin botanique' de Karnak', Leuven, 1990.
-S. Uljas, "Linguistic Consciousness", in: UEE, available at the website: https://escholarship.org/uc/item/0rb1k58f
Of course, some interesting remarks are available in the classical work of C. Lévi-Strauss, 'La pensée sauvage'.
I hope this can be useful for you.
Regards
  • asked a question related to History and Philosophy of Science
Question
126 answers
While scientific cosmology rarely appears in the work of Karl Popper, it is nevertheless a subject that interested him. The problem now is whether the falsifiability criterion can be applied to cosmological theories.
For instance, there are certain ideas in cosmology which have never been refuted, but instead the same methods are used over and over despite their lack of observational support, for instance the multiverse idea (often used in string theory) and also the Wheeler-DeWitt equation (often used in quantum cosmology).
So do you think that Popperian falsifiability can be applied to cosmology too? Your comments are welcome.
Relevant answer
Answer
Clifford,
Apparently, your answer to my question is negative, namely, Popper's falsifiability is useless in the exact sciences. I fully concur with this conclusion if: 1) falsifying a theory is equivalent to refuting it, and 2) a refuted theory must be taken out of circulation. Popper stated the first, while the second appears to be a generally accepted implication. If the latter is not true, what is the purpose of applying falsifiability to physics? However, if the implication is correct, it contradicts the whole history of physics, which shows that nothing dramatic happened to a theory which did not agree with a certain experiment. Physicists continued using it - even unmodified - in the areas (or under conditions) where such disagreements do not occur. And frequently they were capable of modifying the theory so as to explain not only the experiment in question but a range of other phenomena.
Apparently, Popper reasoned as follows: AFTER a new theory is published, someone offers a new experiment which the theory cannot explain, and therefore it is refuted. In reality, many authors were aware of such exceptions even BEFORE publishing their theories. Nonetheless, they proceeded with their publications, because they believed that a theory which explained even a few phenomena has a right to exist. They hoped that future developments would extend the range of applications of their theories.
Here is an example.  Thomas Young’s 1801 paper had a very general title “On the Theory of Light and Colours”.  Yet, he did not plan to explain in that paper ALL phenomena of light and colors.  In fact, he applied then his theory only to 3 phenomena of colors, namely, those produced by parallel scratches on glass, by thin films, and by thick glass plates imperfectly polished.  In subsequent papers he extended his theory to a few more phenomena, then Fresnel and Arago added some more, even without modifying the theory.
One can multiply such examples at will, and the general conclusion will be that the concept of falsifiability had been useless in the older physics.  Incidentally, originally Popper introduced the concept to distinguish “scientific” theories from “non-scientific” ones, such as astrology or Marxist theory of history, which is not the same as separating “true” physical theories from the “false” ones.
  • asked a question related to History and Philosophy of Science
Question
23 answers
          My objective is to accumulate and demonstrate irrefutable physical evidence to prove that the existing definitions for software components and CBSE/CBSD are fundamentally flawed. Today no computer science text book that introduces software components and CBSD (component-based design for software products) presents the assumptions (i.e. first principles) that resulted in such flawed definitions for software components and CBSD.
In real science, anything not having irrefutable proof is an assumption. What are the undocumented scientific assumptions (or first principles) at the root of computer science that resulted in fundamentally flawed definitions for so called software components and CBD (Component Based Design) for software products? Each of the definitions for each kind of so called software components has no basis in reality but in clear contradiction to the facts we know about the physical functional components for achieving CBD of physical products. What are the undocumented assumptions that forced researchers to define properties of software components, without giving any consideration to reality and facts we all knows about the physical functional components and CBD of physical products?
Except text books for computer science or software engineering for introducing software components and CBSD (Component Based Design for software products), I believe, first chapter of any text book for any other scientific discipline discusses first principles at the root of the scientific discipline. Each of the definitions and concepts of the scientific discipline is derived by relying on the first principles, observations (e.g. including empirical results) and by applying sound rational reasoning. For example, any text book on basic sciences for school kids starts by teaching that “Copernicus discovered that the Sun is at the center”. This is one of the first principles at the root of our scientific knowledge, so if it is wrong, a large portion of our scientific knowledge would end up invalid.
I asked countless experts why we need a different and new description (i.e. definitions and/or list of properties) for software components and CBSD, where the new description, properties and observations are in clear contradiction to the facts, concepts and observations we know about physical functional components and the CBD of large physical products (having at least a dozen physical functional components). I was given many excuses/answers, such as that software is different/unique, or that it is impossible to invent software components equivalent to physical functional components.
All such excuses are mere undocumented assumptions. It is impossible to find any evidence that anyone ever validated these assumptions. Such assumptions must be documented, but no text book or paper on software components even mentions the baseless assumptions relied on to conclude that each kind of useful part is a kind of software component, for example, that reusable software parts are a kind of software component. Then CBD for software is defined as using such fake components. Using highly reusable ingredient parts (e.g. plastic, steel, cement, alloy or silicon in wafers) is not CBD. If anyone asks 10 different experts for a definition/description of software components, he gets 10 different answers (without any basis in the reality we know about physical components). Only God has more mysterious descriptions, as if no one alive has ever seen a physical functional component.
The existing descriptions and definitions for so called CBSD and so called software components were invented and made out of thin air (based on wishful thinking) by relying on such undocumented myths. Today many experts defend the definitions by using such undocumented myths as inalienable truths of nature, not much different from how researchers defended epicycles by relying on assumption ‘the Earth is static’ up until 500 years ago. Also most of the concepts of CBSD and software components created during past 50 years derived by relying on such fundamentally flawed definitions of software components/CBSD (where the definitions, properties and descriptions are rooted in undocumented and unsubstantiated assumptions).
Is there any proof that it is impossible to invent real software components equivalent to the physical functional components for achieving real CBSD (CBD for software products), where real CBSD is equivalent to the CBD of large physical products (having at least a dozen physical functional components)? There exists no proof for such assumptions are accurate, so it is wrong to rely on such unsubstantiated assumptions. It is fundamental error, if such assumptions (i.e. first principles) are not documented.
I strongly believe, such assumptions must be documented in the first chapters of each of the respective scientific disciplines, because it forces us to keep the assumptions on the radar of our collective conscious and compels future researchers to validate the assumptions (i.e. first principles), for example, when technology makes sufficient progress for validating the assumptions.
I am not saying, it is wrong to make such assumptions/definitions created for software components 50 years ago. But it is huge error to not documenting the assumptions, on which they relied upon for making such different and new definitions (by ignoring reality and known facts). Such assumptions may be acceptable and true 50 years ago (when computer science and software engineering was in infancy and assembly language and FORTRAN were leading edge languages), but are such assumptions still valid? If each of the first principles (i.e. assumptions) is a proven fact, who proved it and where can I find the proof? Such information must be presented in the first chapters.
In real science, anything not having irrefutable proof is an assumption. Are such undocumented, unsubstantiated assumptions facts? Don't the computer science text books on software components need to document proof for such assumptions before relying on such speculative, unsubstantiated assumptions to define the nature and properties of software components? All the definitions and concepts for software components and CBSD could be wrong if the undocumented and unsubstantiated assumptions end up having huge errors.
My objective is to provide physical evidence (i) to prove that it is possible to discover accurate descriptions for the physical functional components and CBD of large physical products (having at least a dozen physical functional components), and (ii) to prove that it is not hard to invent real software components (that satisfy the accurate description for the physical functional components) for achieving real CBSD (that satisfy the accurate description for the CBD of physical products), once the accurate descriptions are discovered.
It is extremely hard to expose an error at the root of any deeply entrenched paradigm, such as CBSE/CBSD (evolving for 50 years) or the geocentric paradigm (which evolved for 1000 years). For example, the assumption "the Earth is static" was considered an inalienable truth (not only of nature but also of God/the Bible) for thousands of years, but it ended up a flaw and sidetracked the research efforts of countless researchers of the basic sciences into a scientific crisis. Now we know that no meaningful scientific progress would have been possible if that error had not been exposed. The only possible way to expose such an error is by showing physical evidence, even if most experts refuse to see it, by finding a few experts who are willing to view the physical evidence with an open mind.
I have lot of physical evidence and now in the process of building a team of engineers and necessary tools for building software applications by assembling real software components for achieving real CBSD (e.g. for achieving CBD-structure http://real-software-components.com/CBD/CBD-structure.html by using CBD-process http://real-software-components.com/CBD/CBD-process.html). When our tools and team is ready, we should be able to build any GUI application by assembling real software components.
In real science, any thing not having irrefutable proof is an assumption. Any real scientific discipline must document each of the assumptions (i.e. first principles) at the root of the scientific discipline, before relying on the assumptions to derive concepts, definitions and observations (perceived to be accurate, only if the assumptions are proven to be True):  https://www.researchgate.net/publication/273897031_In_real_science_anything_not_having_proof_is_an_assumption_and_such_assumptions_must_be_documented_before_relying_on_them_to_create_definitionsconcepts
I tried to write papers and give presentations to educate people about the error, but none of them worked. I learned the hard way that this kind of complex paradigm shift can't happen in just a couple of hours' presentation or by reading 15 to 20 page papers. The only way left for me to expose the flawed first principles at the root of a deeply entrenched paradigm is to find experts willing to see physical evidence, and to show them that evidence: https://www.researchgate.net/publication/273897524_What_kind_of_physical_evidence_is_needed__How_can_I_provide_such_physical_evidence_to_expose_undocumented_and_flawed_assumptions_at_the_root_of_definitions_for_CBSDcomponents
So I am planning to work with willing customers to build their applications, which gives us few weeks to even couple of months time to work with them to build their software by identifying ‘self-contained features and functionality’ that can be designed as replaceable components to achieve real CBSD.
How can I find experts or companies willing to work with us to see the physical evidence, for example, by allowing us to work with them to implement their applications as a CBD-structure? What kind of physical evidence would be compelling to anyone willing to give us a chance (at no cost to them, since we can work for free to provide compelling physical evidence)? I have failed so many times in this complex effort, so I am not sure what could work. Would this work?
Best Regards,
Raju
Relevant answer
Answer
Raju,
"When any scientific discipline was in infancy, researchers are forced to make assumptions."
We are always making assumptions, infancy or not. You've mentioned the geocentric model of the universe. It is as valid an assumption as the non-geocentric model. Although I won't try it (I'm guessing it would take a while), it's probably possible to base our whole physics on the geocentric model with no significant loss of accuracy. It's numbers, and you can "engineer" your way to whatever assumption you want to believe. The reason the non-geocentric model is accepted is just that it makes more intuitive sense. So common sense is the great decider.
Science is not the art of truth, it's the art of the accurate. Better science does not mean you are closer to the truth (you would have to know the truth to be able to claim that!), it just means you are able to model a phenomenon more accurately. Nowadays modern societies seem to elevate science almost to religious status (we keep assuming dogmas, only this time with university degrees). Personally, I see it as a fork: I know it, I use it, and then I wash it for the next meal.
Besides the philosophical ideas, I would suggest you define a few specific points open to debate. It's very complicated to debate such a broad subject; it's too vague. I've read your comments and I still can't put my finger on exactly what you are trying to reform (in good part maybe because I don't have enough skill in some of the areas you're touching).
  • asked a question related to History and Philosophy of Science
Question
15 answers
I am looking for information on the history of the development of statistical significance formulae, the mathematical calculations and why they were chosen.
I would also like to learn the same about effect size.
Thanks!
Relevant answer
Answer
Depends what you mean by statistical significance. Several books cover the history of statistics and probability pre-1900 (Stigler's History of Statistics and Hacking's Emergence of Probability, being two of the most well known). For more on the past 100 years, Gigerenzer's Empire of Chance is excellent. There are others, but I'll let other commentators list their favorites.
If you mean statistical significance as just the approaches of Fisher, Neyman, Pearson, etc., to hypothesis testing and p values, their papers are available, along with much discussion of them (actually, Gigerenzer is the author of one on this, something like the Id, Ego, and Super-Ego of statistical reasoning ... heh, it's here (http://www.mpib-berlin.mpg.de/en/institut/dok/full/gg/ggstehfda/ggstehfda.html)). Another is Lehmann's Fisher, Neyman, and the Creation of Classical Statistics.
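To make the two quantities in the question concrete, here is a minimal sketch in stdlib Python of (a) a two-sided p-value via the large-sample z approximation (a t distribution would be the historically accurate choice for small samples, but its CDF is not in the standard library) and (b) Cohen's d as a standardized effect size. The sample data are invented for illustration:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    # Unbiased sample variance (divides by n - 1)
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cohens_d(a, b):
    # Standardized mean difference using the pooled standard deviation
    na, nb = len(a), len(b)
    pooled = math.sqrt(((na - 1) * sample_var(a) + (nb - 1) * sample_var(b))
                       / (na + nb - 2))
    return (mean(a) - mean(b)) / pooled

def z_test_p(a, b):
    # Two-sided p-value from a z approximation to the two-sample test
    se = math.sqrt(sample_var(a) / len(a) + sample_var(b) / len(b))
    z = (mean(a) - mean(b)) / se
    return math.erfc(abs(z) / math.sqrt(2))

a = [1, 2, 3, 4, 5]
b = [3, 4, 5, 6, 7]
print(round(cohens_d(a, b), 4))  # -1.2649
print(round(z_test_p(a, b), 4))  # 0.0455
```

Note how the two numbers answer different questions: the p-value measures how surprising the data would be under the null hypothesis, while Cohen's d measures how large the difference is in standard-deviation units, which is exactly why effect size developed as a separate tradition.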
  • asked a question related to History and Philosophy of Science
Question
64 answers
     
It is known that physics is an empirical science, in the sense that all propositions should be verified by experiments. But Bertrand Russell once remarked that the principle of verifiability itself cannot be verified, and therefore it cannot be considered a principle of science.
In a 1917 paper, Russell suggested sense-data as a way around the problem of verifiability in physics (http://selfpace.uconn.edu/class/ana/RussellRelationSenseData.pdf), but later he changed his mind; see http://www.mcps.umn.edu/philosophy/12_8savage.pdf
So what do you think? Is there a role for sense-data in epistemology of modern physics?
Relevant answer
Answer
Yes, I always find Tim good at lucid exposition of a problem and its origins. I am less sure that he is good at a radical solution, but that's more difficult!
  • asked a question related to History and Philosophy of Science
Question
5 answers
Section II of “The fixation of belief” [2] opens dramatically with a one-premise argument—Peirce’s truth-preservation argument PTPA—concluding that truth-preservation is necessary and sufficient for validity: he uses ‘good’ interchangeably with ‘valid’. He premises an epistemic function and concludes an ontic nature.
The object of reasoning is determining from what we know something not known.
Consequently, reasoning is good if it gives true conclusions from true premises, and not otherwise.
Assuming Peirce’s premise for purposes of discussion, it becomes clear that PTPA is a formal fallacy: reasoning that concludes one of its known premises is truth-preserving without “determining” something not known. It is conceivable that Peirce’s conclusion be false with his premise true [1, pp. 19ff].
The above invalidation of PTPA overlooks epistemically important points that independently invalidate PTPA: nothing in the conclusion is about reasoning producing knowledge of the conclusion from premises known true: in fact, nothing is about premises known to be true, nothing is about conclusions known to be true, and nothing is about reasoning being knowledge-preservative.
The following is an emended form of PTPA.
One object of reasoning is determining from what we know something not known.
Consequently, reasoning is good if it gives knowledge of true conclusions not among the premises from premises known to be true, and not otherwise.
PTPA has other flaws. For example, besides being a formal non-sequitur, PTPA is also a petitio principii [1, pp. 34ff]. Peirce's premise not only isn't known to be true (which would be enough to establish question-begging) but is false: reasoning also determines consequences of premises not known to be true [1, pp. 17f].
[1] JOHN CORCORAN, Argumentations and logic, Argumentation, vol. 3 (1989), pp. 17–43.
[2] CHARLES SANDERS PEIRCE, The fixation of belief, Popular Science Monthly. vol. 12 (1877), pp. 1–15.
Q1 Did Peirce ever retract PTPA?
Q2 Has PTPA been discussed in the literature?
Q3 Did Peirce ever recognize consequence-preservation as a desideratum of reasoning?
Q4 Did Peirce ever recognize knowledge-preservation as a desideratum of reasoning?
Q5 Did Peirce ever retract the premise or the conclusion of PTPA?
Relevant answer
Answer
One shouldn't give an argument about what "good reasoning" is. One should just stipulate a definition, and leave it at that.
For what it's worth, however, perhaps Peirce didn't take himself to be giving an argument about what "good reasoning" is, and his "premise" was just his unorthodox way of stipulating a definition of "good reasoning," so that, by "the object of reasoning," he meant "that which would make reasoning good." On this reading, the purpose of his conclusion is simply to point out a CONSEQUENCE of his definition of "good reasoning" -- namely, that valid reasoning is necessary and sufficient for good reasoning. On the face of it, THIS ARGUMENT -- from the definition to the consequence of the definition -- might seem to be invalid; yet it's hard to tell without knowing what he meant by "determining," etc.
  • asked a question related to History and Philosophy of Science
Question
115 answers
In The Nature of the Physical World, Eddington wrote:
The principle of indeterminacy. Thus far we have shown that modern physics is drifting away from the postulate that the future is predetermined, ignoring rather than deliberately rejecting it. With the discovery of the Principle of Indeterminacy its attitude has become definitely hostile.
Let us take the simplest case in which we think we can predict the future. Suppose we have a particle with known position and velocity at the present instant. Assuming that nothing interferes with it we can predict the position at a subsequent instant. ... It is just this simple prediction which the principle of indeterminacy expressly forbids. It states that we cannot know accurately both the velocity and the position of a particle at the present instant.
--end quotation
According to Eddington, then, we cannot predict the future of the particular particle beyond a level of accuracy related to the Planck constant (We can, in QM, predict only statistics of the results for similar particles). The outcome for a particular particle will fall within a range of possibilities, and this range can be predicted. But the specific outcome, regarding a particular particle is, we might say, sub-causal, and not subject to prediction. So, is universal causality (the claim that every event has a cause and when the same cause is repeated, the same result will follow) shown false as Eddington holds?
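The quantitative bound behind Eddington's remark is the Heisenberg uncertainty relation which, in its modern form for position and momentum, reads:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}
```

The product of the two standard deviations is bounded below by a constant of the order of the Planck constant, so making the position arbitrarily sharp forces the momentum spread, and hence the predicted trajectory, to blur.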
Relevant answer
Answer
Social science has always followed the mothership of science, physics. But in the last few decades, especially post-Heisenberg, physics has become comfortable with the quantum worldview, while social science researchers (who study or build our perception of causality) are still stuck with the classical Newtonian worldview and are therefore having a tough time grappling with the physical science discoveries around causality. There are five main aspects where this is becoming difficult:-
1) Difference of perception of space-time characteristics by different observers (Theories of Relativity)
2) Unified Theories could exist (unified field theory, string theory..)
3) Effects with no physical/ observable influence medium (Quantum Entanglement)
4) Inseparability of microcosm and macrocosm (Single Electron Universe)
5) No measurement, or only probabilistic measurement (deterministic quantum physics). Observation may not always translate to measurement; that is OK, since even if the instruments can't measure, an observation can still be useful.
Consequently, social science research outputs a narrow view of reality: the part of reality which is measurable through classical means driven by the Newtonian paradigm. Since reality has many more dimensions, experiential learning and practice are getting divorced from social science research due to this bottleneck of the Newtonian cognitive framework of causality.
Attached is an 1876 English translation of a classical Sanskrit text, which was written to know reality through substance (not matter alone) and reasoning. We are trying to bring that into research methodologies for a better view of causality. Hope this is useful.
  • asked a question related to History and Philosophy of Science
Question
236 answers
It is true that mathematics was done through argumentation, discourse and rhetoric in ancient times. The volumes of Euclid's Elements contain no symbols describing the behavior of properties at all, apart from the labels on geometric objects. The symbols of arithmetic (=, +, -, ×, ÷) were created between the 15th and 17th centuries, which most people find hard to believe. The signs "+" and "-" appeared in print in the late 15th century, the equality sign "=" was introduced by Robert Recorde in 1557, the multiplication symbol "×" appeared in 1631, and the division sign "÷" in 1659. It runs contrary to most people's beliefs how recent the creation of these symbols is.
It is because of this lack of symbols that mathematics did not develop as fast as it has since symbols were introduced and representation, the writing of expressions and algebraic manipulation were made handy, enjoyable and easy.
These developments paved the way for the progress of mathematics into a galaxy of mathematics. What is your take on this issue, and what is your expertise on the chronology of symbol creation and the advances mathematics has made because of it?
"Notation, notation, notation: a brief history of mathematical symbols" by Joseph Mazur, Science, theguardian.com
Relevant answer
Answer
Leibniz was the master of symbol creation!  He created symbols that packaged meaning,  helped cognition, stimulated generalization, and eased manipulation.  He thought about them with care before committing to their use.  William Oughtred invented hundreds of new symbols, but hardly any of them are still in use.  Goes to show that willy-nilly made symbols don't have a good survival rate, for good reasons.
  • asked a question related to History and Philosophy of Science
Question
148 answers
The British astrophysicist, A.S. Eddington wrote (1928), interpreting QM, "It has become doubtful whether it will ever be possible to construct a physical world solely out of the knowable - the guiding principle of our macroscopic theories. ...It seems more likely that we must be content to admit a mixture of the knowable and the unknowable. ...This means a denial of determinism, because the data required for a prediction of the future will include the unknowable elements of the past. I think it was Heisenberg who said, 'The question whether from a complete knowledge of the past we can predict the future, does not arise because a complete knowledge of the past involves a self-contradiction.' "
Does the uncertainty principle imply, then, that particular elements of the world are unknowable, - some things are knowable, others not, as Eddington has it? More generally, do results in physics tell us something substantial about epistemology - the theory of knowledge? Does epistemology thus have an empirical basis or empirical conditions it must adequately meet?
Relevant answer
Answer
Jerzy, Ray Streater is relatively well-known as a spokesperson for a community of people who do not believe in any shade or stripe of wave function realism (we're not even talking of wave function monism here.)
It is then already a matter of interpretation.
Sane & sound people will rather convincingly argue that a Schrödinger equation legitimately can apply to several variables, and that there is ample experimental evidence for that - which is however basically what Streater disputes if we follow his line of reasoning.
What I'm trying to say is that this line of argument is very far from being cut and dried.
  • asked a question related to History and Philosophy of Science
Question
27 answers
Many scientists differentiate the hard physical sciences from philosophy; some even say "that's not science, it's philosophy". Are they missing the point in a big way?
Relevant answer
Answer
There are philosophies that apply to all sciences (e.g. how to define, design and conduct a field experiment to test a hypothesis X, whatever the research domain; e.g. Hurlbert 1984; Ecology, Psychology, Human Sciences, Ethology, Political Sciences, Behaviour, etc.....) and there are philosophies that are specific to each research domain (e.g. how to define, design and conduct a playback experiment to test the messages and meanings of bird song in a single model species; Behavioral Ecology, Ornithology).
  • asked a question related to History and Philosophy of Science
Question
21 answers
I recently published my book "The Origin of Science" which can be downloaded at https://www.researchgate.net/profile/Louis_Liebenberg/publications/ I am interested in alternative theories on the origin of science and how this debate can lead to a better understanding of how our ability for scientific reasoning evolved.
Relevant answer
Answer
Hi Mike
Hunter-gatherers not only develop applied science, but also develop knowledge for the sake of knowledge. For example, the /Gwi Bushmen of the Kalahari have eleven species-specific names for ants, including the velvet ant (a wingless wasp), and termites. They have developed a level of detail in their knowledge of ants that far exceeds the practical requirements of hunting. But as you point out, the fact that we can store more knowledge than is immediately relevant prepares us for unforeseen eventualities in the future.
I think the political dimension in modern science may well stifle creative innovation by limiting academic freedom. Government funding may make it possible to get a lot of research done in the sense of Kuhn's "normal science" - but creative innovation requires institutions to allow researchers academic freedom, or alternatively creative individuals may choose to work independently. Hunter-gatherers allow a large degree of "academic freedom" in the interpretation of animal tracks and signs - ultimately it is the predictive value of hypotheses that result in successful hunts.
  • asked a question related to History and Philosophy of Science
Question
40 answers
Is it reasonable to use these terms?
A number of papers have been published a long time ago, but still have many citations.
If it is possible, then is it predictable?
Can citations be a suitable measure for judging the useful age of a paper?
Which papers have a longer useful lifetime, or a later expiry date?
Thanks for your inputs.
Relevant answer
Answer
Why is it so important? Good scientists are like poets: they HAVE TO do research and publish (internal motivation). If the only goal is to get citations, it is narcissism, not science. Mendel was forgotten and his findings had to be rediscovered later, but that does not change the fact that he was a great scientist.
  • asked a question related to History and Philosophy of Science
Question
10 answers
Back in my 2nd semester, I still remember those bored faces trying to hide their yawns during lectures on the History of Science. But I found the course unexpectedly interesting. Learning the manner of approach of ancient philosophers and naturalists was quite exciting. Yet it seems to me that most students neglect this valuable subject because their minds are preoccupied with the notion that most of what is taught in this course, such as earlier thinkers' ways of reasoning about the cosmos and the earth, or their perspective on health and medicine, is already apparent to everybody. But what they miss is precisely what they ought to learn: the hard work, the practices, the modes of approach, the determination and dedication shown in days when everything seemed mysterious, when nothing was "apparent".
So what more can we learn from our forefathers? And how can this subject be popularized esp. among youngsters?
Relevant answer
Answer
I think much depends on the professor teaching the subject. It can be boring, like anything else. For me the most interesting aspect is that science is as much a part of culture as politics, economics, philosophy, law, religion or the arts. When studying long-range processes it is very interesting to observe the parallels between these phenomena. For me the most important consequence of studying the history of science was realizing that our current views of the world are but temporary interpretations. That does not mean they are meaningless or purely subjective, but that they are deeply historical, in a permanent state of transition. This makes scientists humbler. Another important aspect is that several things that seem very new have already been invented, sometimes only conceptually. There is a lot to learn from earlier approaches, and a lot of inspiration to be gained from them. In our present rush for impact factors and money we rarely have time to study earlier science. Unfortunately most of us (including myself) do not read ancient languages, and the translations are necessarily interpretations in a modern language. Most of the historians who do master these languages, however, are not (natural) scientists, so they do not necessarily recognize the significance of what they read. There are rare exceptions, however.
  • asked a question related to History and Philosophy of Science
Question
8 answers
Is there a relationship between history of science and philosophy of science?
Relevant answer
Answer
My dear Ourides, too bad for those who cannot understand Portuguese. It was a pleasure to read your answer. I have just finished reading Feyerabend's "Farewell to Reason" (Editora UNESP). I will look for at least some of the books you mention, since of Koyré I only know "From the Closed World to the Infinite Universe", and of Bachelard I have read nothing beyond what Abril published in the collection "Os Pensadores".