
History and Philosophy of Science - Science topic

Explore the latest questions and answers in History and Philosophy of Science, and find History and Philosophy of Science experts.
Questions related to History and Philosophy of Science
  • asked a question related to History and Philosophy of Science
Question
5 answers
In quantum mechanics we learn the famous Uncertainty Principle, which is arguably the most important result in the field.
We also learn in general relativity that space and time form a single entity, spacetime.
The measurement problem in QM is closely tied to the Uncertainty Principle (and vice versa). Why is nothing analogous present in GR, even in a different form?
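For readers less familiar with the notation, the principle referred to is standard textbook material and can be stated as follows (added here for reference):

```latex
% Canonical commutation relation between the position and momentum operators
[\hat{x}, \hat{p}] = i\hbar
% Heisenberg uncertainty relation that follows from it
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

One common way of phrasing the contrast in the question: the bound arises because quantum observables are non-commuting operators, whereas in GR the metric is a classical field whose measurable quantities all commute, so no analogous lower bound appears.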
Thanks
Relevant answer
Dear Florian Millo
You may care to read my just-published article 'How Come the Quantum? A Deeper Principle Behind Quantization' in IJQF (International Journal of Quantum Foundations), also uploaded to ResearchGate. In this article I (arguably) derive quantization from a deeper underlying principle. David Bohm wrote that the "transfer of a quantum is one of the basic events in the universe and cannot be described in terms of other processes". I have described it in terms of other processes. It turns out that the proposed answer (or rather, the theory put forward constituting the answer) in turn suggests answers to many of the interpretative problems of quantum mechanics, including the measurement problem.
Regards,
Mark Kristian van der Pals
  • asked a question related to History and Philosophy of Science
Question
87 answers
Why are numbers and shapes so exact? ‘One’, ‘two’, ‘point’, ‘line’, etc. are all exact. But irrational numbers are not so. The operations on these notions are also intended to be exact. If notions like ‘one’, ‘two’, ‘point’, ‘line’, etc. are defined to be so exact, then it is not by virtue of the exactness of these substantive notions, but instead, due to their being defined so, that they are exact, and mathematics is exact.
But on the other side, due to their being adjectival: ‘being a unity’, ‘being two unities’, ‘being a non-extended shape’, etc., their application-objects are all processes that can obtain these adjectives only in groups. These are pure adjectives, not properties which are composed of many adjectives.
A quality cannot be exact, but may be defined to be exact. It is in terms of the exactness attributed to these notions by definition that the adjectives ‘one’, ‘two’, ‘point’, ‘line’, etc. are exact. This is why the impossibility of fixing these (and other) substantive notions as exact misses our attention.
If in fact these quantitative qualities are inexact due to their pertaining to groups of processual things, then there is justification for the inexactness of irrational numbers, transcendental numbers, etc. too. If numbers and shapes are in fact inexact, then not only irrational and other inexact numbers but all mathematical structures should remain inexact except for their having been defined as exact.
Thus, mathematical structures, in all their detail, are a species of qualities, namely, quantitative qualities. Mathematics is exact only because its fundamental bricks are defined to be so. Hence, mathematics is an as-if exact science, as-if real science. Caution is advised while using it in the sciences as if mathematics were absolutely applicable, as if it were exact.
Bibliography
(1) Gravitational Coalescence Paradox and Cosmogenetic Causality in Quantum Astrophysical Cosmology, 647 pp., Berlin, 2018.
(2) Physics without Metaphysics? Categories of Second Generation Scientific Ontology, 386 pp., Frankfurt, 2015.
(3) Causal Ubiquity in Quantum Physics: A Superluminal and Local-Causal Physical Ontology, 361 pp., Frankfurt, 2014.
(4) Essential Cosmology and Philosophy for All: Gravitational Coalescence Cosmology, 92 pp., KDP Amazon, 2022, 2nd Edition.
(5) Essenzielle Kosmologie und Philosophie für alle: Gravitational-Koaleszenz-Kosmologie, 104 pp., KDP Amazon, 2022, 1st Edition.
  • asked a question related to History and Philosophy of Science
Question
88 answers
CRITERIA TO DIFFERENTIATE BETWEEN VIRTUALS AND EXISTENTS IN SCIENCE
Raphael Neelamkavil, Ph. D., Dr. phil.
Existents are in Extension (each having a finite number of finite-content parts) and in Change (existents, which are always with parts, possessing parts which always exert finite impacts on others, inclusive of exertion of finite impacts on some parts within). Can an existence without parts and exertion of impacts be thought of? Anything that is not in Extension-Change is non-existent.
The Extension-Change kind of existence is what we call Causation, and therefore, every existent is a causal Process in all parts. This is nothing but the Universal Law of Causality. That is, no more do we need to prove causation scientifically. This Law is a pre-scientific and hence physical-ontological Law, meant also for biological existents.
No quantum physics, statistical physics, or quantum cosmology can now declare that certain processes in nature are non-causal or acausal, after having admitted that these processes are in existence!
That is, existents at any level of formation are fully physical, possess at least a minimum of causal connection with others in their environment, and are not merely virtual (nor fully modular / non-local / non-emergent / self-emergent / sui generis in a totally isolated manner). Therefore, any existent must have causal connections with its finitely reachable environment and within its inner parts.
Physical-ontologically real generalities must be about, or pertinent to, existents in groups, i.e., as parts of a type / natural kind. These generalities are not existents, but pure ontological universals in natural kinds.
Space and time are just the measurement-based epistemic notions or versions of the more generally physical-ontological Extension and Change respectively. The latter two are generalities of all existent processes, because nothing can exist without these two Categories.
Hence, space and time are not physical-ontological; they are not real about, or pertinent to, existents. In short, a physical science working only with measuremental space-time cannot verify newly discovered energy wavicles and matter particles by means of the physical "properties" ascribed to them. The reasons are the following.
We can speak not merely of existents but also about their “qualities / universals” and about non-existent “beings” and “properties”. All of them are denotables. Thus, a denotable has reference to something that either (1) has a physical body (physically existent processes), or (2) is inherent in groups of physical processes but are not themselves a physical body (pure universal qualities of all description), or (3) is non-real, non-existent, and hence just a mere notion (e.g., a non-physical possible world with wings, or one with all characteristics – i.e., Extension and Change – absolutely different from the existent physical world).
Denotables of type (1) belong to existent realities, namely, physical processes. They are of matter-energy in content, because Extension-Change determine them to be so. To denotables of type (1) belong also theoretically necessary realities, which are composed theoretically of methodical procedures using properties of existents, which, as a rule, (a) may be proved to be existing (i.e., existent unobservables) or (b) may not be proved to be existing (non-existent unobservables, which are just virtual objects) but are necessary for theory (e.g., potential energy).
To type (2) belong those universals that are never proved to exist but belong to all existents of a group as the general qualities of the members. These are termed ontological universals. The denotables of (1b) are the sub-types that are either fully virtual or partially virtual but are necessary for theory. Both are theoretically useful, but are often mistaken as being existents. Denotables of type (3) are nothing, vacuous. These are pure imaginations without any success in being proved to be in existence.
The difference between non-existent, real, virtual, and existent denotables is this:
Non-existents have no real properties, and generate no ontological commitment to existence via Extension and Change. Real virtuals have the properties that theoretically belong to the denotables that are lacunae in theory, but do not have the Categorial characteristics, namely, Extension and Change. Existent denotables (a) have these Categories (characteristics), (b) generate ontological commitment to existence, and (c) possess also properties that are conglomerations of many ontological universals. All ontological universals are under obedience to Extension and Change.
Hence, virtuals are versions of reality different from those that have been proved to be actual existents. They are generally called unobservables. Some of them are non-existent. When they are proved to exist, they become observables or partial observables and are removed from membership among virtuals. Some partial observables may yet be considered not proved to be existent; these continue to be called unobservable virtuals. Some of them never attain the status of existent observables or existent partial observables. They belong to the group of purely vacuous notions, (3) above.
Theories yield unobservables (electrons, neutrinos, gravitons, Higgs boson, vacuum energy, dark energy, spinors, strings, superstrings …). They may be proved to exist, involving detectable properties.
Note that properties are not physical-ontological (metaphysical) characteristics, which latter I call ontological universals, the two most important of which are the Categories: Extension-Change. Instead of being ontological universals, properties are concatenations of ontological universals.
Virtual unobservables fill the lacunae in theoretical explanations, and most of them do not get proved as existent. Nevertheless, they will continue to be useful virtual worlds for theory from the viewpoint of explanation in a state of affairs where there are no ways of explanation using existent unobservables.
As is clear now, the tool to discover new unobservables is not physical properties of which physical and social sciences speak a lot, but instead, the physical-ontological Categories of Extension and Change.
Mere virtuals are non-existent as such, but are taken as solutions to the lacunae in rational imagination. The sciences and many philosophies of the sciences seem not to differentiate between their denotables in the above manner.
I have spoken of universals here, which may fall in distaste for the minds of physicists, scientists of other disciplines, and even for some philosophers. Please note that I have spoken only of the generalities that we are used to speak of regarding existent types of things. I have not brought out here all my theory about kinds of universals.
My claim in the present discussion is only that properties are also just physical virtuals, if we have the unobservables (say, vacuum energy, dark energy, etc.) behind them not fully steeped in physical existence in terms of EXTENSION and CHANGE through experimentally acceptable proofs of existence.
Do we have a science that has succeeded to accept this challenge? Can the scientists of the future accept these criteria for their discoveries?
  • asked a question related to History and Philosophy of Science
Question
68 answers
SCIENTIFIC METAPHYSICAL CATEGORIES BEYOND HEIDEGGER: ENHANCING PHYSICS
Raphael Neelamkavil, Ph. D., Dr. phil.
1. Introduction beyond Heidegger
I begin my cosmologically metaphysical critique of the foundations of Heidegger's work with a statement of concern. Anyone who attempts to read this work without first reading my arguments in the book Physics without Metaphysics?, (1) without being in favour of a new science-compatible metaphysics and concept of To Be, and (2) without a critical attitude to Heidegger, is liable to misunderstand my arguments here as misinformed, denigrative, or even trivial. But I undertake this critique in search of very general means of constructing a metaphysics capable of providing constant guidance and enhancement to scientific practice.
Contemporary mathematics, physics, cosmology, biology, and the human sciences have, after so much growth, reached a shape in which we cannot think philosophically without admitting the existence (termed "To Be") of all that exist: the cosmos and its parts. The general concept of existence is always of "something-s" that are processually out there, however far-fetched our concepts of the various parts of the cosmos, or of the cosmos as a whole, may be. "The existence of the totality (Reality-in-total) as the whole something whatever" and "particular existence in the minimally acceptable state of being something/s whatever that is not the whole totality" are absolutely trans-subjective and thus objectual presuppositions behind all thought.
Today we do not have to theoretically moot any idea of non-existence of the cosmos and its parts as whatever they are. This is self-evident. That is, basing philosophical thinking – of the very nature of the existence-wise metaphysical presuppositions of all that are subjective and objective – upon the allegedly subjective origin of thought processes and concepts – should be universally unacceptable.
Therefore, I think we should get behind Heidegger’s seemingly metaphysical words – all based on the human stage on which Being is thought – by chipping his prohibitively poetical and mystifying language off its rhetorically Reality-adumbrating shades, in order to get at the senses and implications of his Fundamental Ontology as Being-historical Thinking. It suffices here to admit that the history of Being is not the general concept of the history of the thought of Being, and not the history of the thought of Being.
Moreover, it is not a necessity for philosophy that the Humean-Kantian stress on the subject-aspect of thought be carried forward to such an extent that whatever is thought has merely subjectively metaphysical Ideal presuppositions, and that all subjective presuppositions must somehow be taken to possess a merely subjective character.
There are, of course, presuppositions with some conceptual character. But to the extent some of them are absolute, they are to be taken as absolutely non-subjective. These presuppositions are applicable without exception to all that is, e.g. To Be and all Categories that may be attributed to all that exist. HENCE, SUBJECTIVE PRESUPPOSITIONS ARE NOT A SUBSTITUTE FOR CONCEPTUAL PRESUPPOSITIONS.
This fact should be borne out while doing philosophy, without which no philosophy and science are possible. The weight of the subject-aspect continues to be true of thought insofar as we go to non-absolute details of metaphysical presuppositions and empirical details, and not when we think only of the metaphysical Ideals of all existents in themselves.
It is true that there is no complete chipping off of the merely subjective or anthropological aspect of the Heideggerian theory. Nor is there an analysis without already interpreting anything. The guiding differentiation here should be that between “the subjective” and the “conceptual”. The conceptual is not merely subjective, but also objective. It is objective due to the inheritance pattern behind it from the objectual.
Such a hermeneutic is basic to all understanding, speculation, feeling, and sensing. The linguistically and otherwise symbolic expression of concepts and their concatenations is to be termed as the denotative universals and their concatenations.
At the purely conceptual level we have connotation: purely conceptual universals and their concatenations. These are not merely a production of the mind; they arise primarily through the data generated from a small selection of phenomena drawn from physical processes, which in turn come from a highly selected group of levels of objectual processes, which belong to the things themselves.
At the level of the phenomena, levels of objectual processes, and the things themselves there are universals, which we shall term ontological universals and their conglomerations. These conglomerations are termed so because they have the objectual content at the highest level available within the processes of sensing, feeling, understanding, speculation, etc.
2. Conclusions on Heidegger Proper
The above should not necessarily mean (1) that we cannot base thought fully on the Metaphysical Ideals of “To Be” and “the state of existents as somethings”, and (2) that we cannot get sufficiently deep into the fundamental implications of his work by side-lining the purely subjective concepts of the fundamental metaphysical concepts. This claim is most true of the concept of To Be.
To Be is the simultaneously processual-verbal and nomic-nominal aspect of Reality-in-total, and not merely that of any specific being, phenomenon, or concept. For Heidegger, To Be (Being) is somehow a private property of Dasein, the Being-thinking being. To Be which is the most proper subject matter of Einaic Ontology (metaphysics based completely on the trans-thought fact of the Einai, “To Be” of Reality-in-total) is not the Being that Dasein thinks or the Being that is given in Dasein, because To Be belongs to Reality-in-total together and in all its parts.
Even in Heidegger’s later phase highlighted best by his Contributions to Philosophy: From Enowning, his concept of To Be as belonging to the Dasein which is the authentically Being-thinking human being has not changed substantially. Even here he continues to project positively the history of Being-thinking human being as the authentic Being-historical process and as the essence of the history of all that can be thought of.
Against the above metaphysical backdrop of essentially anthropocentric definitions, I write this critique based on cosmological-metaphysical necessities in philosophy, and indirectly evaluate what I consider as the major ontological imperfection in Heidegger’s thought from the viewpoint of the Categorial demands of the history of metaphysics, various provincial ontologies and scientific ontology, and of the way in which I conceive the jolts and peaks in such history.
Along with the purely meta-metaphysical To Be, (1) I present the metaphysical abstract notions of Extension (= compositeness, i.e., having parts) and Change (= impacts by composites, i.e., part-to-part projection of impact elements) as the irreducibly metaphysical Categories of all existents, and (2) argue that Extension-Change existence in its non-abstract togetherness as existents is nothing but Universal Causation (= everything is Extension-Change-wise existent; if not universally causal, existence is vacuous).
These are metaphysical principles whose primordiality Heidegger and most philosophers to this day have not recognized. Most of them tend to ascribe to existence universal, partial, or no causality at all. In short, Universal Causation, even in allegedly non-causal aspects of cosmology, quantum physics, philosophy of mind, and the human sciences, is to be taken as a priori and as co-implied by existence (To Be), because anything existent is extended and changing! No more should the sciences or philosophy doubt Universal Causality. Herein consists the merit of Einaic Ontology as a universally acceptable metaphysics behind all sciences, not merely the human sciences.
To Be is the highest Transcendental Ideal; Reality-in-total is the highest Transcendent Ideal; and Reality-in-general is the highest Transcendental-Transcendent Ideal of generalized theoretical concatenation of ontological universals in consciousness. These are meta-metaphysical in shape. They are not at all classificational (categorizing) of anything in this world or in thought.
Although Heidegger has not given a Categorial scheme of all existents or Categorial Ideals for all metaphysics and thinking, he is one of the few twentieth century thinkers of ontological consequence, after Aristotle (in favour of an abstract concept of Being) and Kant (against treating the concept of Being as an attribute), to have dealt extensively with a very special concept of Being and our already interpretive ability to get at To Be.
I present here in gist the difference between the Dasein-Interpreted concept of Being and the ontologically most widely committed, Einaic Ontological, nomic-nominal, and processual-verbal concept of To Be, which should be metaphysically the highest out-there presupposition of all thought and existence. This is the relevance of metaphysics as a trans-science.
  • asked a question related to History and Philosophy of Science
Question
14 answers
Dear Readers,
This is the latest news from my ongoing work to help bring the arts into better communication with the sciences. Have you studied Poe's texts that may be considered proto-science fiction (written before the term existed)? Please share your work here, as well.
Wolf Forrest will speak at our Hard-Sci SF Zoom group on Jan. 6th and we will then publish that talk on YouTube. DETAILS AS ATTACHMENT.
MORE on Hard-Science SF Zoom project:
We have a Zoom group called The Hard-Science Science Fiction Group. If anybody wishes to be part of this group project or "lab", please let me know via message. I will add you to the list, because I don't always remember to post new talks or plays here. (Too busy.)
We also are involved with the Planet Zoom players, who record Zoom play adaptations of classic SF and then publish these on YouTube. (Many good works were never adapted to film or TV.)
Please add your thoughts on this. Sometimes we in the US miss what other countries and linguistic groups think of Poe.
Relevant answer
Gloria Lee Mcmillan – Interesting collection of ideas! It seems to me the problem is in the question, not the answer! Indeed, why Poe? What fascinates me more about Poe is his belittlement by the English-speaking mainstream academia and his sensational impact on French thought, literature and music. Ravel rated him as one of his three teachers.
How do you pull together a category labelled "science fiction"?
At one extreme you have, essentially, cowboys and injuns in space. The science in this kind of fiction rarely strays from upgraded versions of already-familiar things. Wars in Star Wars are fought by ground troops armed with inaccurate rifles or by pilots time-snatched from the Battle of Britain. And Star Trek uses the trope to play out human, social and often ethical dramas, with a clear intent of presenting us with a morality play. Though add in Michelle Yeoh and I'll happily watch it!
At the other end is Connie Willis, whose only science-fiction element is time travel and whose writing stands with the best of contemporary writing, regardless of genre.
At a third pole is writing that extrapolates trends in science to imagine their societal consequences. This is Asimov territory, whose fascination with the social and moral implications of robots has proved prescient. Not to mention the powerful ambiguities of Do Androids dream…
And what of dystopian literature? Extrapolation of social and environmental trends? The Handmaid's Tale? The Road? Riddley Walker? A Canticle for Leibowitz?
Is all of this science fiction? Yes, I say. But what sort of definition encompasses them?
  • asked a question related to History and Philosophy of Science
Question
197 answers
Einstein is one of the greatest and most admired physicists of all times. Einstein's general theory of relativity is one of the most beautiful theories in physics. However, every theory in physics has its limitations, and that should also be expected for Einstein's theory of gravity: A possible problem on small length scales is signaled by 90 years of unwavering resistance of general relativity to quantization, and a possible problem on the largest length scales is indicated by the present search for "dark energy" to explain the accelerated expansion of the universe within general relativity.
Why, then, is the curvature of spacetime so generally accepted as an ultimate truth, as the decisive origin of gravitation, both by physicists and philosophers? This seems to be a fashionable but unreflected metaphysical assumption to me.
Are there alternative theories of gravity? There are plenty of alternatives. As a consequence of the equivalence of inertial and gravitational mass, they typically involve geometry. The most natural option seems to be a gauge field theory of the Yang-Mills type with Lorentz symmetry group, which offers a unified description of all fundamental interactions and a most promising route to quantization.
I feel that metaphysical assumptions should always be justified and questioned (rather than unreflected and fashionable). How can such a healthy attitude be awakened in the context of the curvature of spacetime?
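For concreteness, the "curvature as the origin of gravitation" assumption under discussion is encoded in Einstein's field equations (standard form, added here for reference):

```latex
% Einstein field equations: curvature (left-hand side) sourced by energy-momentum (right)
G_{\mu\nu} + \Lambda g_{\mu\nu} \;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

Gauge-theoretic alternatives of the kind mentioned above replace the postulate of curved spacetime with a Yang-Mills-type field strength for the Lorentz group, while aiming to reproduce the same observable predictions in the tested regimes.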
Research areas: Theoretical Physics, Philosophy of Science, Gravitation, General Relativity, Metaphysics
Relevant answer
I don't feel that it is helpful to state that one group of famous people (G1) has demonstrated that the approach of another group of famous people (G2) is definitely and totally wrong. We need open-minded, critical and respectful discussions rather than absolute claims about ultimate truths (which are always questionable). I don't intend to become a follower of G1 instead of G2 or any other Gj. All these people lived centuries ago, and we now have so many new experimental observations, theoretical ideas and powerful tools at our fingertips. I am convinced that each of those famous people, if they lived today, would critically rethink their own previous ideas in the light of these fantastic new resources -- this is how they would set a shining example for us by not following any dogmatism of any group.
  • asked a question related to History and Philosophy of Science
Question
127 answers
Studying the various philosophers, even the contemporary thinkers, is a matter of study and analysis. Whatever our stage of development, such study and analysis can only educate us in the strict sense. Thinking for ourselves is also part of the process, and it should carry greater weight once the educative phase has gone on long enough.
Now what about forgetting for some time the contributions of the many philosophers of our time or of the past, especially the kind whom we all mention habitually, and then theorizing philosophically for ourselves without constant references to their works and notions, as doctoral students do?
Why do I suggest this? Such dependence on the works of the stalwarts and of the specialists on them may veil our abilities to see many things for ourselves. Thus, we can avoid becoming philosophical technicians and even the slightly better case of becoming philosophical technologists or philosophical experts.
I believe that synthesizing upon some good insights from the many thinkers and from the many disciplines would require also the inevitable conceptual foundations that we would be able to discover beyond these notions.
Suppose each of us looks for such foundations and then shares them on a platform. If the discussion is on these new foundations, something may emerge in each of us as what we could term genuine foundations. These need not remain forever, because philosophy and science should grow out of whatever we and others have done. But, as a result of the effort, we will have achieved a better synthesis through such personal efforts than we would have without seeking foundations.
I think the conceptual foundations on which the concept of synthetic philosophy works may thus gain a lot. I for one consider the whole history of analytic and linguistic philosophy as lacking such rigour. You all may differ from what each one of us suggests. That is the manner in which deeper foundations can be sought. I am on such a journey.
I believe that in the journey to find deeper and more general foundations than those available, we will already have created a manner of doing philosophy independently, and if done in conjunction with the sciences, we will have a new manner of doing the philosophy of science. Fell trees from their roots, and we have the place to plant a new tree.
Let me suggest a question. In all these 2.5 millennia of Western philosophy, the question of the implications of existence (to exist, To Be) has hardly been discussed. Plato and Aristotle attempted it, and thereafter we do not see much on the implications of To Be. Now if some implications of To Be are found, these could be a strong foundation for philosophy of any kind. Presumably we cannot find such implications of Non-existence for doing philosophy or science. The definitions of the implications of To Be will change in the course of time, but some core might continue to remain, if we do something validly deep and general enough.
Let me suggest an interesting manner in which many philosophers evaluate their peers. (This may also be applicable in all other fields.) This is here brought to a historical context, not merely theoretical. This I do in order to make the example very clear.
Suppose you (say, A) speak of space, time, entities, matter-energy, etc. in a special context. The peer (say, B) gets hold of the text and starts criticizing A’s notion of space, time, entities, matter-energy, etc. B starts from the concepts of space and time. He says, Kant and thereafter almost all thinkers have placed space and time merely as epistemic categories. This has been done in the context of phenomena. If you (A) hold the epistemic variety of notions of space and time, then they are phenomenal. In that case, you should have studied in the text what phenomena meant in Kant and analyze the scientific and philosophical consequences of those concepts.
B continues. If you wanted to make space and time metaphysical concepts, then you are speaking of the noumena. For Kant these are unknowables. Hence, you need to first show that the noumena are knowables. In that case you are rightful in suggesting epistemic / epistemological concepts of space and time. If not, you need to take recourse to other relevant philosophers or scientific disciplines to demonstrate the metaphysical meaning of space and time that you have introduced. And so on.
Absolute dependence upon the traditions and unpreparedness to think differently from past or present thinkers is what is exhibited here. Not that B is not intelligent enough; B is. But the preparedness to think differently for years and decades comes not merely from the desire to think differently, but from the desire to SOLVE ALL THE PROBLEMS OF THE WORLD TOGETHER. We know we are being overambitious. If we demonstrate such an attitude in our behaviour towards others, it is due to an intellectual sense of preponderance. But if we remain receptive to all new inputs from all others and all the sciences, we will continue to be enabled to persevere in methodological overambitiousness.
The peer had already decided how the author should write. It seems the author should have written a separate book, or a separate part of the book, on every sub-theme within the title...! Or should he have cited all sorts of authors on all possible sub-themes in his book in order to be approved by the peer?
Yet another systematically dominative and other-debilitating manner of peers is this: say I submit a book to a publisher. The publisher sends it to the peer(s). Without even taking time for a good reading of the text, the peer suggests some opinions to the publisher, which the publisher relays to the author in a day or two: Your work may be very good, but its title is too broad; an author cannot do justice to the whole breadth of the subject matter!
Have you heard or read psychologists, neuroscientists, medical doctors, etc. discussing symptoms and their causes? A book in psychology says: ‘According to the bio-psycho-social approach in psychopathology, one mental disturbance CAN have many causes.’ But a person trained in and enthusiastic about philosophy (including the philosophy of the sciences) would wonder why there should not be many causes, at least some of which one could seek to find...! Discovering ‘only the immediate, exact, and unique cause’ is not their work, because reason tells us that nothing in this world has one exact cause.
This directs our attention to a basic nature of philosophy: Not that a philosopher should only generalize. But a philosopher should study any specific thing only in terms of the most generalizable notions. Here ‘generality’ does not directly indicate only abstraction. It demonstrates the viewpoint that philosophy always takes. Hence, speaking only of the linguistic formulation of notions and arguments, formulating arguments only of life-related events in order to prove general principles that belong to the whole of Reality, etc. are not philosophical. The philosophically trained reader can recognize which recent trends in philosophy I have in mind here.
I may be talking strange things here, especially for those trained mainly in analytic philosophy and the philosophy of science in a narrow manner. If you do not find such suggestions interesting, just ignore this intervention. I continue to work on this. I do have some success. Each of us has our own manner of approaching the problems.
I am aware that I may be laughed at. Since I have left the teaching profession, I do not lose much. Moreover, getting great publishers is out of reach for me, but that too does not amount to much if one eventually succeeds in doing something solid.
Relevant answer
Answer
Dear colleagues, I agree with you.
Everyone has the right to have their own philosophy and to form their own picture of the world. However, in this case we should divide all philosophers into four groups:
- everyday philosophers (all our friends, colleagues and close people);
- philosophers "due to circumstances" (philosophers of disciplinary branches of science);
- intellectual philosophers (philosophers who are attracted by the possibility of constructing intellectual hypotheses of amazing beauty);
- and professional philosophers (compilers of general contexts of cognition of the world).
The answer to the question – what does the world really look like? – is of fundamental interest only to a small group of professional philosophers (compilers of general contexts of cognition of the world). I am sure that a person cannot ask a question that he cannot answer. Therefore, the question – what is the world? – already points to the world's uniqueness. However, there may be many interpretations from the other groups of philosophers. For example, some philosopher-physicists assume the existence of a plurality of worlds. A beautiful worldview picture! But this picture does not negate the question of what the single world looks like within which the many worlds are located. Probably, Plato's statement that a professional philosopher should lead the state should be considered in this context. Probably only a philosopher can organize the governance of the state, by bringing the philosophical statements of the four groups of philosophers (members of society) under a universal law that determines the unity of the world. Therefore, we can look at the world and admire its objects and processes, as well as our thoughts about the world. But we must definitely see one and only one world (objects, processes and our thoughts about them, through the prism of systems transdisciplinary models of universal order, which encode each object, process and thought).
In conclusion, it is important to note the increased responsibility of the professional philosopher. The professional philosopher's area of responsibility should not be limited to the formation of a picture of the world. It should extend to the following chain of actions: description of the picture of the world; creation of an appropriate concept (basic judgments and axioms); creation of a methodology (models of cognition of the world and of rethinking its high-threshold problems); description of technological ideas (ways to solve high-threshold problems of nature and society); and creation of a method for analyzing the risks arising from implementing the proposed concepts, methodology and technological ideas (the security of the world and society). Simply put, during changing scenarios of the world order and socio-economic crises, professional philosophers, not the professional military, should speak.
You can read about the picture of the unified world and the ways of its cognition in these articles:
Mokiy, V.S., Lukyanova, T.A. (2022). Manifesto for systems transdisciplinarity (2023-2030). Universum: Social sciences, 9(88). https://7universum.com/ru/social/archive/item/14313
Mokiy V.S, Lukyanova T.A. (2022). Prospects of integrating transdisciplinarity and systems thinking in the historical framework of various socio-cultural contexts. Transdisciplinary Journal of Engineering and Science, 13. pp. 143-158. https://doi.org/10.22545/2022/00184
Mokiy, V. S., & Lukyanova, T. A. (2022). Modern transdisciplinarity: Results of the development of the prime cause and initial ideas. Issues in Informing Science and Information Technology, 19, pp. 97-120. https://doi.org/10.28945/4951
Mokiy, V.S., & Lukyanova, T.A. (2022). Sustainable development of nature and society in the context of a systems transdisciplinary paradigm. Transdisciplinary Journal of Engineering & Science, 13, Special Issue on Complex Resilience and Sustainability. Transdisciplinary Perspectives, In G. del Cerro Santamaría (Ed.), 15-35. https://doi.org/10.22545/2022/00192
  • asked a question related to History and Philosophy of Science
Question
23 answers
The Nobel Prize Summit 2023: Truth, Trust and Hope has started today, 24 May 2023. The summit encourages participation. Thus, I have sent an open letter and eagerly anticipate their response. Please comment on whether the points I have made are adequate.
Open Letter to The Nobel Committee for Physics
Is There a Nobel Prize for Metaphysics?
Dear Nobel Committee for Physics,
Among the differences between an established religion, such as Roman Catholicism, and science is the presence in the former of a hierarchical organization for defending its creed and conducting its affairs. The head of the religious institution ultimately bears responsibility for the veracity of its claims and strategic policies. This accountability was invoked by historical figures like John Wycliffe, Jan Hus, and Martin Luther, who held the papacy responsible for wrong doctrines and practices, such as the indulgence scandal of the late Middle Ages. In that context, challenging such doctrines, albeit with the anticipated risk of being burned at the stake, involved posting opposing theses on the doors of churches.
In contrast, the scientific endeavour lacks a tangible temple, and no definitive organization exists to be held accountable for possible misconduct. Science is a collective effort by scientists and scientific institutes to discover new facts within and beyond our current understanding. While scientists may occasionally flirt with science fiction, they ultimately make significant leaps in understanding the universe. However, problems arise when a branch of science is held and defended as a sacred dogma, disregarding principles such as falsifiability. This mentality can lead to a rule of pseudo-scientific oppression, similar to historical instances like the Galileo or Lysenko affairs. Within this realm, there is little chance of liberating science from science fiction. Any criticism is met with ridicule, damnation, and exclusion, reminiscent of the attitudes displayed by arrogant religious establishments during the medieval period. Unfortunately, it seems that the scientific establishment has not learned from these lessons and has failed to provide a process for dealing with these unfortunate and embarrassing scenarios. On the contrary, it is preoccupied with praising and celebrating its achievements while stubbornly closing its ears to sincere critical voices.
Allow me to illustrate my concerns through the lens of relativistic physics, a subject that has captured my interest. Initially, I was filled with excitement, recognizing the great challenges and intellectual richness that lay before me. However, as I delved deeper, I encountered several perplexing issues with no satisfactory answers provided by physicists. While the majority accepts relativity as it stands, what if one does not accept the various inherent paradoxes and seeks a deeper insight?
Gradually, I discovered that certain scientific steps are not taken correctly in this branch of science. For example, we place our trust in scientists to conduct proper analyses of experiments. Yet, I stumbled upon evidence suggesting that this trust may have been misplaced in the case of a renowned experiment that played a pivotal role in heralding relativistic physics. If this claim is indeed valid, it represents a grave concern and a significant scandal for the scientific community. To clarify my points, I wrote reports and raised my concerns. Fortunately, there are still venues outside established institutions where critical perspectives are not yet suppressed. However, the reactions I received ranged from silence to condescending remarks infused with irritation. I was met with statements like "everything has been proven many times over, what are you talking about?" or "go and find your mistake yourself." Instead of responding to my pointed questions and concerns, a professor even suggested that I should broaden my knowledge by studying various other subjects.
While we may excuse the inability of poor, uneducated peasants in the Middle Ages to scrutinize the veracity of the Church's doctrine against the Latin Bible, there is no excuse for professors of physics and mathematics to be unwilling to re-evaluate the analysis of an experiment and either refute the criticism or acknowledge an error. It raises suspicions about the reliability of science itself if, for over 125 years, the famous Michelson-Morley experiment has not been subjected to rigorous and accurate analysis.
Furthermore, I am deeply concerned that the problem has been exacerbated by certain physicists rediscovering the power and benefits of metaphysics. They have proudly replaced real experiments with thought experiments conducted with thought-equipment. Consequently, theoretical physicists find themselves compelled to shut the door on genuine scientific criticism of their enigmatic activities. Simply put, the acceptance of experiment-free science has been the root cause of all these wrongdoings.
To demonstrate the consequences of this damaging trend, I will briefly mention two more complications among many others:
1. Scientists commonly represent time with the letter 't', assuming it has dimension T, and confidently perform mathematical calculations based on this assumption. However, when it comes to relativistic physics, time is represented as 'ct' with dimension L, and any brave individual questioning this inconsistency is shunned from scientific circles and excluded from canonical publications.
2. Even after approximately 120 years, eminent physicist and Nobel Prize laureate Richard Feynman, along with various professors in highly regarded physics departments, have failed to mathematically prove what Einstein claimed in his 1905 paper. They merely copy from one another, seemingly engaged in a damage limitation exercise, producing so-called approximate results. I invite you to refer to the linked document for a detailed explanation:
I am now submitting this letter to the Nobel Committee for Physics, confident that the committee, having awarded Nobel Prizes related to relativistic physics, possesses convincing scientific answers to the specific dilemmas mentioned herein.
Yours sincerely,
Ziaedin Shafiei
Relevant answer
Answer
I looked at the link you gave.
In that link I found the statement:
Einstein claimed that “If a unit electric point charge is in motion in an electromagnetic field, the force acting upon it is equal to the electric force which is present at the locality of the charge, and which we ascertain by transformation of the field to a system of co-ordinates at rest relatively to the electrical charge.”
I also get from the above link that you have a disagreement with the above statement. I think the confusion here is about which observer is defining the force. The electromagnetic field as transformed to coordinates at rest relative to the charge is the field needed to predict the force as seen by an observer at rest with the charge (an electric force but no magnetic force, because the charge is not moving). Field transformations to other coordinate systems are needed to predict the force as seen by observers moving relative to the charge. This means that different observers (having different motions relative to each other) can see different forces even if all coordinate systems are inertial.

This is in contrast to Newtonian mechanics, in which the same force is seen in all inertial coordinate systems. Newtonian mechanics is wrong when applied to electromagnetic forces, so we need to include things like field energy or field momentum (outside the scope of Newtonian mechanics) to obtain conservation laws.

However, I think that your complaint is not that Newtonian mechanics should be used when it isn't, but rather that special relativity is wrong. Special relativity does have limitations (when general relativity becomes an issue), but for its intended applications (i.e., when general relativity is not needed) it has done a great job of producing all of today's modern technology derived from it. In particular, the treatment of electromagnetic forces in the context of special relativity is one of the most thoroughly studied of all topics in physics. If there were a real incompatibility between special relativity and electromagnetism, we would have known about it a long time ago. We would have known about it in the days when special relativity was first introduced and faced a lot of opposition, when a lot of people searched very hard to find inconsistencies in the theory.
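For concreteness, the field transformation referred to above can be written out. This is the standard textbook Lorentz transformation of the electric and magnetic fields for a boost with velocity v, decomposed parallel and perpendicular to v (added here for reference; it is not quoted from the linked document):

```latex
\mathbf{E}'_{\parallel} = \mathbf{E}_{\parallel}, \qquad
\mathbf{B}'_{\parallel} = \mathbf{B}_{\parallel}, \qquad
\mathbf{E}'_{\perp} = \gamma\,\bigl(\mathbf{E} + \mathbf{v}\times\mathbf{B}\bigr)_{\perp}, \qquad
\mathbf{B}'_{\perp} = \gamma\left(\mathbf{B} - \frac{\mathbf{v}\times\mathbf{E}}{c^{2}}\right)_{\perp},
\qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

In the frame momentarily at rest with the charge the magnetic part of the Lorentz force q(E + v×B) vanishes, since v = 0 there; only the transformed electric field E' acts, which is exactly the situation described in the quoted sentence.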
The theory survived attacks by brilliant people searching for problems with the theory, and it will survive attacks by people that perceive it to be wrong because of their own lack of understanding.
  • asked a question related to History and Philosophy of Science
Question
82 answers
Anything exists non-vacuously as in Extension (having parts, each of which again is extended) and in Change (extended existents impacting some other extended existents). Anything without these two characteristics cannot exist. THESE ARE THE TWO, EXHAUSTIVE, IMPLICATIONS OF "TO BE" AND "EXISTENTS".
If not in Change, how can something exist in Extension (= in the state of Extension) alone? And if not in Extension, how can something exist in the state of Change alone? These are impossible. ((The traditional interpretations of Parmenides and Heraclitus as emphasizing merely one of these is unacceptable.)) Hence, Extension-Change are two fundamentally physical-ontological Categories of all existence.
But Extension-Change-wise existence is what we understand as Causality: extended existents and their parts exert impacts on other extended existents. Every part of existents does it. This is not the meaning of Change alone, but in Extension-Change! That is, if everything exists, everything is in Causation. This is the principle of Universal Causality...! All counterfactual imaginations need not yield really existent worlds of this kind.
Even the allegedly “non-causal” quantum-mechanical constituent processes are mathematically and statistically circumscribed, measuremental, concepts from the results of experiments upon Extended-Changing existents; and ipso facto the realities behind these statistical measurements are in Extension-Change if they are physically existent.
Space is the measured shape of Extension; time is that of Change. Therefore, space and time are merely epistemic categories. How then can statistical causality be causality at all? Bayesians should now re-interpret their results in terms of Universal Causality, as mere measuremental extent of our determination of the exact causes of some events.
No part of an existent is non-extended and non-changing. One unit of cause and effect may be called a process. Every existent and its parts are fully processual -- in the sense that every part of it is further constituted by sub-processes. None of these sub-processes is infinitesimal. Each is near-infinitesimal in Extension and Change.
Thus, Extension and Change are the very exhaustive meanings of To Be, and hence I call them the highest Categories of metaphysics, physical ontology, the sciences, etc. Science and philosophy must obey these two Categories if they deal with existent processes, and not merely of imaginary counterfactual worlds.
In short, everything existent is causal. Hence, Universal Causality as the highest pre-scientific Law, second only to Existence / To Be. To Be is not a law; it is the very reason for existence of anything...!
Natural laws are merely derivative from Universal Causality. If any natural law disobeys Universal Causality, it is not a scientific law. Since Extension-Change-wise existence is the same as Universal Causality, scientific laws are derived from Universal Causality, and not vice versa.
Today the sciences attempt to derive causality from the various scientific laws! This is merely because, for millennia, we have been getting fooled about such a fundamental meaning of Causality. We were told that causality can be proved only empirically. The folly here has been that what is specific is universalized: the Fallacy of Conceptual / Theoretical Wholes and Parts. The search for the causes of a few events has been misinterpreted as capable of defining the search for the causal or non-causal nature of all...! IS THIS NOT ENOUGH PROOF OF THE INFANCY IN WHICH THE FOUNDATIONS OF SCIENTIFIC AND PHILOSOPHICAL PRINCIPLES FIND THEMSELVES?
The relevance of metaphysics / physical ontology for the sciences is clear from the above. Lack of such a metaphysical foundation has marred the effectivity of the sciences and of course of philosophy as such. RECOLLECT THE ERA IN THE TWENTIETH CENTURY WHEN CAUSALITY WAS CONVERTED TO CAUSAL EXPLANATIONS.........
Existents have some Activity and Stability. This is a fully physical fact. These two categories may be shown to be subservient to Extension-Change. Pure vacuum (non-existence) is the absence of Activity and Stability. Thus, entities are irreducibly active-stable processes in Extension-Change. Physical entities / processes possess finite Activity and Stability. Activity and Stability together belong to Extension; and Activity and Stability together belong to Change too.
That is, Stability is not merely about space; and Activity is not merely about time. But the traditions (in both the sciences and philosophy) still seem to hold so. We consider Activity and Stability as sub-categories, because they are based on Extension-Change, which together add up to Universal Causality; and each unit of cause and effect is a process.
These are not Categories belonging merely to imaginary counterfactual situations. The Categories of Extension-Change and their sub-formulations are all about existents. There can be counterfactuals that signify cases appertaining to existent processes. But separating these cases from useless logical talk is near to impossible in the linguistic-analytically and denotatively active definitions of reference in logic, philosophy, the philosophy of science, and the sciences. THE FAD NATURE OF THE PHILOSOPHIES OF FREGE, WITTGENSTEIN, THE VIENNA CIRCLE, AND THEIR FOLLOWERS TODAY FOLLOWS FROM THIS.
Today physics and the various sciences do something like this in that they indulge in particularistically defined terms and procedures, blindly thinking that these can directly denotatively represent the physical processes under inquiry.
Concerning mathematical applications too this is the majority attitude among scientists. Hence, without a very general physical ontology of Categories that are applicable to all existent processes, all sciences are in gross handicap. THIS IS A GENERAL INDICATION FOR THE DIRECTION OF QUALITATIVE GROWTH IN THE SCIENCES AND PHILOSOPHY.
The best examples are mathematical continuity and discreteness being attributed to physical processes IN THE MATHEMATICALLY INSTRUMENTALIZED SCIENCES. Mathematical continuity and discreteness are to be anathema in the sciences. Existent processes are continuous and discrete only in their Causality.
This is nothing but Extension-Change-wise discrete causal continuity. At any time, causality is present in anything, hence there is causal continuity. But this is different from mathematical continuity and discreteness.
Relevant answer
Answer
Universal causality (1) and existential being (2) are in dynamic interplay, with extra-temporal causality as an a priori assumption; our statistical observations can only catch temporal conditionalities as a posteriori phenomena when creating scientific models of reality, which may work and serve us somehow in coping with reality. J. von Neumann opined that our scientific models cannot explain reality; this explanatory gap remains the domain of philosophy (theology included), where time-tested wisdom rules over the tradable commodities of data, information and knowledge.
Shallow men believe in luck or in circumstance. Strong men believe in cause and effect. R.W. Emerson
  • asked a question related to History and Philosophy of Science
Question
20 answers
One of the central themes in the philosophy of formal sciences (or mathematics) is the debate between realism (sometimes misnamed Platonism) and nominalism (also called "anti-realism"), which has different versions.
In my opinion, what is decisive in this regard is the position adopted on the question of whether objects postulated by the theories of the formal sciences (such as the arithmetic of natural numbers) have some mode of existence independently of the language that we humans use to refer to them; that is, independently of linguistic representations and theories. The affirmative answer assumes that things like numbers or the golden ratio are genuine discoveries, while the negative one understands that numbers are not discoveries but human inventions, they are not entities but mere referents of a language whose postulation has been useful for various purposes.
However, it does not occur to me how an anti-realist or nominalist position can respond to these two realist arguments in philosophy of mathematics: first, if numbers have no existence independently of language, how can one explain the metaphysical difference, which we call numerical, at a time before the existence of humans in which at t0 there was in a certain space-time region what we call two dinosaurs and then at t1 what we call three dinosaurs? That seems to be a real metaphysical difference in the sense in which we use the word "numerical", and it does not even require human language, which suggests that number, quantities, etc., seem to be included in the very idea of ​​an individual entity.
Secondly, if the so-called golden ratio (also represented as the golden number and related to the Fibonacci sequence) is a human invention, how can it be explained that this relationship exists in various manifestations of nature such as the shell of certain mollusks, the florets of sunflowers, waves, the structure of galaxies, the spiral of DNA, etc.? That seems to be a discovery and not an invention, a genuine mathematical discovery. And if it is, it seems something like a universal of which those examples are particular cases, perhaps in a Platonic-like sense, which seems to suggest that mathematical entities express characteristics of the spatio-temporal world. However, this form of mathematical realism does not seem compatible with the version that maintains that the entities that mathematical theories talk about exist outside of spacetime. That is to say, if mathematical objects bear to physical and natural objects the relationship that the golden ratio bears to those mentioned, then it seems that there must be a true geometry and that, ultimately, mathematical entities are not as far out of space-time as has been suggested. After all, not everything that exists in spacetime has to be material, as the social sciences well know, that refer to norms, values or attitudes that are not. (I apologize for using a translator. Thank you.)
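As a side note, the connection between the Fibonacci sequence and the golden ratio invoked above is itself a precise mathematical fact: the ratios of consecutive Fibonacci numbers converge to φ = (1 + √5)/2. A minimal numerical sketch (illustrative only; it of course decides nothing between realism and nominalism):

```python
# Ratios of consecutive Fibonacci numbers converge to the golden ratio
# phi = (1 + sqrt(5)) / 2 ≈ 1.6180339887...
def fib_ratios(n):
    """Return the first n ratios F(k+1)/F(k) of the Fibonacci sequence."""
    a, b = 1, 1
    ratios = []
    for _ in range(n):
        a, b = b, a + b
        ratios.append(b / a)
    return ratios

phi = (1 + 5 ** 0.5) / 2
# After 30 terms the ratio already agrees with phi to ~13 decimal places
print(abs(fib_ratios(30)[-1] - phi) < 1e-10)  # True
```

The convergence is geometric (the error shrinks roughly by a factor of φ² per step), which is why the ratio stabilizes so quickly.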
Relevant answer
Answer
Indeed, that is a possibility. Perhaps what we call numbers are labels in a language, a kind of names that do not really name anything literally beyond human language and representations, or a way of referring to the systems, scales, etc. of which they are a part, mere nodes of a conceptual structure. Some authors have argued that numbers are only signs, signs that are part of representational and notational systems which have proven to be effective, useful instruments to be applied to parts of reality, and which are improved and refined over time. However, I believe it is necessary to take into account the fact that not every system, model or scale works, and this perhaps reveals that there are structural characteristics of the reality to which they are applied that impose limits, that constrain what can work and what cannot; and this perhaps means that, although they do not literally describe abstract entities (numbers or geometric figures, for example) as we imagine them, mathematical systems and theories somehow express that which is beyond the representations themselves. You can't use just any geometry to build a house or to explain why Mercury "wobbles" when it's at perihelion, and that suggests that mathematical systems, mathematized theories and models are human creations but cannot be totally arbitrary, so that, even in a metaphorical or indirect way, it should not be ruled out that they represent structural characteristics of the world to which they are applied, a world beyond human constructs.
We must not forget that we humans perceive in three dimensions, hear less well than dogs, and believe that colors are in things, and that, to an important extent, we elaborate our theories and build our image of the world accordingly ("the human is the measure of all things", said Protagoras). But there seems to be more and more evidence that at least the macroscopic physical world is not three-dimensional, so we may never really know what lies beyond us and our representations; and to that mystery we must add that of why some mathematical models and theories work and others do not. Greetings.
  • asked a question related to History and Philosophy of Science
Question
5 answers
Is there a historical map of academic disciplines? What is the trend of change in academic disciplines (the number, nature and labels of disciplines)?
I will be thankful if someone can point me to any article, book, handbook or report about a historical map of disciplines or the history of academic disciplines.
Relevant answer
Answer
Hello,
Yes, a historical mapping of academic disciplines now exists: the "Interactive Historical Atlas of the Disciplines". This website, recently launched at the University of Geneva, is available in open access here:
It is an interactive atlas containing a collection of more than 200 disciplinary maps ("classifications of the sciences" or "knowledge maps") from Antiquity to our time, with thousands of historical definitions of academic disciplines extracted from sources. Moreover, it includes several analysis tools (a timeline, statistical tools with the ability to chart the evolution of a discipline, an iconographic database, advanced search filters). For instance, it is possible to display chronologically a list of historical definitions of an academic discipline in order to study the evolution of its identity over time. The aim of this project is to map the evolution of disciplinary borders throughout the centuries, and to reconstruct the genealogy of the sciences.
As for a "typology" of the various types of disciplinary systems (namely, the different taxonomic systems underlying these historical "classifications of the sciences"), you could find some insights in my recent paper:
Sandoz, Raphaël (2021), "Thematic Reclassifications and Emerging Sciences", Journal for General Philosophy of Science 52(1), pp. 63–85, available here: https://doi.org/10.1007/s10838-020-09526-2.
Best regards
  • asked a question related to History and Philosophy of Science
Question
30 answers
What kinds of scientific research dominate in the field of the philosophy of science and research?
Please provide your suggestions for a question, problem or research thesis within the issues of the philosophy of science and research.
Please reply.
I invite you to the discussion
Thank you very much
Best wishes
Relevant answer
Answer
Problem of what counts as a good scientific explanation... Salmon, W. C. (1984). Scientific explanation and the causal structure of the world. Princeton University Press.
  • asked a question related to History and Philosophy of Science
Question
219 answers
1) There is a tradition in the philosophy of mathematics starting in the late 19th century and culminating in the crisis of foundations at the beginning of the 20th century. Names here are Zermelo, Frege, Whitehead and Russell, Cantor, Brouwer, Hilbert, Gödel, Cavaillès, and some more. At that time mathematics was already focused on itself, separated from general rationalist philosophy and epistemology, from a philosophy of the cosmos and the spirit.
2) Stepping backwards in time we have the great “rationalist” philosophers of the 17th, 18th, 19th century: Descartes, Leibniz, Malebranche, Spinoza, Hegel proposing a global view of the universe in which the subject, trying to understand his situation, is immersed.
3) Still making a big step backwards in time, we have the philosophers of late antiquity and the beginning of our era (Greek philosophy, Neoplatonist schools, oriental philosophies). These should not be left out of our considerations.
4) Returning to the late 20th century, we see the foundation of category theory inside mathematics (Eilenberg, Lawvere, Grothendieck, Mac Lane, …), which is in some sense a transversal theory within mathematics. Among its basic principles are the notions of object, arrow, and functor, on which adjunctions, (co-)limits, monads, and more evolved concepts are then founded.
Do you think these principles have significance a) for science, b) for the rationalist philosophies we described before, and ultimately c) for more general philosophies of the cosmos?
Examples: The existence of an adjunction of two functors could have a meaning in physics, for example. The existence of a natural-numbers object, known from topos theory, could have philosophical consequences (cf. Immanuel Kant, Antinomien der reinen Vernunft).
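The notions of object, arrow, and functor mentioned above can be made concrete. A minimal sketch in Python (purely illustrative, not drawn from any of the cited authors): objects are types, arrows are functions, and the list construction is a functor satisfying the two functor laws.

```python
# Objects as Python types, arrows as functions; the list construction
# is a functor: it maps a type X to list[X] and an arrow f: A -> B
# to fmap(f): list[A] -> list[B].
def fmap(f):
    """Lift an arrow f to an arrow between lists."""
    return lambda xs: [f(x) for x in xs]

identity = lambda x: x
compose = lambda f, g: (lambda x: f(g(x)))  # f after g

f = lambda x: x + 1
g = lambda x: 2 * x

# Functor laws: identities and composition are preserved.
assert fmap(identity)([1, 2, 3]) == [1, 2, 3]
assert fmap(compose(f, g))([1, 2, 3]) == fmap(f)(fmap(g)([1, 2, 3]))
print(fmap(compose(f, g))([1, 2, 3]))  # prints: [3, 5, 7]
```

The value of the laws is precisely the "translation of proofs" between categories discussed in the answer below: any reasoning valid for arrows in the source category carries over through the functor.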
Relevant answer
Answer
There is a view that if mathematical categories are kinds of mathematical structure, then what is important mathematically are the functors from one category to another, because they provide a neat way of discovering a new property in one category by translating proofs from another category. This is a way of formalising reasoning by "analogy". Personally I find reasoning about categories as abstract algebras difficult and unintuitive, and find it much easier to look at a concrete realisation of a category than to consider a category with a list of pre-defined desirable properties; but I recognise that that is a matter of learning preferences.
  • asked a question related to History and Philosophy of Science
Question
35 answers
Is a "Quantization of Time" theory possible?
According to science, time is a physical parameter, but according to philosophy it is an illusion. How can we define time? Can we quantize illusions?
Relevant answer
Answer
What about George Musser's understanding? I will not go into detail as it would take too long.
Using a procedure called canonical quantization to turn a classical theory into a quantum one, physicists found that it worked for the theory of electromagnetism, but applied to general relativity it produces the Wheeler-DeWitt equation, which does not have a time variable. This equation predicts that the universe is frozen.
Some think that this challenges the observer principle. Two observers will have different perceptions of spacetime, perceived geometrically, concerning who is moving and which forces are acting. Normally, logically, whatever the shapes, they should be physically equivalent.
This then involves substantivalism, that is, space and time existing independently of stars, etc. Or are time and space artificial ways of talking about relations between things (relationism, the illusion theory)? Apparently, Einstein gave thought to an independent spacetime, which I and others here have also promulgated. Einstein rejected this because, like quantum theory, it contains an element of randomness. An independent spacetime doesn't fit the deterministic approach that Einstein approved of, therefore he abandoned it.
So relationism sees spacetime, but really time, as an illusion.
George Musser: Spooky Action at a Distance (2015).
  • asked a question related to History and Philosophy of Science
Question
164 answers
Hawking's Legacy
Black hole thermodynamics and the Zeroth Law [1,2].
(a) black hole temperature: T_H = hc³/(16π²GkM)
The LHS is intensive but the RHS is not intensive; therefore a violation of thermodynamics [1,2].
(b) black hole entropy: S = πkc³A/(2hG)
The LHS is extensive but the RHS is neither intensive nor extensive; therefore a violation of thermodynamics [1,2].
(c) Black holes do not exist [1-3].
Hawking leaves nothing of value to science.
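Whatever one makes of the thermodynamic argument, formula (a) is straightforward to evaluate numerically. A minimal Python sketch (standard SI constant values; the solar-mass figure is an illustrative approximation):

```python
import math

# SI constants (standard approximate values).
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
G = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
k = 1.380649e-23     # Boltzmann constant, J/K
M_sun = 1.989e30     # solar mass, kg (approximate)

def hawking_temperature(M):
    """T_H = h c^3 / (16 pi^2 G k M), the formula quoted in (a)."""
    return h * c**3 / (16 * math.pi**2 * G * k * M)

# For a solar-mass black hole the result is tiny (~6.2e-8 K),
# far below the temperature of the cosmic microwave background.
print(f"{hawking_temperature(M_sun):.2e} K")  # prints: 6.17e-08 K
```

Note that the temperature scales as 1/M, so heavier black holes are colder; this inverse dependence on the (extensive) mass is exactly what the question's intensive/extensive objection is aimed at.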
REFERENCES
[1] Robitaille, P.-M., Hawking Radiation: A Violation of the Zeroth Law of Thermodynamics, American Physical Society (ABSTRACT), March, 2018, http://meetings.aps.org/Meeting/NES18/Session/D01.3
[2] Robitaille, P.-M., Hawking Radiation: A Violation of the Zeroth Law of Thermodynamics, American Physical Society (SLIDE PRESENTATION), March, 2018, http://vixra.org/pdf/1803.0264v1.pdf
[3] Crothers, S.J., A Critical Analysis of LIGO's Recent Detection of Gravitational Waves Caused by Merging Black Holes, Hadronic Journal, n.3, Vol. 39, 2016, pp.271-302, http://vixra.org/pdf/1603.0127v5.pdf
Relevant answer
Answer
Well, to put it on a more concrete foundation, here's my view on his scientific achievement. It is not exhaustive, as I don't think I am entitled to judge Hawking's scientific legacy.
His works on black hole theory are from about 50 years ago, and I would consider the singularity theorems he proved together with Roger Penrose quite the highlight of his scientific career. In a nutshell, what they say is that black hole creation takes place under very general conditions in spacetime and is a necessary consequence of general relativity (GR), not requiring very special, e.g. highly symmetric, conditions.
With his work on Hawking radiation from the mid-70s he applied semiclassical analysis to GR, which paved the way to a more thorough treatment of quantum field theory on curved space/spacetime.
Although his scientific highlights might stem from the 60s and 70s, I would nevertheless stress that his legacy surely comprises all that he did as an ambassador for science. He surely was someone who gave inspiration to at least a complete generation of scientists, his publicity starting to spread with the little book he wrote at the end of the 1980s: "A Brief History of Time". I would never underestimate the importance of lighthouse figures like him in this regard, even though his hard-core scientific heyday had by then already passed.
  • asked a question related to History and Philosophy of Science
Question
25 answers
How can anyone determine absolutely, by inconclusively accepting, "that since we observe so many stars and planets WE AND EARTH are nothing SPECIAL"? I submit that this is a false, pseudo-philosophical and pseudo-scientific childish assumption and an ill-based axiom, made 500 years ago and surprisingly adopted by modern science, although all scientific evidence today proves otherwise.
We are not SPECIAL?! Really?... How can any serious scientist say that today? If we are not special, where then is the scientific evidence of other life existing in the universe?
Our location in the universe is nothing SPECIAL!... Really?... So why then has it been verified so far by two separate satellite science experiments that the "axis of evil" is present in the cosmic microwave background (CMB) image of the universe? https://en.wikipedia.org/wiki/Axis_of_evil_(cosmology)
Relevant answer
Answer
“… however many millions of suns and earths may arise and pass away, however long it may last before the conditions for organic life develop, however innumerable the organic beings that have to arise and pass away before animals with a brain capable of thought are developed from their midst, and for a short span of time find conditions suitable for life, only to be exterminated later without mercy, we have the certainty that matter remains eternally the same in all its transformations, that none of its attributes can ever be lost, and therefore also, that with the same iron necessity that it will exterminate on the earth its highest creation, the thinking mind, it must somewhere else and at another time again produce it.” Frederick Engels, Dialectics of Nature.
  • asked a question related to History and Philosophy of Science
Question
43 answers
First there was philosophy (φίλος - philos - friend, σοφία - sophia - wisdom), thus the friend of knowledge and the seeker of truth and knowledge everywhere, and then came science. This is a long-forgotten truth, and many scientists today turn against philosophy and into mockery of it.
THE TRUTH HOWEVER IS THAT PHILOSOPHY IS THE MOTHER OF ALL SCIENCES THERE ARE TODAY, WHETHER THEY LIKE IT OR NOT; YOU CAN NOT DENY IT OR ELSE YOUR LOGIC IS FLAWED...
The main reason why many scientists vehemently deny philosophy today and don't regard it as a science is its branch of religion and theology, although it is the mother of all sciences and a science discipline today.
Is it not the case that whatever field of knowledge and information man studies exhaustively and systematically, using his higher mental powers and logical tools for the betterment of human civilization, is CALLED SCIENCE?! Isn't it? Is this not the definition of science and its disciplines?
You see, there is out there a stupid belief (propaganda?) that faith and science cannot coexist (and in many cases they mean their science). So yes, there is a cabal and trend out there today to discourage any scientist from being a believer in GOD almighty. In order to be "a good scientist you must be an atheist", and they promote this unacceptable position whenever they can through the media... we see this every day.
However, these people and deniers of faith in GOD are in error, and are not really wise people, and therefore not really philosophers, and thus, ironically, behave unscientifically.
Greek philosophers 2,000 years ago knew that you cannot separate faith from logic; the one drives the other. They are complementary.
Denying faith is the same thing as closing the door to logic.
So, for example, what is really the difference between telling someone that GOD snapped his fingers and everything came out, and saying that suddenly, out of nowhere, a big bang created everything?... They just replaced the word God with Big Bang! You can't deny that....
So ATHEIST scientists are actually, ironically, faith believers and actually religious THEMSELVES... THEIR RELIGION! They believe ultimately in something (e.g. Big Bang theory) they cannot prove, and therefore are also driven by faith like the rest.
So there are not really any atheists.... no one is!
So far no problem... However, the problem arises, and has got much worse in the last decade, when these alleged "atheist scientists", through media and other means, attack fellow scientists who believe in GOD, and try to enforce and push THEIR BELIEFS down the throat of the general public.
So this is NOT called atheism any more (not believing in a higher entity - God deniers) BUT this is USUALLY CALLED RACISM AND DISCRIMINATION!...
THIS HAS TO STOP NOW... AND THESE SCIENTISTS HAVE TO STOP MAKING A PUBLIC mockery of themselves and science by behaving more like PRIESTS THAN SCIENTISTS!!
Scientific disciplines should really concentrate on their own studies and leave the religious matters of fellow scientists alone, not interfering and spreading propaganda.
The only science discipline allowed to study faith is philosophy and its immediate branches.
Let us treat each other, among us scientists, as normal people; let us not openly criticize and act against each other's faith or lack of faith, nor restrict their freedom, and thus their faith, which means logic.
Thank you for letting me express this philosophical question and for pointing out a modern dangerous phenomenon in science that potentially, in the long run, harms scientific progress and can turn science into dogma.
Emmanouil.
Relevant answer
Answer
Unfortunately, many scientists have bought into the idea that there is a long war between science and religion, which now presumably extends to philosophy because there are good philosophical arguments for God's existence which use scientific premises (like "The universe began to exist" and "The universe's initial conditions and quantities fall into an extraordinarily narrow life-permitting range"). So these scientists see themselves in a turf war with philosophy. However, scholars who work on the science-religion dialogue recognize that the war model is revisionist history introduced at the end of the nineteenth century by John William Draper and Andrew D. White. Historically, science and philosophy have not been enemies and, to do science well, one needs to understand its philosophical underpinnings. Philosophy, especially logic, provides the turf on which science is carried out, regardless of whether or not scientists realize this fact.
  • asked a question related to History and Philosophy of Science
Question
27 answers
I have posted a comment on André Orléan's "open letter" to the French Minister of Education (see my own first answer below). The letter and the background comments explain what is happening in France in the field of economics education. In the comment, I mentioned what had happened in Japan. An e-mail I received this morning tells me that a similar dispute is being repeated at University College London.
At the bottom of all these arguments lies the problem of how to interpret the status of neoclassical economics. Neoclassical economics now occupies the mainstream position and is trying to monopolize economics education and academic posts, whereas various heterodox economists are resisting this current, claiming the necessity of pluralism in economics education and research.
I have mentioned cases of three countries. There must be many similar stories in almost all countries. It would be wonderful if we can know all what is happening in other countries. So my question is:
What is happening in your country?
Relevant answer
Answer
The following is a comment I have added to André Orléan's newly uploaded article: Madame Najat VALLAUD-BELKACEM, Souhaitez-vous vraiment la fin du pluralisme en économie ? [Madame Najat Vallaud-Belkacem, do you really want the end of pluralism in economics?]
********************************************
This is an important letter. If this is an open letter, I wonder why it is closed to third parties. At the very least, André Orléan should indicate where we can find this important petition. I have read this open letter at
This is a petition from André Orléan, as president of the AFEP (Association Française d'Économie Politique), to the French Minister of Education, Mme Najat Vallaud-Belkacem, asking her to preserve (or create) pluralism in economics education and research. Almost all countries are facing a strong movement that claims to "modernize" or "standardize" or "make more efficient" economics education and thus to establish a more unified and uniform economics in education and research.
The origin of this movement may be traced back to various reasons. Each nation must have its own history.
In the case of Japan, a controversy started when the Japan Science Council established the Working Group of the Economics Section for Preparing the Standard References for Economics Education at the end of 2012. A first draft of the Standard References was made public in June 2013 and ignited a feverish argument among economists. The first draft was written entirely from the neoclassical economics viewpoint. It cited Lionel Robbins's famous definition of economics: "Economics is the science which studies human behaviour as a relationship between ends and scarce means which have alternative uses." (Robbins, 1932, An Essay on the Nature and Significance of Economic Science, p. 15).
Many academic associations, including the Japan Association of Political Economy (JAPE), the Japan Association for Evolutionary Economics (JAFEE), the Japan Association for Economics Education (JAEE), the Socio-Economic History Society Japan (SEHSJ), the Japanese Society for the History of Economic Thought (JSHET) and 7 other associations, expressed their concern and objected to the first draft. An open symposium to discuss the question was organized in December 2014. The final version of the Standard References was decided and made public in August 2015. In my understanding, the working group members conceded much and their expressions became more ambiguous and obscure, but what they aimed at did not change much. A book, The Future of Economics and Economics Education (in Japanese), was published in April 2015, which contained a summary of the controversy and papers presenting various opinions from the opposing side. I also contributed a chapter. The main point of dispute was how to evaluate plurality in economics education.
The case of France has a totally different history. Already in 2000, when the French Ministry of Education tried to set up a more uniform system of economics education in universities, a strong movement of economics students erupted, with many students asking for a more diversified education. Aided by researchers, they formed a movement they named Post-Autistic Economics. It was then re-organized as Real-World Economics and is now one of the active centers of heterodox economics. In 2009, a group of social scientists, including not only economists but also sociologists, organized the AFEP. This group asked the Ministry of Education to admit a new section, named Economics and Society, which was to be the 78th section of the National University Committee (Comité National des Universités). To understand the whole history it is necessary to know the French university system, which is highly centralized and very different from that of other countries where university autonomy is much more established.
For background, see a document by the AFEP and an article in the newspaper Libération:
Some contextual and background information relevant to understanding the issue
Battle for Influence among French Economists (in French) by Frantz Durupt, February 1, 2015.
The Ministry was at one point inclined to admit the creation of the new section. At that time Jean Tirole, the winner of the 2014 Nobel prize in economics, tried to hinder the creation of the new section in a letter to the Minister, arguing that the selection of university researchers and professors should be made according to international standards.
See Jean Tirole's letter to the Minister
It took the form of a thank-you letter for the presence of the Minister at the awarding ceremony of the Nobel Prizes. Tirole expressed his concern with regard to the creation of the new section that the AFEP had worked toward for years. Tirole argued that the community of economic scientists should have a world standard based on international reputation. He opposed the French government admitting two different communities within the same discipline called economics. He even denounced AFEP economists as obscurantists. It was an explicit objection to the creation of the new section and to the pluralist strategy in economics. André Orléan's paper is one of the responses to Tirole's intervention.
I am not sure whether it is good to create a new section, Economics and Society, in or alongside economics, because it may help mainstream economics to keep its status quo. But it seems many French economists think that this is the only way to rescue heterodox economics from the actual dominance of the mainstream economic sciences. The AFEP cites a statistical research result: only 6 of the 120 professors appointed between 2005 and 2011 were affiliated with minority schools of thought. The crisis of economics is much more acute in France than in many other countries.
As I have put it above, the academic situation of economics is very different from country to country, but in essence we face a common situation. What is the best way to design economics education in undergraduate and graduate courses? This is the problem that all economists must think about. Supporters of pluralism and heterodox economics should explain why their claim is justified and necessary, rather than a mere expression of their desire to keep their posts. Those economists who believe in the future of mainstream economics should show that, despite all the condemnations of their economics, it is not only sane and right but also the unique way to the future development of economics, and that pluralism is only an unnecessary waste of intellectual resources.
This is not only an academic dispute inside the ivory tower but an actual problem that will influence the future of our economy, because economic policy is strongly influenced by the state of economic science and thought. In this sense, this is a problem that all policy makers, and even ordinary people in town, should be concerned with.
Thinking about this question requires a deep understanding of economic science and the real history of the development of economics. On this point, I want to point to David Ellerman's paper: Parallel experimentation.
The main purpose of this paper is not the study of economics, but I believe it gives us a necessary framework for considering how a science like economics evolves. It gives us persuasive reasoning for why we need pluralism in a complex science such as economics.
*******************************************
  • asked a question related to History and Philosophy of Science
Question
65 answers
Why or why not?
Some philosophers maintain that science is morally neutral, while other philosophers maintain that science produces morality.
Relevant answer
Answer
Absolutely! I just finished opining that an occasional glass of wine is actually beneficial, as opposed to drinking alcohol being considered "a sin." And the reason for making such a bold statement is scientific evidence. As in the link attached here.
I would argue that nothing that provides health benefits can be considered "immoral." Abuse of anything, on the other hand, is detrimental to health, so a culture may be justified in making such activity "immoral" in their code of ethics.
Plenty of references in the New Testament of "non-sinful" wine consumption, and science can explain why. And no, gimme a break, that wasn't grape juice!
  • asked a question related to History and Philosophy of Science
Question
6 answers
Or at least used the phrase "waves above waves". If you can provide the source, that would be great.
Relevant answer
Answer
1) Not exactly an internal wave, but the two-directional current system in the Bosphorus has been known for centuries.  Surface waters flow from the Black Sea to the Sea of Marmara, and bottom (much more saline) waters flow from the Marmara to the Black Sea at the same time.  Fishermen wanting a "free ride" to the Black Sea against the surface current would lower their nets to catch the lower flow.
2) Fridtjof Nansen had a research ship named the "Fram", which may be the one that Dennis Mazur is referring to above.
  • asked a question related to History and Philosophy of Science
Question
23 answers
Einstein's geometrodynamics considers a 4-D spacetime geometry whose curvature is governed by mass. But the FLRW universe considers a 3-D space of curvature k (positive, zero, or negative) with time as an orthogonal coordinate. Hence it seems that standard cosmology, based on the FLRW spacetime, has strayed from the stated essence of general relativity.
Relevant answer
Answer
Of course not, and the statement made about the FLRW metric is incorrect: it describes a Lorentzian manifold, not just a three-dimensional manifold. And the spacetime geometry isn't defined just by the spatial integral of the time-time component of the energy-momentum tensor, but by all components thereof.
The FLRW metric is written using a particular choice of coordinates, that's all. What is of relevance isn't the metric, which transforms in a particular way under general coordinate transformations, but quantities that are invariant under such transformations; it can be shown that such quantities exist. All this is well known, described in all textbooks and courses on general relativity, and doesn't have anything to do with the physical applications of the FLRW metric as a particular solution of Einstein's equations. It's the other way around: since it can be shown that it is, indeed, a well-defined solution of these equations, i.e. that it does satisfy the equations of motion and constraints, it makes sense to study its implications for physics, in this case cosmology. So it would be useful to study a textbook on general relativity.
And the cosmology in question is classical, not quantum.
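For reference, the FLRW line element in the coordinates referred to above, in a standard textbook form that makes the four-dimensional Lorentzian character explicit:

```latex
ds^2 = -c^2\,dt^2 + a^2(t)\left[\frac{dr^2}{1 - k r^2} + r^2\left(d\theta^2 + \sin^2\theta\, d\varphi^2\right)\right]
```

Here a(t) is the scale factor and k the spatial curvature constant; the -c² dt² term is precisely what makes this a metric on a Lorentzian 4-manifold rather than a 3-D space with time merely appended.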
  • asked a question related to History and Philosophy of Science
Question
2 answers
Between the end of the 19th century and the beginning of the 20th, there was a French teacher in Macau; he (or she) was teaching art at the Academy of Natural Science (格致书院). Does anybody know who this was?
Thanks a lot!
Relevant answer
Answer
No record, it seems. Only one Mai La, a Finnish-American actress, appears in reference sources.
  • asked a question related to History and Philosophy of Science
Question
63 answers
Schrödinger's self-adjoint operator H is crucial for the current quantum model of the hydrogen atom. It essentially specifies the stationary states and energies. Then there is Schrödinger's unitary evolution equation, which tells how states change with time. In this evolution equation the same operator H appears. Thus, H provides the "motionless" states, H gives the energies of these motionless states, and H is inserted into a unitary law of movement.
But this unitary evolution fails to explain or predict the physical transitions that occur between stationary states. Therefore, to fill the gap, the probabilistic interpretation of states was introduced. We then have two very different evolution laws. One is the deterministic unitary equation; the other consists of random jumps between stationary states. The jumps openly violate the unitary evolution, and the unitary evolution does not allow the jumps. But both are simultaneously accepted by Quantism, creating a most uncomfortable state of affairs.
And what if the quantum evolution equation is plainly wrong? Perhaps there are alternative ways to use H.
Imagine a model, or theory, where the stationary states and energies remain the very same ones specified by H, but with a continuous evolution different from the unitary one, in which an initial stationary state evolves in a deterministic manner into a final stationary state, with energy being continuously absorbed and radiated between the stationary energy levels. In this natural theory there is no use, nor need, for a probabilistic interpretation. The natural model for hydrogen, comprising a space of states, an energy observable and an evolution equation, is explained in
My question is: with this natural theory of atoms already elaborated, what are the chances of its acceptance by mainstream physics?
Professional scientists, in particular physicists and chemists, are well versed in the history of science, and modern communication hastens the diffusion of knowledge. Nevertheless, important scientific changes seem to require a lengthy process including the disappearance of most leaders, as was noted by Max Planck: "They are not convinced, they die".
Scientists seem particularly conservative and incapable of admitting that their viewpoints are mistaken, as was the case in the past with the flat Earth, geocentrism, phlogiston, and other scientific misconceptions.
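Whatever evolution law one adopts, the stationary energies specified by H for hydrogen follow the standard Bohr formula E_n = -R_E/n². A minimal Python sketch (the Rydberg energy value ≈ 13.6057 eV is the standard one; the function name is illustrative):

```python
# Stationary-state energies of hydrogen (eigenvalues of H):
# E_n = -R_E / n^2, with R_E the Rydberg energy.
R_E = 13.605693  # eV (standard value)

def energy(n):
    """Energy of the n-th stationary state of hydrogen, in eV."""
    return -R_E / n**2

# A transition 2 -> 1 (whether a discontinuous 'jump' or a continuous
# evolution) must account for the energy difference, the Lyman-alpha line:
delta = energy(2) - energy(1)
print(f"E_1 = {energy(1):.4f} eV, E_2 = {energy(2):.4f} eV, dE = {delta:.4f} eV")
# prints: E_1 = -13.6057 eV, E_2 = -3.4014 eV, dE = 10.2043 eV
```

Both the orthodox jump picture and the deterministic alternative sketched in the question must reproduce exactly these level differences; the dispute is about the evolution between them, not the spectrum itself.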
Relevant answer
Answer
Hello Enders
You state that "According to Schrödinger 1926, there are no quantum jumps." Please allow me the following comments.
A set of articles by various authors are collected in a book edited by Wolfgang Pauli
Pauli, W. (ed.) - Niels Bohr and the Development of Physics. Pergamon Press, London. 1955.
Among the articles there is one by Werner Heisenberg
The Development of the Interpretation of the Quantum Theory
The following lines can be found in the article (page 14 of the book)
At the invitation of Bohr, Schrodinger visited Copenhagen in September, 1926, to lecture on wave mechanics. Long discussions, lasting several days, then took place concerning the foundations of quantum theory, in which Schrodinger was able to give a convincing picture of the new simple ideas of wave mechanics, while Bohr explained to him that not even Planck's Law could be understood without the quantum jumps. Schrodinger finally exclaimed in despair:
"If we are going to stick to this damned quantum-jumping [verdammte Quantenspringerei], then I regret that I ever had anything to do with quantum theory,"
to which Bohr replied:
"But the rest of us are thankful that you did, because you have contributed so much to the clarification of the quantum theory."
Maybe the above paragraph is the ultimate source of your statement.
The displeasure shown by Schrödinger has a different interpretation. It may mean that he understood quantum jumps, that he had a clear picture of the reach of the Schrödinger time-dependent equation (STDE), and in particular that the STDE contradicted quantum jumps. Therefore he knew that something very fundamental was missing from his elegant STDE. Nowhere did he say anything equivalent to "quantum jumps do not exist". He was annoyed by having to accept the existence and crucial phenomenological role of quantum jumps in the description of the basic atomic phenomena of absorption and radiation.
If you have a different historical source to justify your interpretation, please share the reference with us, as it would be extremely interesting.
With most cordial regards,
Daniel Crespin
  • asked a question related to History and Philosophy of Science
Question
25 answers
[I had heard of the Know-Nothing Party, but apparently the internet tells me that that was a disclaimer used by members of what became the American Party, which was anti-immigration in the mid-nineteenth century ... another area of discussion, though proponents today may often fall into the category for discussion here as well, but that is still a bit out-of-scope for this discussion.]     
For historians and other history buffs out there, and those interested in current events, what do you see as the path that has been taken to arrive at popular anti-intellectual, anti-science views in politics? The rejection by some members of the US House of Representatives of the correction of (US) census undercounts - the rejection of sampling statistics - comes to mind, in addition to the usual comments on climate change.
And are there any similar anti-intellectualism movements to be found in history anywhere in the world, including ancient history, which anyone would care to share?   Can you draw any parallels? 
Reasoned comments and historical evidence are requested.  I do not intend to make further comments but instead wish to hear what applicable history lessons you may find interesting regarding this topic.
Thank you. 
Relevant answer
Answer
Such is the cost of development in highly industrialized countries: the lowering of the average level of public education and ethical standards, a consumerist and selfish lifestyle, the alienation of individuals and social groups, public hypocrisy, the callousness of society, "the rat race", and the powerful role of money in politics and everyday life.
  • asked a question related to History and Philosophy of Science
Question
126 answers
Does experimental science have limits as a discipline? Much of our actual knowledge is not the consequence of repeatable experiments. Regarding the sources of science, are they limited to experimentation? Can other disciplines, such as history, unique experiences, philosophy, etc., be more important for man?
Relevant answer
Answer
The theory of General Relativity was inspired by pure imagination (affected by philosophy) followed by mathematical formulation and then experimental validation.
  • asked a question related to History and Philosophy of Science
Question
6 answers
In my studies many years ago, I came across the very influential thinker Alexander Bain. Most of his ideas are obsolete today, I know, but he was still an extremely influential person. I skimmed through his autobiography once, but I could not find any study of him by a modern scholar which could place him in historical perspective. I thought this was odd, considering who he was.
Does anyone know if there are any standard works on Bain? Nothing popped up on Amazon.
Relevant answer
Answer
Bain is a central figure in the following work, but I think it focuses on his work and not his life:
Rylance, Rick. Victorian Psychology and British Culture, 1850-1880. Oxford; New York: Oxford University Press, 2000.
  • asked a question related to History and Philosophy of Science
Question
16 answers
Language, as an expression of the various 'knowledge' is subject to continuous transformations. I’d like to focus in particular on one of them in the field of scientific research.
As science can not critically verify its own assumptions, it is up to history, epistemology, philosophy and to the analysis of language to deepen the horizons of pre-understanding of each scientific proposition. In particular this is the understanding of a reality based on the assumption and tradition of antecedent interpretations, which precedes the direct experience of reality itself.
Popper was very attentive to the instrumental aspect of science (and therefore also of language), interested not in things in themselves but in their aspects verifiable through measurement. He therefore urged us not to interpret theories as descriptions, nor merely to use their results in practical applications. He recalled that, as "knowledge", science is nothing but a set of conjectures, or highly informative guesses about the world, which, although not verifiable (i.e., their truth cannot be demonstrated), can be subjected to strict critical controls.
This is evident from various texts and  Popper emphasized these ideas in ‘The Logic of Scientific Discovery’: "Science is not a system of certain assertions, or established once and for all, nor is it a system that progresses steadily towards a definitive state. Our science is not knowledge (episteme): it can never claim to have reached the truth, not even a substitute for the truth, as probability .... "
We do not know; we can only guess. Our guesses are guided by an unscientific, metaphysical faith in laws, in regularities that we can uncover, discover.
A kind of approach which is not exempt from ethical questions because the operation has fluid boundaries. The borders can be crossed, leading to the possibility of manipulation and abuse of power against the same identity and autonomy of the persons involved.
Like Bacon, we could describe our contemporary science - the method of reasoning that men today routinely apply to Nature - as consisting of anticipations, rash and premature, and of prejudices. But, once advanced, none of our anticipations is upheld dogmatically. Our method of research is not to defend them, to prove how right we were; on the contrary, we try to overthrow them, using all the tools of our logical, mathematical and technical 'baggage'."
Hence the maximal caution: "The old scientific ideal of episteme, of absolutely certain and demonstrable knowledge, has proved an idol.
The demand for scientific objectivity makes it inevitable that every assertion of science remains necessarily and forever at the status of an attempt. The wrong view of science betrays itself in the craving to be right; for it is not the possession of knowledge, of irrefutable truth, that makes the man of science, but the persistent and anxious critical search for truth."
[In this regard I consulted the following texts: H. R. Schlette, Philosophie, Theologie, Ideologie. Erläuterung der Differenzen, Cologne, 1968 (Italian transl. Morcelliana, Brescia, 1970, pp. 56, 78); G. Gismondi, The critique of ideology in the science foundation's speech, in "Relata Technica", 4 (1972), 145-156; Id., Criticism and ethics in scientific research, Marietti, Torino, 1978.]
Hermeneutics, applied to language, to human action and to ethics, allows us to articulate text and action. An action may be narrated because it is human life itself that deserves to be narrated; it presents possible narrative paths that the individual highlights, excluding others. Story and action also confirm the inter-subjective dimension of human beings. The story fully presents the three moments of ethical reflection: describing, telling and prescribing.
Relevant answer
Answer
I just put my book up on Amazon: Give Space My Love: An Intellectual Odyssey with Dr. Stephen Hawking. The brief book description is below.
If any of you would like a complimentary copy of the book, just send me an email with your physical address. bristol at isepp.org
Per your starting question, it focuses on science and how we talk about science. My background is philosophy of science: Popper, Lakatos and Feyerabend. The last was my honors advisor at Berkeley.
The central narrative tension uses Dewey's distinction between the Spectator and the Participant representations of inquiry (and the place of inquiry in the universe). Quantum Mechanics and Relativity both push us toward a Participant framework. I argue that there were two paths to complementarity (and to the limits of the scientific research program) in the 20th century: one is in the new physics, and the other comes from Popper's Question (about the falsifiability of all meaningful theories).
Personally I have transitioned from philosophy of science to philosophy of engineering – 'a new name for an old way of thinking' (viz. James's remark about pragmatism). I have a couple of Linus Pauling Memorial Lectures on YouTube if you are curious about where all this goes beyond the book. 
Freewill and the Engineering Worldview
Bristol May 3rd, 2013  http://youtu.be/kZjJukntqHM
Life Ascendant: A Post-Darwinian Worldview
Bristol May 21st  2014  http://youtu.be/i2mwhk-6a3A
What is Engineering? What is the Value Context of Engineering?
Bristol July 30th 2015 (China) https://www.youtube.com/watch?v=vc1lI8Ox7qM
BOOK DESCRIPTION:
Who is the real Dr. Stephen Hawking? Is he a detached Spectator seeking a mathematical description of a deterministic, objective reality – ‘out there’? Or is he an embodied Participant in the universe seeking to bring about a more desirable future? The timeline of the book is a four-city lecture tour the author organized for Hawking in the early 1990s (Portland, Eugene, Seattle and Vancouver BC). Hawking’s powerful meetings with students with disabilities, officially collateral events, were remarkable. However, the greater significance of these ‘stories of the road’ is better appreciated in the context of the central narrative question of the book: the nature of the universe and our place/role in it.
The author, a philosopher of science (Berkeley, London), engages Hawking, his graduate assistants and eventually his nurses in what starts as a critical review of the ‘new physics’ of Einstein, Bohr and Heisenberg. The question of the limits of classical science expands to questions of the limits of all supposedly objectivist, ‘one right answer’ ideologies – in biological, socio-economic, and political realms. Is everyone ‘really’ selfish? Is the world objectively competitive or cooperative? In a parallel critical review of the ‘new philosophy of science’ the contributions of the author’s mentors, Feyerabend, Lakatos, Kuhn and Popper mark a parallel path to complementarity, undermining the Spectator representation of detached ‘objective’ inquiry.
Through his personal interactions Hawking reveals himself as a Participant, concerned with ‘how we should live’. He steers us toward a more desirable, moral future.
The new post-scientific Participant understanding of the universe requires a paradigm shift to a More General Theory that can both explain the successes of science and understand them in a new way - a theory in which our embodied Participant inquiry necessarily re-unifies the sciences and the humanities.
  • asked a question related to History and Philosophy of Science
Question
4 answers
In his 1963 book "Little Science, Big Science" Derek de Solla Price shows that science as a whole has been growing exponentially for 400 years. He hypothesises this to be the first part of a logistic curve. If his predictions were right, the growth of science should have started to decline by now. Are there recent measurements that can be compared to his 1963 estimates? And... was he right?
Relevant answer
Answer
Since differentiation is vital for growth, even in a complex system such as the global scientific web, there is evidence of convergence over the period 1993-2008, which, if we assume a logistic growth function, seems to suggest that stagnation is underway globally and in some science portfolios.
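Price's hypothesis - that the observed exponential growth is just the early phase of a logistic curve - can be illustrated with a small numerical sketch. The parameter values below (carrying capacity K, growth rate r, inflection time t0) are arbitrary assumptions for illustration, not fitted to Price's data:

```python
import math

def logistic(t, K=100.0, r=0.05, t0=200.0):
    """Logistic growth: carrying capacity K, rate r, inflection at t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

def exponential(t, K=100.0, r=0.05, t0=200.0):
    """Pure exponential matched to the logistic's early phase."""
    return K * math.exp(r * (t - t0))

# Early on (well before t0) the two curves are nearly indistinguishable...
print(logistic(100.0), exponential(100.0))   # both ~0.67
# ...but later the logistic saturates near K while the exponential explodes.
print(logistic(400.0), exponential(400.0))   # ~100 vs ~2.2e6
```

This is why early data alone cannot distinguish exponential from logistic growth: before the inflection point the two curves agree to within about 1%, and they diverge dramatically only as saturation sets in - which is exactly the regime Price predicted we would eventually enter.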
  • asked a question related to History and Philosophy of Science
Question
75 answers
Through many discussions on ResearchGate, I came to recognize that the majority of economists are still deeply influenced by the Friedmanian methodology. One piece of evidence is the fact that they take little care over the economic consistency and relevance of the model. They spend enormous time and effort on "empirical studies" and discuss the results, but they rarely question whether the basic theory on which their model rests is sensible. This ubiquitous tendency has grave effects in economics: neglect of theory and indulgence in empirics. I wonder why people do not question this state of economics. Economic science should recover a more suitable balance between theory and empirics. 
It is clear that we should distinguish two levels of Friedmanian methodology.
 (1) Friedman's methodology and thought that is written in the texts, more specifically in his article The Methodology of Positive Economics (Chapter 7 of Essays in positive economics, 1953).
(2) The methodology that is believed to be Friedman's thought.
 Apparently, (2) is much more important for this question. I have seen dozens of papers that examine Friedmanian methodology based on his text. Many of them find that the widespread understanding does not correctly reflect Friedman's original message. They may be right, but what is important is the widely held belief in the name of Milton Friedman.
Relevant answer
Dear Shiozawa sensei and ResearchGate community,
I could not agree more with you when you state that all data-first theorists like Hoover, Hendry, Juselius, Johansen and Spanos are deeply influenced by F53. In the end, all of them follow a Marshallian approach. According to the four aspects of scientific research, they start from (3) and end up in (1). Regarding (3), it is necessary to recall that data-first theorists do not transform or curate data, since these are "market processes" and, according to Hendry (2011), are subject to three kinds of unpredictability: intrinsic, instance and extrinsic. In other words, they "let the data speak for themselves".
However, I don't think the vast majority of economists are influenced by F53 positivism or Popperian falsificationism in a strict sense, inasmuch as RBC and DSGE models (the most widespread models in economics), whose predictive power is poor, have not been ruled out. Professor Mário Amorim Lopes's explanation of the Popperian epistemological approach to the social sciences was really clear and forceful. For instance, these models were not able to predict the 2007/08 financial crisis - they did not survive falsification - yet they are still used by the vast majority of central banks in several countries. Kirman (2010) stated that "The Economic Crisis is a Crisis for Economic Theory".
Now the question is: are DSGE models the best theory available? Are there other theories able to predict economic crises? Kirman (2010) supports the idea that Shiozawa sensei stands for (and so do I): viewing an economy as a complex adaptive system, a set of interdependent elements (agents) organized in networks (without central control) which produce emergent aggregates and have the properties of adaptation and self-organization. In that sense, to move beyond DSGE scenarios with representative agents, rational expectations, Walras's law (market clearing) and stochastic trends, it is necessary to build models that explain and predict economies with contagion, interaction, interdependence, networks and trust.
So far, we have identified that it is necessary to construct models which consider Economic Crises as inherent to the evolution of the complex system. But can we identify the evolution of the system? This responsibility lies in two different hypotheses: i) Former economic theories that have been ignored like the Financial Instability Hypothesis by Hyman Minsky; and ii) Approaches from other disciplines such as: Econophysics (see Jovanovic, F. y Schinkus, C., 2013; Rickles, D., 2008 and Sornette, D., y Zhou, W., 2007).
Allow me to discuss some ideas on econophysics (I am deeply interested in this field). First of all, it is necessary to recall that financial market data present certain stylized facts: i) fat-tailed distributions (instance unpredictability, Hendry (2013) - Taleb's Black Swan); ii) volatility; iii) autocorrelations (memory); iv) leptokurtosis; and v) clustering. Given this, the normal distribution, martingales and random walks - the workhorses of Fama's Efficient Market Hypothesis and therefore of DSGE models - do not shed light on financial market data. Econophysics, on the other hand, puts forward the use of "truncated Levy-Pareto" distributions, which address all the stylized facts stated above. These distributions are bell-shaped like Gaussian distributions, but unlike them they assign greater probability to events in the center and in the tails of the distribution (economic crises) (Jovanovic, F. and Schinkus, 2013).
In that sense, given that econophysics views economies as complex adaptive systems and provides a good explanation of economic crises, why are DSGE models still used? I think the answer to this question comes down to interests (Professor Karlsson emphasized this above) and the arrogance of most orthodox economists. They are reluctant to rule out DSGE models and to accept developments coming from disciplines outside economics. I agree with Moisés Naím when he states that "while there may be budding intentions to appeal to other disciplines in order to enrich their theories (especially psychology and neuroscience), the reality is that economists almost exclusively study—and cite—each other".  (http://www.theatlantic.com/business/archive/2015/04/economists-still-think-economics-is-the-best/390063/)
To sum up, I think neither Friedmanian positivism nor Popperian falsificationism is followed in a strict sense by the vast majority of economists. The current bulk of models do not care about predictions; they just follow the "discipline of equilibrium" (representative agents, Walras's law and rational expectations). I gave the example of economic crises and financial markets inasmuch as they constitute the most important falsification of DSGE models, but there are other falsifications in other fields, such as economic growth and development (my dissertation discusses this, but unfortunately it is in Spanish and I have not translated it into English yet, my apologies).
Thanks a lot for sharing your valuable concepts on this related topics
Édgar    
REFERENCES
1. Hendry, D. (2011). "Unpredictability in Economic Analysis, Econometric Modelling and Forecasting," Economics Series Working Papers 551, University of Oxford, Department of Economics.
2. Jovanovic, F. and Schinkus, C. (2013a). Towards a transdisciplinary econophysics, Journal of Economic Methodology, Volume 20, pp. 164-183.
3. _______________________ (2013b). Econophysics: A new challenge for financial economics? Cambridge University Press, 319-352.
4. Kirman, A. (2010). The economic crisis is a crisis for economic theory. CESifo Economic Studies 56: 483-535.
5. Rickles, D. (2008). Econophysics and the complexity of financial markets, Handbook of the Philosophy of Science, Volume 10, pp. 133-152.
6. Sornette, D., and Zhou, W. (2007). Self-organizing Ising model of financial markets. The European Physical Journal B, 55(2), 175-181.
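The practical gap between the Gaussian assumption and the fat-tailed alternatives discussed above can be made concrete with a small sketch comparing the one-sided tail probability of a standard normal with that of a pure power-law (Pareto) tail. The tail exponent alpha = 3 (a value often reported for equity returns) and the matching point x_min are illustrative assumptions, not estimates from data:

```python
import math

def gaussian_tail(x):
    """P(X > x) for a standard normal variable (one-sided tail)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def power_law_tail(x, alpha=3.0, x_min=1.0):
    """P(X > x) for a Pareto tail with exponent alpha, matched at x_min.
    alpha = 3 is an illustrative assumption, not a fitted value."""
    return (x_min / x) ** alpha

for sigma in (2.0, 5.0, 10.0):
    g, p = gaussian_tail(sigma), power_law_tail(sigma)
    print(f"{sigma:>4}-sigma event: Gaussian {g:.2e}  power law {p:.2e}  ratio {p / g:.1e}")
```

Under the Gaussian assumption a 10-sigma move is essentially impossible (probability below 1e-23), while under the power-law tail it keeps a probability of about one in a thousand - which is one way to see why Gaussian-based models systematically understate the likelihood of crises.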
  • asked a question related to History and Philosophy of Science
Question
7 answers
Verificationism (according to Wikipedia) is an epistemological and philosophical position that considers a criterion of verification necessary and sufficient for the acceptance or validation of a hypothesis, a theory, or a single statement or proposition. Essentially, verificationism says that a statement added to a scientific theory which cannot be verified is not necessarily false, but basically meaningless, because it cannot be tested against the empirical evidence of the facts. There could in fact be multiple inherently logical statements explaining or interpreting a certain phenomenon, of which, however, by definition only one is true.
Meaningless does not mean false; it means only that a statement's truth value cannot be decided, so such a proposition can have no claim to be cognitive or foundational in a scientific theory. A proposition is defined as any statement that may be assigned a truth value (in classical logic, true or false). A statement to which this value cannot be attributed is therefore devoid of verifiability and so, for this kind of epistemology, devoid of sense, and is finally to be eliminated as mere opinion or metaphysical proposition. Verificationism is usually associated with the logical positivism of the Vienna Circle, in particular with one of its greatest exponents, Moritz Schlick, whose basic theses can be summarized as follows:
The propositions with sense are those that can be verified empirically.
Science, through the scientific method, is the cognitive activity par excellence, since it bases the truth of its propositions on this verificationist criterion.
The propositions of metaphysics are meaningless, as they are based on illusory and unverifiable concepts. The propositions of metaphysics, says Carnap, express at most feelings or needs.
The valid propositions are, as the English empiricist Hume had claimed, the analytical ones, which express relationships between ideas (like mathematical propositions), and the propositions that express facts (such as the propositions of physics). Mathematics, like logic, expresses nothing about the world; it need not be empirically verifiable, but must serve to link verifiable and meaningful propositions to one another, giving them the character of generality that contingent propositions lack.
• The purpose of philosophy is to perform a critique of knowledge in order to eliminate all nonsensical propositions that claim to be cognitive. The philosopher must be able to perform on language both a semantic analysis (the relationship between reality and language) and a syntactic analysis (the relation of signs as they are linked together).
Verificationism has as a structural basis to find a connection between statements and experience, that is, sensations that give meaning to those. This connection is called verification.
The epistemological attitude that gives rise to verificationism can be found throughout the history of philosophy and science, from Greek philosophy to Thomas Aquinas, by way of William of Ockham, English empiricism, positivism, and the empirio-criticism of Avenarius and Mach.
According to English empiricism (whose leading exponents can be considered Locke, Berkeley and Hume) the only source of knowledge is experience.
As Berkeley says, in fact, "the objects of human knowledge are either ideas actually imprinted on the senses or ideas formed with the help of memory and imagination by compounding or dividing those perceived by the senses." So there is no way of formulating sentences or judgments other than from the data of experience, and the only way to verify their truth value is again through experience. Judgments based on data that cannot be verified through experience therefore have no sense and are to be rejected as unscientific.
A position that takes the consequences of empiricism seriously is the version of Hume, who, considering that only experience can provide the truth value of a proposition, rejects all propositions that claim universal validity. A law becomes true only if verified, but once it is verified through experience, nothing can guarantee that the same experience will recur whenever similar conditions present themselves. The verification of an empirical proposition is always contingent, never necessary. It is therefore difficult for Hume to give a definitive foundation to science in the traditional sense, i.e. as a body of knowledge that is certain and necessary.
The sciences, says the positivist Comte, must seek the immutable laws of nature; as such, these laws must be verified independently of any particular contingent experience that reveals them to the senses, and must hold whenever the law so provides.
Some positivists (the 'strong' principle of verification) note, however, that the principle of verifiability makes some metaphysical judgments significant, such as "The soul is immortal." Indeed, there is a method of verification: simply "wait a while and die". To avoid statements of this type being endowed with sense, a stronger version of the principle of verifiability was elaborated. This states that a judgment has meaning only if it can be shown definitively true or false; i.e., there must be an experience that can establish this truth value.
This version is called strong because it excludes any knowledge that is not empirical or logical, and therefore excludes sense being given to any expression that is not the result of empirical knowledge or of logical deduction from empirical propositions. This version of verificationism would be criticized by some less radical positivists, such as Neurath and Carnap, for the simple fact that, if the verification of a proposition is necessary to give it sense, then even the principle of verifiability itself must be verified, and this is not possible.
Numerous propositions in common use, whose meaning seems clear from the terms they employ, are unverifiable, such as statements about the past or the future: "Churchill sneezed 47 times in 1949" or "Tomorrow it will rain." These propositions can in principle be verified - a method of verification can be provided - so on the 'weak' version of the principle of verifiability they have meaning, but on the 'strong' version they are mere nonsense.
Assertions about the Absolute, and those of a metaphysical nature in general, are to be rejected, at least as propositions to which the positive verificationist method can be applied, though this does not exclude their existence: trying to deny a metaphysical proposition has the same standing as trying to prove it. Metaphysical propositions are therefore set aside, not refuted.
Comte rejects so-called absolute empiricism, which states that any proposition not established by the facts is to be rejected as senseless and therefore cannot be taken as a scientific proposition.
Special mention must be made of mathematics: for Comte it is not a science but a language, and therefore the basis of every positive science. Mathematics, like logic, as the logical empiricists would say, has the purpose of showing the connections between propositions in order to preserve their truth values, not to produce new ones. The propositions of mathematics are a priori truths; as such, they cannot be verified and therefore say nothing about the world, but tell us how the world must be spoken of once it has been experienced.
The best-known critique of the principle of verifiability is probably that provided by Popper. Although he was its main critic, he never abandoned the convictions set out in the positivist manifesto or the idea that science has a rational and deductive structure, though describable in ways other than those contemplated by Schlick. In particular, the principle of verification, in both weak and strong versions, is abolished and replaced by that of falsifiability. This principle is in fact an admission of the impossibility for science to arrive at statements that can claim to be verified as they stand, and also a condemnation of the principle of induction when it claims to provide a basis for the formulation of necessary laws. Popper says that billions of checks are not enough to determine whether a given theory is certain; a single falsification is enough to show it is not true. Carnap's criterion of testability becomes the possibility of a statement being subjected to falsification, and the structure of science, as Hume had already stated, is such that it does not confirm a hypothesis; at most it falsifies it. The experiments to which the laws of science are subjected are useful when they try to falsify those laws, not when they try to verify them.
The criticisms that buried verificationism came from so-called post-positivist epistemology, whose leading exponents are Kuhn, Lakatos and Feyerabend. To varying degrees, all three claim that a fact cannot be verified because bare facts do not even exist: facts can only be represented within a theory already considered scientific. Therefore, there is no distinction between observational terms and theoretical terms, and even concepts considered basic to science do not possess the same meaning when framed within two different theories (think, for example, of the concept of mass for Newton and for Einstein). According to post-positivism, science itself is not even empirical, because its data are not empirically verifiable and there is no criterion of significance; that is, it is not possible to separate a scientific statement from one concerning other human activities.
Now, finally, we follow the position of Professor Franco Giudice, for whom, in "Testability and Meaning" (1936-1937), Rudolf Carnap recognizes that absolute verification in science is almost impossible. The criterion of significance must therefore change: the principle of verification must be replaced with the concept of confirmation. A proposition is significant if, and only if, it is confirmable, and the verifiability of propositions consists only of gradually increasing confirmations. Thus, the acceptance or rejection of a proposition depends on the conventional decision to consider a given degree of confirmation of the proposition as sufficient or insufficient. On the earlier principle, by contrast, the meaning of a proposition is determined by the conditions of its verification (the verification principle): a proposition is significant if, and only if, there is an empirical method for deciding whether it is true or false; if no such method is given, it is an insignificant pseudo-proposition.
Relevant answer
What you are referring to when saying "There could in fact be multiple statements inherently logical for the explanation / interpretation of a certain phenomenon" is what philosophers of science call 'the underdetermination of theory by data'. That is, the same phenomenon can be explained by multiple theories, which are often incompatible with each other. This underdetermination increases when our theories postulate entities or processes that are unobservable. Thus, given the nature of the scientific method, and especially the inherent problem of underdetermination, no philosopher of science (not even the positivists) ever endorsed verificationism. This is because, whether one is dealing with universal laws or statistical laws, the verification of a hypothesis would require that all possible cases covered by that hypothesis be tested and that each of these tests confirm the hypothesis. But it is impossible to test all possible cases, so the best we can have is a high degree of confirmation. One must recall that, for the logical positivists, the only real statements that could be verified were either analytic statements, whose truth could be established a priori via a simple analysis of the relation between subject and predicate, or observation statements, whose truth could be established by comparing the statement with a direct observation. But scientific laws and hypotheses meet neither of these criteria, because they explain phenomena by reference to theoretical entities (which are unobservable by definition) and because they cover all possible cases of the phenomena in question (which cannot all be observed by definition). In The Philosophical Foundations of Physics, Rudolf Carnap, himself one of the great proponents of logical positivism, argues precisely this point by stating: "At no point is it possible to arrive at complete verification of a law. In fact, we should not speak of 'verification' at all - if by the word we mean definitive establishment of truth - but only of confirmation."
  • asked a question related to History and Philosophy of Science
Question
71 answers
Should hypotheses always be based on a theory? I will provide an example here without variable names. I am reading a paper where the authors argue that X (an action) should be related to Y (an emotion). In order to support this argument the authors suggest that when individuals engage in X, they are more likely to feel a sense of absorption and thus they should experience Y. There is no theory here to support the relationship between X and Y. They are also not proposing absorption as the mediator. They are just using this variable to explain why X should lead to Y. Would this argument be stronger if I used a theory to support the relationship between X and Y? Can someone refer me to a research paper that emphasizes the need for theory driven hypotheses? Thanks!
Relevant answer
Answer
A hypothesis is a tentative proposition, or posit, based on knowledge insufficient to be sure that it is factual. A hypothesis is proposed for testing.
If much testing affirms the correctness of a hypothesis, and it is generally accepted, it can then come to be accepted as a theory. However, theories can still be challenged, and they may be modified or even discarded altogether if much contrary knowledge is acquired and presented.
If a theory is rock-solid and apparently beyond any dispute, it can be accepted as a law. There are laws in physics, for example. However, laws are very scarce, or non-existent, in other disciplines such as biology.
Paradigms are also interesting, if you are keen. They are, very roughly, generally accepted principles within which research is conducted, but they may be overthrown and replaced by a new paradigm during a scientific revolution.
Note that in non-scientific language, in common speech, even an idea or a train of thought may commonly be referred to as a 'theory'; the word hypothesis is not generally known or used, and law is usually used only to refer to the legal system.
I hope this helps Alex,
Regards,
Keith
  • asked a question related to History and Philosophy of Science
Question
94 answers
I am quite surprised that everybody says Galileo was the first to scientifically describe the relativity of motion, contrary to the fact that at least Copernicus did so earlier, and in quite explicit form:
"Every observed change of place is caused by a motion of either the observed
object or the observer or, of course, by an unequal displacement of each. For when things move with equal speed in the same direction, the motion is not perceived, as between the observed object and the observer."
NICHOLAS COPERNICUS OF TORUŃ, THE REVOLUTIONS OF THE HEAVENLY SPHERES, 1543.
I am also surprised, from time to time, by statements that it was Galileo who proposed the heliocentric system.
It's an interesting case of the distortion of historical facts. Any thoughts, or other examples of similar injustice? Why does it take place?
Relevant answer
Answer
Apart from the issues of the relativity of motion and the heliocentric picture of what we now call the solar system, you ask why injustices of this kind take place. That's not really a philosophical question, but my unphilosophical answer is that the professionals and academics who pontificate en passant upon the history of science have usually not done their homework, nor do they care to do it, preferring to pass on whatever gossip or confabulation fits their rhetorical contrivance of the moment. It's even worse than you might think. If you have written serious scientific review articles, you will find that the "findings" of what are considered to be important scientific papers are regularly mis-described in "the literature". Authors who refer to other authors very often also don't care to do their homework. Academia has become pretty shoddy—much of what's produced is bad journalism, and there's not much lower than that! But ignorant comments on the history of science are particularly ubiquitous because so many contemporary scientists (and philosophers) think of earlier science as just a rambling, obsolete course of misguided and inept fumbling that is of no real interest — now that we know the truth! — although frequently tempted by the urge to identify among their forbears some "good guys" or "bad guys" for rhetorical purposes. As the French would say, "It's not serious".
  • asked a question related to History and Philosophy of Science
Question
277 answers
This refers to the recent experiments of Radin et al.:
1) D. Radin, L. Michel, K. Galdamez, P. Wendland, R Rickenbach and A. Delorme
Physics Essays, 25, 2, 157 (2012).
2)  D. Radin, L. Michel, J. Johnston and A. Delorme, Physics Essays, 26, 4, 553 (2013).
These experiments show that observers can affect the outcome of a double-slit experiment, as evidenced by a definite change in the interference pattern.
It requires urgent attention from the scientific community, especially Physicists.
If these observed effects are real, then we must have a scientific theory that can account for them.
Relevant answer
Answer
@Rajat.
I don't know to which point of the debate the queer thoughts of your first paragraph are meant to contribute.
For science it is not a problem to be faced with empirical facts for which an explanation is presently out of reach. During most of the nineteenth century, astronomers knew that no chemical reaction could deliver the radiative power observed to originate in stars. One had to wait for the discovery of atomic nuclei, and a preliminary understanding of their internal workings, before the radiation of stars could be explained. If there are clear facts, science will find an explanation, perhaps only after a few hundred years.
Pseudoscience is characterized by the absence of clear facts. Pseudoscientists have enough knowledge of science to impress uncritical people, but are unable to arrange experiments and observations in a manner such that repetition by independent groups reproduces the original findings.
  • asked a question related to History and Philosophy of Science
Question
25 answers
I'm interested in comparing Indigenous research methods with other ancient cultures. Indigenous research methods are relatively well documented for Australian Aboriginals, New Zealand Maori and North American Indians. I was hoping to locate examples of other non-Western (non-Eurocentric) research methods used by cultures, such as China, Africa, South America, India etc. For example, what methodology did the Chinese use to develop their knowledge of Chinese medicine? I realise these methods may not have been documented or may be in a non-English language. Any leads would be helpful at this stage.
Relevant answer
Though I am not a specialist in ancient science, as an Egyptologist I can recommend some references for medicine and other fields, for instance J. F. Nunn, 'Ancient Egyptian Medicine', where you can easily find the medical procedures and knowledge of the ancient Egyptians. You can also find some remarks in:
-N. Baum, "L'organisation du règne végétal dans l'Égypte ancienne...", in: S. Aufrère (ed.), 'Encyclopédie religieuse de l'univers végétal de l'Égypte ancienne I', Montpellier, pp. 421-443, 1999.
-N. Beaux, 'Le cabinet de curiosités de Thoutmosis III. Plantes et animaux du "jardin botanique" de Karnak', Leuven, 1990.
-S. Uljas, "Linguistic Consciousness", in: UEE, available at: https://escholarship.org/uc/item/0rb1k58f
Of course, some interesting remarks are available in the classical work of C. Lévi-Strauss, 'La pensée sauvage'.
I hope this can be useful for you.
Regards
  • asked a question related to History and Philosophy of Science
Question
126 answers
While scientific cosmology rarely appears in the work of Karl Popper, it is nevertheless a subject that interested him. The problem now is whether the falsifiability criterion can be applied to cosmological theories.
For instance, there are certain ideas in cosmology which have never been refuted, yet the same methods are used over and over despite their lack of observational support: for instance the multiverse idea (often used in string theory) and the Wheeler-DeWitt equation (often used in quantum cosmology).
So do you think Popperian falsifiability can be applied to cosmology as well? Your comments are welcome.
Relevant answer
Answer
Clifford,
Apparently, your answer to my question is negative, namely that Popper's falsifiability is useless in the exact sciences. I fully concur with this conclusion if: 1) falsifying a theory is equivalent to refuting it, and 2) a refuted theory must be taken out of circulation. Popper stated the first, while the second appears to be a generally accepted implication. If the latter is not true, what is the purpose of applying falsifiability to physics? However, if the implication is correct, it contradicts the whole history of physics, which shows that nothing dramatic happened to a theory that did not agree with a certain experiment. Physicists continued using it, even unmodified, in the areas (or under conditions) where such disagreements do not occur. And frequently they were able to modify the theory so as to explain not only the experiment in question but a range of other phenomena.
Apparently, Popper reasoned as follows: AFTER a new theory is published, someone offers a new experiment which the theory cannot explain, and therefore it is refuted. In reality, many authors were aware of such exceptions even BEFORE publishing their theories. Nonetheless, they proceeded with their publications, because they believed that a theory which explained even a few phenomena had a right to exist. They hoped that future developments would extend the range of application of their theories.
Here is an example. Thomas Young's 1801 paper had the very general title "On the Theory of Light and Colours". Yet he did not plan to explain ALL phenomena of light and colours in that paper. In fact, he then applied his theory to only three colour phenomena: those produced by parallel scratches on glass, by thin films, and by thick, imperfectly polished glass plates. In subsequent papers he extended his theory to a few more phenomena; then Fresnel and Arago added some more, even without modifying the theory.
One can multiply such examples at will, and the general conclusion is that the concept of falsifiability was useless in the older physics. Incidentally, Popper originally introduced the concept to distinguish "scientific" theories from "non-scientific" ones, such as astrology or the Marxist theory of history, which is not the same as separating "true" physical theories from "false" ones.
  • asked a question related to History and Philosophy of Science
Question
23 answers
          My objective is to accumulate and demonstrate irrefutable physical evidence that the existing definitions of software components and of CBSE/CBSD are fundamentally flawed. Today, no computer science textbook introducing software components and CBSD (component-based design for software products) presents the assumptions (i.e. first principles) that led to these flawed definitions.
In real science, anything without irrefutable proof is an assumption. What are the undocumented scientific assumptions (or first principles) at the root of computer science that resulted in fundamentally flawed definitions of so-called software components and CBD (component-based design) for software products? Each definition of each kind of so-called software component has no basis in reality and clearly contradicts the facts we know about the physical functional components used in the CBD of physical products. What undocumented assumptions forced researchers to define the properties of software components without any consideration of the reality and facts we all know about physical functional components and the CBD of physical products?
I believe that, except for computer science or software engineering textbooks introducing software components and CBSD, the first chapter of a textbook in any other scientific discipline discusses the first principles at the root of that discipline. Each definition and concept of the discipline is then derived from those first principles, from observations (including empirical results), and by sound rational reasoning. For example, any basic science textbook for school children starts by teaching that "Copernicus discovered that the Sun is at the center". This is one of the first principles at the root of our scientific knowledge, so if it were wrong, a large portion of our scientific knowledge would be invalid.
I have asked countless experts why we need a different, new description (i.e. definitions and/or lists of properties) for software components and CBSD, when the new description, properties and observations clearly contradict the facts, concepts and observations we know about physical functional components and the CBD of large physical products (having at least a dozen physical functional components). I was given many excuses, such as: software is different/unique, or it is impossible to invent software components equivalent to physical functional components.
All such excuses are mere undocumented assumptions. I can find no evidence that anyone ever validated them. Such assumptions must be documented, yet no textbook or paper on software components even mentions the baseless assumptions it relies on to conclude that each kind of useful part is a kind of software component (for example, that reusable software parts are a kind of software component). CBD for software is then defined as using such fake components. Using highly reusable ingredient parts (e.g. plastic, steel, cement, alloy, or silicon in wafers) is not CBD. If anyone asks ten different experts for a definition or description of software components, he gets ten different answers, none with any basis in the reality we know about physical components. Only God has more mysterious descriptions, as if no one alive has ever seen a physical functional component.
The existing descriptions and definitions of so-called CBSD and so-called software components were invented out of thin air (based on wishful thinking) by relying on such undocumented myths. Today many experts defend these definitions by treating the undocumented myths as inalienable truths of nature, not much different from how researchers defended epicycles by relying on the assumption "the Earth is static" until 500 years ago. Moreover, most of the concepts of CBSD and software components created during the past 50 years were derived from these fundamentally flawed definitions, whose properties and descriptions are rooted in undocumented and unsubstantiated assumptions.
Is there any proof that it is impossible to invent real software components equivalent to physical functional components, for achieving real CBSD (CBD for software products), where real CBSD is equivalent to the CBD of large physical products (having at least a dozen physical functional components)? No proof exists that such assumptions are accurate, so it is wrong to rely on them. It is a fundamental error if such assumptions (i.e. first principles) are not documented.
I strongly believe such assumptions must be documented in the first chapters of the respective scientific disciplines, because that keeps them on the radar of our collective consciousness and compels future researchers to validate them, for example when technology makes sufficient progress to do so.
I am not saying it was wrong to make such assumptions and definitions for software components 50 years ago. The huge error is not documenting the assumptions relied upon when making such different, new definitions (by ignoring reality and known facts). Those assumptions may have been acceptable, even true, 50 years ago (when computer science and software engineering were in their infancy and assembly language and FORTRAN were the leading-edge languages), but are they still valid? If each first principle (i.e. assumption) is a proven fact, who proved it and where can I find the proof? Such information must be presented in the first chapters.
In real science, anything without irrefutable proof is an assumption. Are these undocumented, unsubstantiated assumptions facts? Don't computer science textbooks on software components need to document proof for such assumptions before relying on them to define the nature and properties of software components? All the definitions and concepts for software components and CBSD could be wrong if the undocumented and unsubstantiated assumptions turn out to contain huge errors.
My objective is to provide physical evidence (i) that it is possible to discover accurate descriptions of physical functional components and of the CBD of large physical products (having at least a dozen physical functional components), and (ii) that, once those accurate descriptions are discovered, it is not hard to invent real software components (satisfying the accurate description of physical functional components) for achieving real CBSD (satisfying the accurate description of the CBD of physical products).
It is extremely hard to expose an error at the root of a deeply entrenched paradigm such as CBSE/CBSD (evolving for 50 years) or the geocentric paradigm (which evolved for 1000 years). For example, the assumption "the Earth is static" was considered an inalienable truth (not only of nature but also of God/the Bible) for thousands of years, yet it turned out to be a flaw that sidetracked the research efforts of countless researchers in the basic sciences into a scientific crisis. Now we know that no meaningful scientific progress would have been possible if that error had not been exposed. The only possible way to expose such an error is to show physical evidence, even if most experts refuse to see it, by finding the few experts who are willing to examine the evidence with an open mind.
I have a lot of physical evidence and am now in the process of building a team of engineers, and the necessary tools, for building software applications by assembling real software components to achieve real CBSD (e.g. for achieving the CBD-structure http://real-software-components.com/CBD/CBD-structure.html by using the CBD-process http://real-software-components.com/CBD/CBD-process.html). When our tools and team are ready, we should be able to build any GUI application by assembling real software components.
In real science, anything without irrefutable proof is an assumption. Any real scientific discipline must document each of the assumptions (i.e. first principles) at its root before relying on them to derive concepts, definitions and observations (which can be considered accurate only if the assumptions are proven true):  https://www.researchgate.net/publication/273897031_In_real_science_anything_not_having_proof_is_an_assumption_and_such_assumptions_must_be_documented_before_relying_on_them_to_create_definitionsconcepts
I have tried writing papers and giving presentations to explain the error, but none of that worked. I learned the hard way that this kind of complex paradigm shift cannot happen in a couple of hours of presentation or by reading 15-to-20-page papers. The only way left for me to expose the flawed first principles at the root of a deeply entrenched paradigm is to find experts willing to see physical evidence, and then show it to them: https://www.researchgate.net/publication/273897524_What_kind_of_physical_evidence_is_needed__How_can_I_provide_such_physical_evidence_to_expose_undocumented_and_flawed_assumptions_at_the_root_of_definitions_for_CBSDcomponents
So I am planning to work with willing customers to build their applications, which gives us a few weeks to a couple of months to work with them, building their software by identifying self-contained features and functionality that can be designed as replaceable components to achieve real CBSD.
How can I find experts or companies willing to work with us and see the physical evidence, for example by allowing us to work with them to implement their applications as a CBD-structure? What kind of physical evidence would be compelling to anyone willing to give us a chance (at no cost to them, since we can work for free to provide compelling physical evidence)? I have failed so many times in this complex effort that I am not sure what could work. Could this work?
Best Regards,
Raju
Relevant answer
Answer
Raju,
"When any scientific discipline was in infancy, researchers are forced to make assumptions."
We are always making assumptions, infancy or not. You've mentioned the geocentric model of the universe. It is as valid an assumption as the non-geocentric model. Although I won't try it (I'm guessing it would take a while), it is probably possible to build the whole of physics on the geocentric model with no significant loss of accuracy. It's numbers, and you can "engineer" your way to whatever assumption you want to believe. The reason the non-geocentric model is accepted is simply that it makes more intuitive sense. So common sense is the great decider.
Science is not the art of truth; it's the art of the accurate. Better science does not mean you are closer to the truth (you would have to know the truth to be able to claim that!); it just means you are able to model a phenomenon more accurately. Nowadays modern societies seem to elevate science almost to religious status (we keep assuming dogmas, only this time with university degrees). Personally, I see it as a fork: I know it, I use it, and then I wash it for the next meal.
Beyond the philosophical ideas, I would suggest you define a few specific points open to debate. It is very complicated to debate such a broad subject; it's too vague. I've read your comments and I still can't put my finger on exactly what you are trying to reform (in good part, maybe, because I don't have enough skill in some of the areas you're touching).
  • asked a question related to History and Philosophy of Science
Question
15 answers
I am looking for information on the history of the development of statistical significance formulae, the mathematical calculations and why they were chosen.
I would also like to learn the same about effect size.
Thanks!
Relevant answer
Answer
Depends what you mean by statistical significance. Several books cover the history of statistics and probability pre-1900 (Stigler's 'The History of Statistics' and Hacking's 'The Emergence of Probability' being two of the best known). For more on the past 100 years, Gigerenzer's 'The Empire of Chance' is excellent. There are others, but I'll let other commentators list their favorites.
If you mean statistical significance as the approaches of Fisher, Neyman, Pearson, etc., to hypothesis testing and p-values, their papers are available, along with much discussion of them (Gigerenzer is the author of a paper on this, something like the Id, Ego, and Super-Ego of statistical reasoning; it's here: http://www.mpib-berlin.mpg.de/en/institut/dok/full/gg/ggstehfda/ggstehfda.html). Another is Lehmann's 'Fisher, Neyman, and the Creation of Classical Statistics'.
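Not part of the historical question, but for readers who want to see the formulae whose history is being asked about, here is a minimal stdlib-only Python sketch of the two calculations the question names: a Student's t statistic (significance) and Cohen's d (effect size). The sample data are invented purely for illustration.

```python
import math
import statistics

def two_sample_t(a, b):
    """Student's t statistic for two independent samples (equal-variance form)."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    se = math.sqrt(pooled_var * (1 / na + 1 / nb))
    return (statistics.mean(a) - statistics.mean(b)) / se

def cohens_d(a, b):
    """Cohen's d: difference of means standardized by the pooled SD."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(pooled_var)

# Invented example data, for illustration only.
a = [5.1, 4.9, 5.3, 5.0, 5.2]
b = [4.5, 4.7, 4.4, 4.6, 4.8]
print(round(two_sample_t(a, b), 2), round(cohens_d(a, b), 2))  # t = 5.0, d ≈ 3.16
```

The t statistic is then compared against the t distribution with n_a + n_b − 2 degrees of freedom to obtain a p-value; that final step (the tabulated distribution) is exactly the part whose history Fisher, Neyman and Pearson shaped.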
  • asked a question related to History and Philosophy of Science
Question
64 answers
     
It is known that physics is an empirical science, in the sense that all propositions should be verified by experiment. But Bertrand Russell once remarked that the principle of verifiability itself cannot be verified, and therefore cannot be considered a principle of science.
In a 1917 paper, Russell suggested sense-data as a way around the problem of verifiability in physics (http://selfpace.uconn.edu/class/ana/RussellRelationSenseData.pdf), but he later changed his mind; see http://www.mcps.umn.edu/philosophy/12_8savage.pdf
So what do you think? Is there a role for sense-data in the epistemology of modern physics?
Relevant answer
Answer
Yes, I always find Tim good at lucid exposition of a problem and its origins. I am less sure that he is good at a radical solution, but that's more difficult!
  • asked a question related to History and Philosophy of Science
Question
5 answers
Section II of “The fixation of belief” [2] opens dramatically with a one-premise argument—Peirce’s truth-preservation argument PTPA—concluding that truth-preservation is necessary and sufficient for validity: he uses ‘good’ interchangeably with ‘valid’. He premises an epistemic function and concludes an ontic nature.
The object of reasoning is determining from what we know something not known.
Consequently, reasoning is good if it gives true conclusions from true premises, and not otherwise.
Assuming Peirce’s premise for purposes of discussion, it becomes clear that PTPA is a formal fallacy: reasoning that concludes one of its known premises is truth-preserving without “determining” something not known. It is conceivable that Peirce’s conclusion be false with his premise true [1, pp. 19ff].
The above invalidation of PTPA overlooks epistemically important points that independently invalidate PTPA: nothing in the conclusion is about reasoning producing knowledge of the conclusion from premises known true: in fact, nothing is about premises known to be true, nothing is about conclusions known to be true, and nothing is about reasoning being knowledge-preservative.
The following is an emended form of PTPA.
One object of reasoning is determining from what we know something not known.
Consequently, reasoning is good if it gives knowledge of true conclusions not among the premises from premises known to be true, and not otherwise.
PTPA has other flaws. For example, besides being a formal non-sequitur, PTPA is also a petitio principii [1, pp. 34ff]. Peirce's premise not only isn't known to be true—which would be enough to establish question-begging—it's false: reasoning also determines consequences of premises not known to be true [1, pp. 17f].
[1] JOHN CORCORAN, Argumentations and logic, Argumentation, vol. 3 (1989), pp. 17–43.
[2] CHARLES SANDERS PEIRCE, The fixation of belief, Popular Science Monthly, vol. 12 (1877), pp. 1–15.
Q1 Did Peirce ever retract PTPA?
Q2 Has PTPA been discussed in the literature?
Q3 Did Peirce ever recognize consequence-preservation as a desideratum of reasoning?
Q4 Did Peirce ever recognize knowledge-preservation as a desideratum of reasoning?
Q5 Did Peirce ever retract the premise or the conclusion of PTPA?
Relevant answer
Answer
One shouldn't give an argument about what "good reasoning" is. One should just stipulate a definition, and leave it at that.
For what it's worth, however, perhaps Peirce didn't take himself to be giving an argument about what "good reasoning" is, and his "premise" was just his unorthodox way of stipulating a definition of "good reasoning," so that, by "the object of reasoning," he meant "that which would make reasoning good." On this reading, the purpose of his conclusion is simply to point out a CONSEQUENCE of his definition of "good reasoning" -- namely, that valid reasoning is necessary and sufficient for good reasoning. On the face of it, THIS ARGUMENT -- from the definition to the consequence of the definition -- might seem to be invalid; yet it's hard to tell without knowing what he meant by "determining," etc.
  • asked a question related to History and Philosophy of Science
Question
115 answers
In The Nature of the Physical World, Eddington wrote:
The principle of indeterminacy. Thus far we have shown that modern physics is drifting away from the postulate that the future is predetermined, ignoring rather than deliberately rejecting it. With the discovery of the Principle of Indeterminacy its attitude has become definitely hostile.
Let us take the simplest case in which we think we can predict the future. Suppose we have a particle with known position and velocity at the present instant. Assuming that nothing interferes with it we can predict the position at a subsequent instant. ... It is just this simple prediction which the principle of indeterminacy expressly forbids. It states that we cannot know accurately both the velocity and the position of a particle at the present instant.
--end quotation
According to Eddington, then, we cannot predict the future of the particular particle beyond a level of accuracy related to the Planck constant (We can, in QM, predict only statistics of the results for similar particles). The outcome for a particular particle will fall within a range of possibilities, and this range can be predicted. But the specific outcome, regarding a particular particle is, we might say, sub-causal, and not subject to prediction. So, is universal causality (the claim that every event has a cause and when the same cause is repeated, the same result will follow) shown false as Eddington holds?
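For concreteness, the quantitative bound Eddington alludes to (not stated explicitly in the question) is the standard position-momentum uncertainty relation:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}.
```

Any simultaneous specification of position and momentum sharper than this bound is forbidden, so the classical initial data needed for an exact point prediction cannot even be defined; only the statistical distribution of outcomes is predictable.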
Relevant answer
Answer
Social science has always followed the mothership of science, physics. But in the last few decades, especially post-Heisenberg, physics has become comfortable with the quantum worldview, while social science researchers (who study or build our perception of causality) are still stuck in the classical Newtonian worldview and therefore have a tough time grappling with the physical-science discoveries around causality. Five main aspects make this difficult:
1) Different observers perceive space-time characteristics differently (theories of relativity).
2) Unified theories could exist (unified field theory, string theory, ...).
3) Effects with no physical/observable medium of influence (quantum entanglement).
4) Inseparability of microcosm and macrocosm (single-electron universe).
5) No measurement, or only probabilistic measurement (deterministic quantum physics): observation may not always translate into measurement. That is OK; even if the instruments cannot measure, an observation can still be useful.
Consequently, social science research outputs a narrow view of reality: the part of reality that is measurable through classical means driven by the Newtonian paradigm. Since reality has many more dimensions, experiential learning and practice are becoming divorced from social science research because of this bottleneck in the Newtonian cognitive framework of causality.
Attached is an 1876 English translation of a classical Sanskrit text, which was written to know reality through substance (not matter alone) and reasoning. We are trying to bring that into research methodologies for a better view of causality. I hope this is useful.
  • asked a question related to History and Philosophy of Science
Question
236 answers
It is true that in ancient times mathematics was done through argumentation, discourse and rhetoric. The volumes of Euclid's Elements contain no symbols describing the behavior of properties at all, except for the geometric objects themselves. The symbols of arithmetic (=, +, −, ×, ÷) were created between the 15th and 17th centuries, which most people find hard to believe. The equality sign "=" was introduced in print by Robert Recorde in 1557, "+" and "−" appeared in print in the late 15th century, the multiplication sign "×" was created in 1631, and the division sign "÷" in 1659. It is contrary to most people's beliefs how recent the creation of these symbols is.
It is because of this lack of symbols that mathematics did not develop as fast as it has since symbols were introduced, making representation, the writing of expressions and algebraic manipulation handy, enjoyable and easy.
These things paved the way for mathematics to progress into a galaxy, to become a galaxy of mathematics. What is your take on this issue, and what is your expertise on the chronology of symbol creation and the advances mathematics made because of it?
Joseph Mazur, "Notation, notation, notation: a brief history of mathematical symbols", Science, theguardian.com.
Relevant answer
Answer
Leibniz was the master of symbol creation!  He created symbols that packaged meaning,  helped cognition, stimulated generalization, and eased manipulation.  He thought about them with care before committing to their use.  William Oughtred invented hundreds of new symbols, but hardly any of them are still in use.  Goes to show that willy-nilly made symbols don't have a good survival rate, for good reasons.
  • asked a question related to History and Philosophy of Science
Question
148 answers
The British astrophysicist, A.S. Eddington wrote (1928), interpreting QM, "It has become doubtful whether it will ever be possible to construct a physical world solely out of the knowable - the guiding principle of our macroscopic theories. ...It seems more likely that we must be content to admit a mixture of the knowable and the unknowable. ...This means a denial of determinism, because the data required for a prediction of the future will include the unknowable elements of the past. I think it was Heisenberg who said, 'The question whether from a complete knowledge of the past we can predict the future, does not arise because a complete knowledge of the past involves a self-contradiction.' "
Does the uncertainty principle imply, then, that particular elements of the world are unknowable, - some things are knowable, others not, as Eddington has it? More generally, do results in physics tell us something substantial about epistemology - the theory of knowledge? Does epistemology thus have an empirical basis or empirical conditions it must adequately meet?
Relevant answer
Answer
Jerzy, Ray Streater is relatively well-known as a spokesperson for a community of people who do not believe in any shade or stripe of wave function realism (we're not even talking of wave function monism here.)
It is then already a matter of interpretation.
Sane & sound people will rather convincingly argue that a Schrödinger equation legitimately can apply to several variables, and that there is ample experimental evidence for that - which is however basically what Streater disputes if we follow his line of reasoning.
What I'm trying to say is that this line of argument is very far from being cut and dried.
  • asked a question related to History and Philosophy of Science
Question
27 answers
Many scientists differentiate the hard physical sciences from philosophy; some even say "that's not science, it's philosophy". Are they missing the point in a big way?
Relevant answer
Answer
There are philosophies that apply to all sciences (e.g. how to define, design and conduct a field experiment to test a hypothesis X, whatever the research domain; e.g. Hurlbert 1984; Ecology, Psychology, Human Sciences, Ethology, Political Sciences, Behaviour, etc.....) and there are philosophies that are specific to each research domain (e.g. how to define, design and conduct a playback experiment to test the messages and meanings of bird song in a single model species; Behavioral Ecology, Ornithology).
  • asked a question related to History and Philosophy of Science
Question
21 answers
I recently published my book "The Origin of Science" which can be downloaded at https://www.researchgate.net/profile/Louis_Liebenberg/publications/ I am interested in alternative theories on the origin of science and how this debate can lead to a better understanding of how our ability for scientific reasoning evolved.
Relevant answer
Answer
Hi Mike
Hunter-gatherers not only develop applied science, but also developed knowledge for the sake of knowledge. For example, the /Gwi Bushmen of the Kalahari have eleven species-specific names for ants, including the velvet ant (a wingless wasp), and termites. They have developed a level of detail in their knowledge of ants that far exceed the practical requirements of hunting. But as you point out, that fact that we can store more knowledge than is immediately relevant prepares us for unforeseen eventualities in the future.
I think the political dimension in modern science may well stifle creative innovation by limiting academic freedom. Government funding may make it possible to get a lot of research done in the sense of Kuhn's "normal science" - but creative innovation requires institutions to allow researchers academic freedom, or alternatively creative individuals may choose to work independently. Hunter-gatherers allow a large degree of "academic freedom" in the interpretation of animal tracks and signs - ultimately it is the predictive value of hypotheses that results in successful hunts.
  • asked a question related to History and Philosophy of Science
Question
40 answers
Is it reasonable to use these terms?
A number of papers have been published a long time ago, but still have many citations.
If it is possible, then is it predictable?
Can citation count be a suitable measure for judging the useful age of a paper?
Which papers have a longer useful lifetime, or a later expiry date?
Thanks for your inputs.
Relevant answer
Answer
Why is it so important? Good scientists are like poets: they HAVE TO do research and publish (internal motivation). If the only goal is to get citations, it is narcissism, not science. Mendel was forgotten and his findings had to be rediscovered later - but that does not change the fact that he was a great scientist.
  • asked a question related to History and Philosophy of Science
Question
10 answers
Back in my 2nd semester, I still remember those bored faces trying to hide their yawning during lectures on the History of Science. But I found it unexpectedly interesting. Learning the manner of approach of ancient philosophers and naturalists was quite exciting. But it seems to me that most students neglect this valuable subject because their minds are preoccupied with the notion that most of the things taught in this course - their manner of thinking about the cosmos and the earth, their perspective on health and medicine - are almost apparent to everybody. But what they miss is what they ought to learn: their hard work, their practices, their mode of approach, their determination and dedication in those days when everything seemed mysterious, when nothing was yet apparent.
So what more can we learn from our forefathers? And how can this subject be popularized esp. among youngsters?
Relevant answer
Answer
I think much depends on the professor teaching the subject. It can be boring, like anything else. For me the most interesting aspect is that science is as much a part of the culture as politics, economics, philosophy, law, religion or the arts. When studying long-range processes it is very interesting to observe the parallels between these phenomena. For me the most important consequence of studying the history of science was to realize that our current views of the world are but temporary interpretations. That does not mean they are meaningless or purely subjective, but that they are deeply historical, in a permanent state of transition. That makes scientists humbler. Another important aspect is that several things that seem very new have already been invented - sometimes only conceptually. There is a lot to learn from earlier approaches, and a lot of inspiration can be gained. In our present rush for impact factors and money we rarely have time to study earlier science. Unfortunately most of us (including myself) do not speak ancient languages, and the translations are necessarily interpretations in a modern language. Most of the historians who master these languages, however, are not (natural) scientists, so they do not necessarily recognize the significance of what they read. There are rare exceptions, however.
  • asked a question related to History and Philosophy of Science
Question
8 answers
Is there a relationship between history of science and philosophy of science?
Relevant answer
Answer
My dear Ourides, too bad for those who cannot understand Portuguese. It was with pleasure that I read your answer. I have just finished reading Feyerabend's "Farewell to Reason" (Editora UNESP). I will look for at least some of the books you mention, since from Koyré I only know "From the Closed World to the Infinite Universe", and from Bachelard I have read nothing beyond what Abril published in the "Os Pensadores" collection.