All replies (1)

Muhammad Ali Hasnain
National University of Sciences and Technology
Rationalism refers to rational behavior, and it can be based on many kinds of evidence sources, such as testimony, history, and empirical evidence. "Critical" entails carefully analyzing something to the extent required. From there we can understand critical rationalism to mean that a person applies rationality to their thinking, using different kinds of evidence as the basis for their analysis.
On the other hand, empiricism entails a demand for empirical evidence, and skepticism means radically questioning something. Joining the two terms, empirical skepticism means that a person persistently questions something while accepting only empirical evidence as an answer.
From this understanding we can conclude that critical rationalism is the broader concept and empirical skepticism is a type of critical rationalism, though the latter involves a somewhat stronger stance than the former.
I am not an expert in this field, nor do I know what these terms formally mean; I have derived this reading from the names themselves, and it seems sensible to me.
1 Recommendation

Similar questions and discussions

DESI Λ Constant Results: New physics or better measurements?
Question
11 answers
  • Emmanouil Markoulakis
Watched this report video here:
The latest results suggest a diminishing, variable cosmological Λ "constant" and dark energy, and thus that the expansion of the universe might be slowing down, especially over the last 7-8 billion years, as shown in the diagram of the DESI results provided in the video report.
However, this may not be new physics but simply more accurate measurements and methods. It seems to me that the absolute value of the cosmological Λ "constant" is essentially fixed at 1. The small dip to ≈0.98 could just be due to the limitations and inaccuracies of previous measurement methods. This could be checked by reapplying the old methods and seeing whether the dip fails to show up.
What is the argument against the view that these are simply better, more accurate measurements and observations compared to past measurements and methods, rather than the result of potentially new physics?
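One way to make this question concrete is a minimal back-of-the-envelope sketch (in Python, with purely illustrative numbers for the error bars, since I have not checked the actual DESI uncertainties): compare how many standard deviations the ≈0.98 dip represents under the old versus the new error budgets. If the dip is well under one sigma for the older methods, they simply could not have resolved it, and "better measurements" versus "new physics" become separable claims.

# Minimal sketch: significance of a ~2% dip under two error budgets.
# All numbers here are illustrative assumptions, not actual DESI values.

def dip_significance(measured, reference, sigma):
    # Deviation from the reference value in units of the measurement error.
    return abs(measured - reference) / sigma

reference = 1.00   # assumed "fixed" normalized value of the constant
measured = 0.98    # the reported small dip (from the video's diagram)

sigma_old = 0.03   # hypothetical error bar of the older methods
sigma_new = 0.005  # hypothetical error bar of the newer DESI-era methods

print(f"old methods: {dip_significance(measured, reference, sigma_old):.1f} sigma")
print(f"new methods: {dip_significance(measured, reference, sigma_new):.1f} sigma")

# With these assumed numbers the dip is ~0.7 sigma for the old methods
# (indistinguishable from noise) but ~4 sigma for the new ones, so the
# answer hinges entirely on the real error budgets of the two eras.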
Do you know a real way to synthesize the simplest living matter from minerals under natural and laboratory conditions?
Discussion
11 replies
  • Victor Ostrovskii
This paper generalizes research published between 2000 and 2023 in several dozen publications and presented as lectures and oral talks at about 30 international scientific conferences in about 20 countries. All publications relating to the problem of the origination of living matter are available on ResearchGate on Victor Ostrovskii's and Elena Kadyshevich's pages.
Challenge: Support Your Theory
Discussion
41 replies
  • Deleted profile
Interesting comment, posted on a recent research document.
So I've laid out a few steps of authentication, based on my mathematical and physics education and on consultation with various other associates.
This is something being studied for peer review and an upcoming conference. The challenge is this:
Lots of people on ResearchGate have their own unique theories. There is plenty of technical discussion about the technicalities that may invalidate these theories, but in reality, invalidating or authenticating such theories is a highly rigorous process, more akin to hard mathematics than to anything that can be settled verbally.
Below is a comment, posted under a ResearchGate paper, included for context and brevity.
After this context, a few standards for authentication are laid out, and I challenge anybody with their own theory to attempt to meet these standards.
Context:
"would like to extend congratulations to John, as anyone performing these calculations will also see as I did, that this theory is easily renormalizable at one Feynman loop, by my current calculations.
Anybody else who can verify this as well. It's either exactly at one loop, or around there, indicating high stability in versatile QM/GR scenarios and means it handles infinities that other popular theories such as String Theory Struggle with exceptionally well, among other implications.
It also means the applications of the actual elements of the framework structure are easily adaptable in many scenarios traditionally difficult, I.E, Early birth of the universe, large rotating black holes, ECT
This evidencing, that is part of a small group of theories, such as qed, string theory, loop Quantum gravity, and many others recently emerging, which have indicated high authentication rates for rigorous academic standards of how this would historically be addressed.
I'm investigating also, another colleague who seems to have a very robust framework indicating a similar confluence of being normalizablity, with just one Feynman loop also being currently calculated to renormalize his theory, this of course will take additional analysis from people beyond me, in the spirit of peer review.
We all need to remember that unilateral acceptance of a theory is unlikely due to the decentralized networks contrasting with what's allowed a theory such as quantum mechanics in general relativity to propagate.
I fully believe, that there is a range of unified theories possible, all based off of competent identification of similar mathematical principles, and general principles, with a range of uses and complexity all adhering to these principles based off of personal development and usage needs.
Zero sum thinking it is absurd in this matter, attempting to invalidate theories such as this based off of a small and minor inconsistency does not hold up to rigorous academic standards of how one would systematically and historically address how a theory could be considered a functioning Unified theory.
This type of thinking, with the cognitive dissonance that continues to refuse acknowledging that even theories like quantum mechanics and general relativity have inconsistencies and are still very valuable.
We could pretend to invalidate these Frameworks based off of a small and general technicalities as well. But this would be foolish, which is why the zero sum thinking is the bane of science. Imagine, if a logic of a small inconsistency in validating an entire framework, such as his common here on Research a, was applied to General relativity, seeing his quantum mechanics was already prevalent at the moment it came out.
Fact of the matter is, authenticating unified theories boils down to something more akin to hard mathematics, and cannot be invalidated by simple verbal English phrases of potential technical inconsistencies. It's far more advanced, and complicated than that, and no matter what you say, you're not going to be able to invalidate or supersede at the mathematical authentications needed to validate theories such as this,.
Again, if you apply that logic to conflicting theories such as quantum mechanics in general relativity, the argument becomes an inherently illogical. Especially if any point made to argue this is based on quantum mechanics or general relativity.
The dissonance, of when it is acceptable to ignore a certain technicalities, and when it isn't, based off of what other people are championing, is beyond ridiculous. If you apply this to even the inconsistencies and quantum mechanics in general relativity, you can pretend that all of our advancements in these areas in the past 40 years didn't matter.
We act as though just because things like certain inconsistencies in Quantum mechanics, General relativity, and string theory, some of the most major theories done integrate, that they're not still utilizable and good efforts. Seems to be a dissonance, and when this logic is applied to popular theories versus one that is developed by somebody less or known, or a less widely accepted theory.
There will not be one, once we start seeing the greater Mosaic of understanding will all move forward.
such as String Theory Struggle with, among other implications."
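As background for the "renormalizable at one Feynman loop" claim in the quote, here is what that concretely involves in a standard textbook case, φ⁴ theory (used purely as an illustration, not as the theory being discussed): the single one-loop divergence is absorbed into the coupling, leaving a finite running coupling,
\[
\beta(\lambda) \equiv \mu \frac{d\lambda}{d\mu}
  = \frac{3\lambda^{2}}{16\pi^{2}} + O(\lambda^{3}),
\qquad
\lambda(\mu) = \frac{\lambda(\mu_{0})}
  {1 - \frac{3\lambda(\mu_{0})}{16\pi^{2}} \ln(\mu/\mu_{0})}.
\]
Verifying a claim like the one quoted above would mean exhibiting the analogous beta function for the proposed theory and showing that all one-loop divergences are absorbed by a finite set of counterterms.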
In light of this, here are the standards I challenge people to meet when attempting to authenticate their own theories; post the results here if you want to:
Computational Verification:
1. Numerical Simulations:
Use of computational models to simulate theoretical predictions and compare them against experimental data (a minimal sketch of such a comparison follows this list).
Algorithmic Consistency: Ensuring that the algorithms used in simulations and calculations are robust and produce consistent results.
2. High-Energy Experiments
Particle Colliders: Utilizing facilities like the Large Hadron Collider (LHC) to test predictions about particle interactions at high energies. This also includes matching predictions against data from the experiments' repositories.
Detector Sensitivity: Ensuring that detectors are sensitive enough to observe rare or subtle phenomena predicted by the theory.
3. Standard Classical Experiments
Reproducibility: Experiments must be reproducible by independent researchers under the same conditions.
Precision Measurements: High precision measurements to test the predictions of the theory, such as those in electromagnetism and gravity.
4. Quantum Verification
Wave Function Analysis: Verifying that the theory’s predictions about quantum states and their evolution match experimental observations.
Entanglement and Superposition: Testing predictions about quantum entanglement and superposition through experiments like the double-slit experiment.
5. Computational Authentication
Feynman Loop Calculations: Performing and verifying one-loop (and higher-loop) Feynman diagram calculations to ensure the theory is renormalizable (a numerical illustration of one-loop running follows this list).
Renormalization Data: Comparing the renormalization data against known standards to ensure consistency.
Feynman Loop Validation: Validating the theory through detailed calculations of Feynman loops to ensure mathematical consistency.
6. Logical and Theoretical Framework Consistency
Group Theory and Symmetry: Ensuring that the theory adheres to established symmetries and group structures, such as those in the Standard Model (e.g., SU(3)xSU(2)xU(1)).
Lorentz Invariance: Maintaining Lorentz invariance in the regimes where special relativity holds.
7. Predictive Power and Experimental Validation
Predictions of New Phenomena: The theory should predict new phenomena that can be tested and potentially falsified by experiments.
Data Compatibility: Predictions must be compatible with existing experimental data, and any deviations must be accounted for and explained.
8. Peer Review and Publication
Publishing in Reputable Journals: The theory must be published in peer-reviewed journals where it can be scrutinized by the scientific community. Alternatively, to avoid gatekeeping, this can be done by consulting experts in the field within your peer network and having them verify the work in some documentable way.
Transparency and Collaboration: Maintaining transparency in methods and data, and encouraging collaborative efforts to test and validate the theory.
9. Research-Based Comparison
Comparison Against Known Models: Conducting research-based comparisons against known models that have low loop consistency to highlight the advantages of the new theory.
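As promised in item 1, here is a minimal sketch of a simulation-versus-data comparison; the models, the mock data, and the noise level are all synthetic stand-ins (a real test would use published experimental datasets). It generates mock data from a known model with Gaussian noise, then scores candidate theories by reduced chi-square.

# Minimal sketch of "Numerical Simulations" / "Data Compatibility":
# score two candidate models against mock data via reduced chi-square.
import numpy as np

rng = np.random.default_rng(0)

def known_model(x):        # stand-in for the established reference model
    return 2.0 * x + 1.0

def candidate_a(x):        # stand-in for the new theory's prediction
    return 2.0 * x + 1.05

def candidate_b(x):        # stand-in for a clearly wrong prediction
    return 1.5 * x + 2.0

x = np.linspace(0.0, 10.0, 50)
sigma = 0.3                # assumed measurement error
data = known_model(x) + rng.normal(0.0, sigma, x.size)

def reduced_chi2(model, x, data, sigma, n_params=2):
    resid = (data - model(x)) / sigma
    return np.sum(resid**2) / (x.size - n_params)

for name, model in [("candidate_a", candidate_a), ("candidate_b", candidate_b)]:
    print(f"{name}: reduced chi^2 = {reduced_chi2(model, x, data, sigma):.2f}")
# A reduced chi^2 near 1 indicates compatibility with the data; values
# much larger than 1 flag a model the data disfavors.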
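And, as noted in item 5, here is a small numerical illustration of what a one-loop renormalization-group check can look like in practice, again for textbook φ⁴ theory rather than any specific theory from this thread: integrate the one-loop beta function and confirm that the coupling runs smoothly and stays perturbative over the range of interest.

# Minimal sketch of a one-loop RG check for phi^4 theory: integrate
# d(lambda)/d(ln mu) = 3*lambda^2 / (16*pi^2) and compare against the
# exact one-loop solution. Purely illustrative.
import math

def beta(lam):
    return 3.0 * lam**2 / (16.0 * math.pi**2)

lam = 0.1                         # assumed coupling at the starting scale mu_0
t, t_end, dt = 0.0, 10.0, 1e-3    # t = ln(mu / mu_0)

while t < t_end:                  # simple Euler integration of the flow
    lam += beta(lam) * dt
    t += dt

exact = 0.1 / (1.0 - 3.0 * 0.1 / (16.0 * math.pi**2) * t_end)
print(f"numerical lambda(t=10): {lam:.6f}")
print(f"one-loop analytic:      {exact:.6f}")
# Agreement between the two confirms the one-loop running is consistent;
# the coupling stays perturbative, since the Landau pole sits at
# t = 16*pi^2/(3*lambda_0) ~ 526, far beyond the range integrated here.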
