
Logic - Science topic

An open group for the discussion of various logics and their applications
Questions related to Logic
  • asked a question related to Logic
Question
7 answers
It is common to affirm that "One can never perform any measurement whose result is an irrational number."
This is equivalent to saying, as the contrapositive, that anything that can be measured or produced is a rational number.
But the irrational number √2 can be produced to infinite length in finite steps, as 2×sin(45 degrees). It also exists like that in nature, as the diagonal of a unit square.
There is no logical mystery in these two apparently opposing views. Nature is not Boolean, a new logic is needed.
In this new logic, the statements 'p' and 'not p' can coexist. In the US, Peirce already said it. In Russia, the Setun computer used it.
This opens quantum mechanics to being logical, and sheds new light on quantum computation.
One can no longer expect that a mythical quantum "analog computer" will magically solve things by annealing. Nature is also solving problems algebraically, where there is no such limitation.
Gödel’s undecidability is Boolean, and does not apply. The LEM (Law of the Excluded Middle) falls.
What is your qualified opinion?
Relevant answer
Answer
More primary than logic, one can look at mereology. There, we can discern two factors governing uncertainty: perceptual and conceptual. Then we can resolve four characteristics of uncertainty -- they can be combined, first, in a Boolean mixture. This can be very illuminating, creating four classifications for any phenomenon, and thus a non-Boolean system is born. I detail this in my publications, and it supports text and math analysis very well. The four classifications are called DAOF.
When trust is added, we get a semiotics system with reference (symbol), sense (meaning), referent (object), and trust. Trust is what makes the DAOF classification possible.
This opens up uncertainty, as a mixture, using two possible systems, from mereology and from semiotics.
Therefore, one can begin to explore uncertainty, and understand the relativity of Gödel's and Heisenberg's uncertainties. It is not just dependent on the perceptual, but also on the conceptual. It is easier to deal first with the conceptual uncertainty.
We do not think and talk about what we see; we see what we are ABLE to talk about, cf. Edgar H. Schein.
Language is more important than we think. Russians can distinguish more tones of purple because they have words for them. Uncertainty is reduced.
  • asked a question related to Logic
Question
2 answers
knowing that the acid concentration and the temperature are optimal.
Increasing the reaction temperature from ambient to 80°C increased the leaching yield
Relevant answer
Answer
To the best of my knowledge, increasing the stirring speed increases the leaching yield when the leaching is diffusion-controlled. Increasing the stirring speed has no effect on the leaching yield when the leaching is chemical-reaction-controlled. So increasing the stirring speed can never have a negative effect on a leaching reaction.
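As a side note (a standard shrinking-core-model sketch, not part of the original answer; X is the leached fraction, t is time, and the k's are fitted rate constants), the rate-controlling step is usually diagnosed by checking which expression is linear in time:
X = k_f t   (diffusion through the fluid film -- the case where stirring speed matters)
1 - (2/3) X - (1 - X)^(2/3) = k_d t   (diffusion through the product layer)
1 - (1 - X)^(1/3) = k_r t   (surface chemical reaction)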
  • asked a question related to Logic
Question
10 answers
Right now, in 2022, we can read with perfect understanding mathematical articles and books
written a century ago. It is indeed remarkable how the way we do mathematics has stabilised.
The difference between the mathematics of 1922 and 2022 is small compared to that between the mathematics of 1922 and 1822.
Looking beyond classical ZFC-based mathematics, a tremendous amount of effort has been put
into formalising all areas of mathematics within the framework of program-language implementations (for instance Coq, Agda) of the univalent extension of dependent type theory (homotopy type theory).
But Coq and Agda are complex programs which depend on other programs (OCaml and Haskell) and frameworks (for instance operating systems and C libraries) to function. In the future, if we have new CPU architectures, then
Coq and Agda would have to be compiled again, and so would OCaml and Haskell.
For instance suppose a formal mathematics Agda file started with
{-# OPTIONS --without-K --exact-split #-}
Both software and operating systems are rapidly changing and have always been so. What is here today is deprecated tomorrow.
My question is: what guarantee do we have that the huge libraries of the current formal mathematics projects in Agda, Coq or other languages will still be relevant or even "runnable" (for instance type-checkable) without having to resort to emulators and computer archaeology 10, 20, 50 or 100 years from now ?
10 years from now will Agda be backwards compatible enough to still recognise
current Agda files ?
Have there been any organised efforts to guarantee permanent backward compatibility for all future versions of Agda and Coq ? Or OCaml and Haskell ?
Perhaps the formal mathematics project should be carried out within a meta-programming language, a simpler and more abstract framework (with a uniform syntax) comprehensible at once to logicians, mathematicians and programmers, and which can be converted automatically into the latest version of Agda or Coq?
Relevant answer
Answer
Clarence Lewis Protin `Writing "literate" .agda files in which code is mixed with an informal mathematical presentation of the formalisation seems the best option !`
There are software development practices akin to this. In the open-source projects that I have participated in (other than my own), comments are eschewed. There are variations on Literate Programming that some adopt, although it requires more mechanical support. In software, commentary helps differentiate what the software is for in contrast to what it is, establishing how the software is an interpretation of some conceptual model.
The risk used to justify the elimination of commentary is the problem of maintaining consistency between the commentary and the code as it is developed/maintained.
I have no objection to this approach, nevertheless. I find the commentary important for my own recollection of the purpose of some code and confirmation that the code accomplishes that.
I do not see how this methodology is a solution to the problem raised in this question though, unless it would be useful in maintaining the proof in the face of breaking changes up-level or down-level. Please say more about that.
  • asked a question related to Logic
Question
4 answers
Quarries have an important economic and social role, but they have repercussions on the natural framework (the environment). This calls for proposing alternative solutions to ensure the supply of basic materials to the construction and public works sector, taking into account the environmental dimension and the sustainability of resources, in accordance with the logic of governance in this matter.
Relevant answer
Answer
You may benefit from the case study attached here if you want to find policy interventions on the issue of sand extraction and its environmental, bilateral and political issues.
  • asked a question related to Logic
Question
5 answers
For example, when the instrument has been logically validated by some experts regarding its content validity, and the instrument will then be empirically validated (e.g. scale item validity) with several respondents, should the researchers determine the population of a specific group of people (e.g., in a certain district) and be concerned about the sampling method? Or could the instrument be randomly distributed to anyone without paying attention to a specific population?
If the instrument is empirically valid with respect to the population to which it was distributed, could the researcher claim that the instrument is valid in that population, and that it may not be valid in other populations?
Relevant answer
Answer
Validity is often assessed quantitatively in terms of correlation coefficients, and yes, it can involve inferential statistics to test whether the relevant correlations are "statistically significant." For example, in Campbell and Fiske's (1959) multitrait-multimethod (MTMM) approach to construct validation, correlations are used to examine convergent and discriminant validity. One of Campbell and Fiske's (1959) criteria for convergent validity is that the relevant monotrait-heteromethod correlations be statistically significant thus involving inferential statistics.
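As a minimal illustration of the correlation-plus-significance-test logic (a sketch with made-up scores, not data from Campbell and Fiske), the monotrait-heteromethod correlation between two methods measuring the same trait can be computed and tested directly:
# Python sketch: convergent validity as a monotrait-heteromethod correlation
# (hypothetical scores for one trait measured by two different methods)
from scipy import stats
trait_method1 = [12, 15, 11, 18, 14, 16, 13, 17, 15, 19]   # e.g., self-report
trait_method2 = [10, 16, 12, 17, 13, 18, 12, 16, 14, 20]   # e.g., observer rating
r, p = stats.pearsonr(trait_method1, trait_method2)
print(f"monotrait-heteromethod r = {r:.2f}, p = {p:.3f}")
# A large, significant r supports convergent validity; the full MTMM approach
# also compares it against the heterotrait correlations (discriminant validity).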
I would agree that having validated a measure in one population does not mean that the instrument is also valid in another population. Each distinct population may require a separate validation.
Reference
Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81–105. https://doi.org/10.1037/h0046016
  • asked a question related to Logic
Question
12 answers
On the graph of changes in air temperature, one can see that after an increase in air temperature of about 1 degree since 1997 and a decrease in air temperature in 1999, the temperature trend changed. The trend change was recorded immediately after the 1998 El Niño. This is logical, since El Niño affects the planet's climate. But for the level changes in the Atlantic and Pacific Oceans, after the rise and fall of sea level during the 1998 El Niño, the previous level-change trend continued; the trend change occurred with a delay of several years, around 2004. The graphs were taken from the Internet, from the public domain.
Relevant answer
Answer
The problem is indeed quite complicated because of the numerous factors influencing the shape of the ocean bottom, including post-glacial rebound. This issue was discussed over twenty years ago, and we currently have several good references; please kindly consult Long et al. (2006): Post Glacial Rebound Models and Relative Sea-level Observations from East Greenland. The paper is available on the RG platform.
  • asked a question related to Logic
Question
9 answers
If you consider the evolutionary law "the fittest survive", it means that whatever traits survive are labeled fittest just because they did survive (an empirical basis, not a logical one). Similarly, the law of dynamics means
"what accelerates is forced", e.g. the earth (gravity), a human (muscular force).
In this case, care was taken by Newton to avoid this objection. Force is action, and motion was considered action by Aristotle (and common sense). But he made it passive (the inertia property) and had a descriptive field devoted to it (kinematics) so that it will not appear as action.
Relevant answer
Answer
It is arguable that many informative scientific statements are contradictions. Take the idea of "instantaneous velocity." The two words are in deep tension with each other, if maybe not a direct contradiction. It depends on how sensitively we want to inquire. The instantaneous velocity vector is a direction and a magnitude, and the use of the word magnitude instead of speed is rather flimsy. Use of the word speed would make it clear that there can be no speed with no time to move. I offer that students tend to feel some dissonance and doubt about this idea... and my teachers' usual response was "get used to it, this is the truth."
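For reference, the standard textbook definition being questioned here (restated in LaTeX, not part of the original post) is
v(t) = \lim_{\Delta t \to 0} \frac{x(t + \Delta t) - x(t)}{\Delta t} = \frac{dx}{dt},
so the quantity is defined only through neighbouring instants; no motion occurs "at" the instant itself, which is exactly the tension described above.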
Well, I believe a better way is to frame this problem as a question. A question is a proposition that is not true or false, not both and not neither, which is the kind of proposition you need for something that is in deep tension. Also, it just feels better to ask "Can we describe a state of motion without the passage of time? If so, how?"
  • asked a question related to Logic
Question
9 answers
I do not find many astrophysicists and cosmologists engaging with the question of the probable origin or origins of the cosmos. Even when a work is purportedly on this theme, the authors beat around the bush so much that in the end they only have time to conclude with some suggestions. These suggestions are based only on the kind of reason that experimental astrophysicists use, and not on the general logic emerging from and beyond the cosmological theories...! What would be the philosophical and cosmological reasons behind this diffidence of scientists?
Raphael Neelamkavil
Relevant answer
Answer
Berndt Barkholz, thanks for the short intervention. Let me try to give a reply:
(1) We will never be satisfied with discovering all that belongs to the physics of the world of things near us. Cosmology is based on the same physical and astrophysical experiments and research that we do on earth, but extended to the case of the whole cosmos.
Thus, e.g., when we extend the Principle of Conservation of energy and momentum to the case of the whole cosmos, we can very well -- with enough rational surety stemming from the Special (and General) Theory of Relativity -- conclude that matter and energy are completely inter-convertible in all universes other than ours. The explanation for this may be purely "this-worldly-logical", but we have at least that!
(2) As per your example of the "1+1+..." method of educating children in arithmetic, the above manner of concluding should already belong to the theoretical and partially to the experimental elements (not advances or details) of physics today.
In this manner, we can extend to cosmology the many thermodynamic and other physical and astrophysical results which we consider as sure enough. How to do it? This will be discussed after (3) below.
(3) Add to the above sort of extension of results also the basic physical constants. These constants need not be valid for all the worlds that might exist beyond our so-called "big bang" world. These constants and related results need not even be exactly what the results so far have shown us to hold as, with respect merely to our universe.
But to every group of wider gravitational realms of influence (I call them "gravitational coalescences") in the world, there must be some finite Planck's constant, Hubble's constant, etc., however differently they are formulated. Assume along with it also that the gravitational constant is variable from world to world or from groups of globular clusters to other such groups. (Note: In fact, such are the kind of questions that critics of big bang, black hole, gravitational waves, and other theories make!)
We can assume that there SEEM to be some such constants, which we can only theoretically calculate and experimentally measure approximately. We can very well accept this state of affairs as belonging at least to our human limit situation! Nevertheless, we ARE doing science!
(4) Further, we can assume (very rationally, please!) as sufficiently true many theoretical results in physics and cosmology, especially by applying the general logic of reduction (removal) of all doubtful cases of realization, and by accepting at least the extremely sure cases of the values of matter-energy content, spatiotemporal coverage etc. in our cosmos (not limited to our own universe).
(5) This is the MMM method that I have formulated in a little book of less than 100 pp., meant for beginning and advanced students of astrophysics and philosophy of physics: ESSENTIAL COSMOLOGY AND PHILOSOPHY FOR ALL, 2022, KDP Amazon, which costs only 4 to 5 dollars. (Please do not feel that I am publicizing my book here. I had to mention the book in this context! I have kept it at the lowest price on purpose. I get only 7 cents as royalty on each sale!!)
The MMM method has been formulated in order to show that we have a broader realm of rationality than the strictly particularistic-logical and experimentalist-logical ones. All forms of the latter two are in fact based on the more general notion of rationality, the logic of which needs formulation in the course of time.
(6) I BELIEVE THAT BY USING THE MMM METHOD WE CAN FORMULATE VERY GENERAL THESES ON THE COSMOS.
"IF THIS WERE THE CASE, HOW THEN COULD THE REMAINING OR RESULTANT CASES BE CONCEIVED LOGICALLY?" "ASSUMING THAT ... (such and such) ..., WHAT SHALL NOW BE CONCLUDED AS THE SURE CASE (the sure case, please!)?" etc. are some of the ways of reasoning here.
(7) ONE MAY NOW ASK: IN THAT CASE DO WE OBTAIN ANYTHING MORE THAN MERE PROBABILITIES IN SCIENCE? My answer will be: "As if we were so far obtaining absolute truths!" If anyone suggests that we obtain some absolute truths with all possible explications of these truths, I WOULD KEEP QUIET AND NOT SPEAK WITH THAT PERSON....
Even in math and logic, just circumscriptions are the case. Take a look at the plight of the Hilbert Program and study what Gödel achieved, bringing a revolution in logic and mathematics, and of course also in epistemology, metaphysics, etc.
NOW MY WONDER IS THIS: In spite of all the uncertainties involved even in determining whether, for all possible cases (and worlds), '1 + 1 = or < or > 2', what are we waiting for? We need to do what is in our capacity. No one should prohibit us from positing the questions of the possible amount of matter-energy (infinite, finite, or zero) in the cosmos and the possible extent of measuremental "space-time" thinkable in the cosmos (again, infinite, finite, or zero), and from thinking by use of these....
Berndt Barkholz, I consider you a mature scientist. I want to tell you that I did not try to belittle your arguments. I have tried to show you another aspect of these arguments. Please feel free to continue this discussion with me (and the others here).... I am open to criticism. Thank you very much!
Raphael Neelamkavil
  • asked a question related to Logic
Question
1 answer
How can I generate the rules matrix of a fuzzy system that has two outputs, where the outputs are logically combined using the "and" operator (in the MATLAB environment)?
p--->q and r
The fuzzy system has 3 rules like above. The membership functions are triangular, and the linguistic variables are "Low", "Medium" and "High".
Relevant answer
Answer
% Minimal Mamdani FIS with one input (p) and two outputs (q and r)
fis = mamfis;

% Input p on [0 1] with three triangular membership functions: Low, Medium, High
fis = addInput(fis, [0 1], 'Name', 'p');
fis = addMF(fis, 'p', 'trimf', [-0.5 0.0 0.5], 'Name', 'L');
fis = addMF(fis, 'p', 'trimf', [ 0.0 0.5 1.0], 'Name', 'M');
fis = addMF(fis, 'p', 'trimf', [ 0.5 1.0 1.5], 'Name', 'H');

% First output q on [-1 0]
fis = addOutput(fis, [-1 0], 'Name', 'q');
fis = addMF(fis, 'q', 'trimf', [-1.5 -1.0 -0.5], 'Name', 'NB');
fis = addMF(fis, 'q', 'trimf', [-1.0 -0.5 0.0], 'Name', 'NM');
fis = addMF(fis, 'q', 'trimf', [-0.5 0.0 0.5], 'Name', 'NS');

% Second output r on [0 1]
fis = addOutput(fis, [0 1], 'Name', 'r');
fis = addMF(fis, 'r', 'trimf', [-0.5 0.0 0.5], 'Name', 'PS');
fis = addMF(fis, 'r', 'trimf', [ 0.0 0.5 1.0], 'Name', 'PM');
fis = addMF(fis, 'r', 'trimf', [ 0.5 1.0 1.5], 'Name', 'PB');

% Three rules in symbolic text form; each rule sets both outputs at once,
% which gives the "q and r" combination asked about (p ==> q AND r)
rules = [...
"p==L => q=NB, r=PS"; ...
"p==M => q=NM, r=PM"; ...
"p==H => q=NS, r=PB"; ...
];
fis = addRule(fis, rules);

% Plot one control surface per output
figure
subplot(211)
opt1 = gensurfOptions('OutputIndex', 1, 'NumGridPoints', 21);
gensurf(fis, opt1), grid on
subplot(212)
opt2 = gensurfOptions('OutputIndex', 2, 'NumGridPoints', 21);
gensurf(fis, opt2), grid on
  • asked a question related to Logic
Question
12 answers
This is a question about Gödel numbering. As I understand it, the axioms of a system are mapped to a set of composite numbers. Is this really the case, so that for example the 5 axioms of Euclidean plane geometry are mapped to 5 composite numbers? Does this also imply that theorems of the system are now composite numbers that depend on the composite numbers that were the targets of the map from the set of axioms, PLUS the elementary numbers that describe the logical operations, such as +, if..then, There exists, etc.?
Relevant answer
Answer
From what I understand, you are asking whether it is sensible to investigate the nature of the numbers coding the axioms (finitely many) of a given theory, or more generally, to study a formal theory with respect to its coding and see whether it is possible to "extrapolate" from the theorems provable in this theory a number representing the code of a formulation of an axiom. It has already been said that the Gödel numbering used in the proof of the Incompleteness theorem is canonical and, given its computational inefficiency, probably also inadequate for this particular question (again, assuming that I understood your question correctly).
However, the main idea of understanding how theorems are, in some sense, computationally linked to the premises which entail them is a very interesting question. It is clear that the possibility of arithmetizing a formal theory makes it possible to provide a number-theoretical interpretation of the objects of a formal theory (syntax, semantics and proof theory). In this sense, one could ask whether it is possible to find a "preferable" coding for a formal theory in order to make your question easier to formulate and answer. This is of course extremely vague, but to make things more precise, you could pick a specific theory (Euclidean geometry), formalise the axioms, define different codings, and see whether, with respect to what you are trying to investigate, it is possible to establish how a particular choice of coding affects the nature of the numbers coding your axioms.
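To make "a coding" concrete, here is a toy sketch (not the canonical numbering from Gödel's proof; the symbol table and formula are made up): each symbol gets a small code, and a formula, read as a sequence of symbols, becomes a product of prime powers.
# Python sketch of a toy Goedel-style numbering via prime-power encoding
from sympy import prime                      # prime(i) is the i-th prime

SYMBOLS = {'0': 1, 'S': 2, '=': 3, '+': 4, '(': 5, ')': 6, 'x': 7}   # hypothetical codes

def godel_number(formula):
    """Encode a symbol string as the product of prime(i) ** code(symbol_i)."""
    n = 1
    for i, ch in enumerate(formula, start=1):
        n *= prime(i) ** SYMBOLS[ch]
    return n

print(godel_number("0=0"))                   # 2**1 * 3**3 * 5**1 = 270
# Unique factorisation makes the encoding injective, so one can then ask
# number-theoretic questions about the codes of axioms, as discussed above.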
Finally, I wanted to conclude by referring to a recent program in mathematical logic called Reverse Mathematics. It is basically trying to isolate axioms from theorems (which is in some sense related to the remark you made: "[..] going in the other direction, starting with a result in number theory and then trying to conclude something about formal systems or axioms"), and it actually uses a lot of computability theory and subsystems of second-order arithmetic. I am not an expert on this particular topic, but I suggest you look into it if you are interested.
I would be happy to carry on this discussion with you and also, if during the past year you obtained some interesting results that you can share, I would be glad to hear them.
Thank you and best regards!
Jean Paul Schemeil
Links for Reverse Mathematics:
Simpson, Stephen G. (2009), Subsystems of second-order arithmetic, Perspectives in Logic (2nd ed.), Cambridge University Press, doi:10.1017/CBO9780511581007, ISBN 978-0-521-88439-6, MR 2517689
  • asked a question related to Logic
Question
2 answers
Good afternoon Everyone.
I hope this message finds you well.
I am writing this message to ask for your help.
The Ls-Dyna software has been installed on my desktop. However, I could not use the maximum capacity of this desktop. This new desktop has 16 cores with 24 logical processors (see the attached picture 1). However, when I chose to use 8 or 16 NCPU to run the model, an error was received, as shown in the attached picture 2.
Is it possible that you can show me the way to increase the "OMP_NUM_THREADS" on my desktop?
If you need further information, please let me know.
Many thanks!!!
Relevant answer
Answer
Hi Cheng Xu,
Thank you for your help.
I will do it now.
Have a great morning.
  • asked a question related to Logic
Question
5 answers
Can we use methodology from quantitative to qualitative? What kind of logic can we give in this scenario?
Relevant answer
Answer
Dr Priyanka Malik Combining qualitative and quantitative market research does not require a multi-stage procedure to be beneficial. You may simply combine the two approaches to acquire a deeper understanding of certain problems.
  • asked a question related to Logic
Question
1 answer
Putnam criticized logical positivists' accounts of the meaning of scientific terms and of the nature of scientific theories because they were incompatible with minimal scientific realism. However, before discounting antirealism as a valid stance, he leaned toward it in his internal-realist years -- the view that an epistemic version of truth is the most valid.
Relevant answer
Answer
There are many theses called "scientific realism" and many called "scientific anti-realism." In principle, they refer to the acceptance or rejection of the existence of unobservable objects postulated by scientific theories, and in this sense, this classification is metaphysical or ontological.
On the other hand -- I am not talking about Putnam, but about the texts he wrote -- different texts by such an author affirm different things.
But "truth" is a semantic term, which refers -- in its most usual meaning, which is the one used by science and natural language -- to a relationship called "correspondence" between language with a descriptive function and what it alludes to, so it is convenient to treat those topics separately (they are not logically linked).
If one speaks of scientific realism, then there are many positions on the matter, but I believe that any analysis must take into account, on the one hand, the scientific realism assumed by the scientific texts themselves: when a biology book talks about mitochondria or a physics book speaks of positrons, they assume the existence of those things beyond language.
Second, the natural language we are using is realistic too, and assumes the scientific realism of science itself (that atoms and viruses exist, for example). And if it is about philosophical arguments about science, things get complicated there, because there are many versions and positions.
However, Putnam's arguments never differentiate between epistemological claims about human knowledge in general and those related to scientific knowledge, which is why I consider his theses not very useful for the philosophy of science (I don't know about epistemology).
  • asked a question related to Logic
Question
3 answers
What are the logics behind this?
Relevant answer
Answer
Hi,
Please go through the previous discussion, and you can also read the link below.
  • asked a question related to Logic
Question
46 answers
QM stands for quantum mechanics, CS for computer science, CC for cellphones/computers, and Mathematics seems to have somehow missed all three.
Now is the time to catch up. We have come to a standoff. Students are ending up neurologically sick -- feeling math-averse (neurophobia and math trauma).
It is not a matter of selection. Only group/rote work [1] seems to infuse Mathematics, as taught today, even in highly-qualified students. Paradoxically, one finds that the more intellectually qualified, the more averse!
And yet current Mathematics still tries to "instruct" it to the students, and other disciplines, in the US and the world.
This failure has been denied [1], and the “fault” has been put on the victims -- the students. Paradigm shifts are needed: first, QM.
According to QM, only integer numbers should be necessary to build all one can see in nature. Otherwise, all of Physics would be contradicted. This would also contradict all of CS and all of CC. Computers, for example, only work with integers, and are error-free.
Since this contradiction is not even imaginable, QM is ontologically correct. Therefore, all sciences must reflect it, according to a "holographic principle" (HP) in nature, including Mathematics.
The HP is often referred to as "the micro as in the macro" or vice-versa.
These two words macro and micro are commonly viewed as antonyms, meaning that they are the opposite of each other. Macro means on a large scale. Micro means on a very small scale. Both are important, though often complementary, views of nature. The view from the macro, while taking into account the micro, is called "universality" in Physics.
The QM main principle was given by Niels Bohr as, "all states at once". This needs to be understood as different from the following possibilities:
  1. "Copenhagen interpretation", or
  2. a "collapse" of the quantum function upon measurement, or
  3. waves, as a picture of QM, or
  4. wave-particle duality, or
  5. the Heisenberg principle, or
  6. a probabilistic view of QM.
In opening the "black box" of QM as viewed by Bohr, QM does not represent Natur (as defined by the philosopher Kant), but Wirklichkeit (ditto). It is not how nature is, but how it seems to work. A story.
A well-known analogy is Plato's Cave. The shadows one can see on the wall are Wirklichkeit, and the open reality outside is Natur. After watching Wirklichkeit for some time, people can form an idea of what Natur looks like, even though they are imprisoned in the Cave.
QM is, ontologically, how one can describe Wirklichkeit -- which is the subject of Physics. Natur may not have QM behavior, and continuity may exist in Natur.
Mathematics is not concerned with Wirklichkeit only, but can include Natur. Although, as no one can see Natur, one is led to treat any "pure" Mathematics as a speculation.
Albeit, Mathematics must agree with Physics in the realm of Wirklichkeit.
This does not happen today, and creates a clash. The inconsistency is seen as an opening. What IS mathematics?
Today's calculus teaching can be seen as relying on outdated ideas, going back before QM, computer science (CS), and cellphones/computers (CC) were discovered. Three major paradigm shifts seem to have been missed.
But these three paradigm shifts represent how science is done, in our seemingly endless task in going from Wirklichkeit to Natur.
One recognizes then, that science does not behave continuously, but by jumps -- that break from the past, and open a new future -- in going from Wirklichkeit to Natur, and back -- does the envisioned Wirklichkeit reproduce the actual Wirklichkeit?
This has been called a "paradigm shift" (PS) by T. S. Kuhn in Structure of Scientific Revolutions [2].
David Hilbert [3], who at the turn of the 20th century proposed twenty-three problems intended to guide research in the dawning century, claims otherwise: "History teaches the continuity of the development of science."
Michael Harris [4], a mathematician, writes that he, "would still be glad to lift the veil, but we no longer believe in continuity. And we may no longer be sure that it’s enough to lift a veil to make our goals clear to ourselves, much less to outsiders."
The outdated ideas currently in Mathematics date back to the time of Newton, Leibniz, and Cauchy, before the 3 PSs mentioned. They include (aka Fictions, or incorrect models): microscopic continuity, infinitesimals, hyper-reals, Cauchy epsilon-deltas, and Cauchy accumulation points.
Even in 4-year universities one finds classes, such as at Caltech, MIT, CSU, and abroad, teaching Fictions today [1].
This failure can be denied [1], and the “fault” can be put on the victims -- the students. But, there is no feeling of inferiority. PS [2] combats this, American style -- by innovation.
A technical innovation can reduce this gap by a PS [2] -- already heralded and error-free, as one can see in the history of sciences (contradicting David Hilbert [3]).
This reduces risk by following an experimental model that works, even though no one has explained it mathematically, without Fictions.
By jump-starting to QM one has a solution, which we mark as a first paradigm shift.
Where Fictions interfere with known Physics, they must be mercilessly deprecated -- although, and yet, current Mathematics still tries to "instruct" it to the students, in the US and the world.
The other paradigm shifts, stand for computer science (CS), and cellphones/computers (CC).
We now introduce a coherent, holographic principle (HP), a universe, and understanding yet to be discovered, as two new paradigm shifts -- and gain a cosmic perspective: who is the Creator of all this marvelous scheme?
This final PS leads to a comment. One does not need to learn anything, confirms https://www.researchgate.net/profile/Robert-Fuchs
One just has to observe.
Mathematics seems, thus, to be discovered — as a hologram and coherent with other sciences, and not, somehow, invented by mere humans.
What is your qualified opinion?
REFERENCES
[2] T. S. Kuhn, Structure of Scientific Revolutions, 1962.
[3] David Hilbert, Paris International Congress of Mathematicians (ICM), 1900.
[4] Michael Harris, "Mathematics Without Apologies", Princeton University Press, ISBN 978-0-691-17583-6, 2017.
Relevant answer
Answer
Dear Ed Gerck
Thank you very much for asking this question. The paths of discovery in mathematics are indeed surprising as we see from the following observation concerning the convergence of ideas of Coxeter and Bourbaki discussed in the following blog:
I think that the above discussion provides a positive answer to your question and underlines how important it is to know the history of mathematics.
  • asked a question related to Logic
Question
1 answer
Hello friends, I want to know: what is the difference if the lagged return gives a positive coefficient value in the mean equation and a negative coefficient value in the variance equation? Can someone tell me what's the logic behind this?
Thank You
Relevant answer
Answer
I'm sorry, I don't have the expertise.
  • asked a question related to Logic
Question
6 answers
Seriality on certain modal frames -- that for every world w there is some world v such that wRv -- corresponds to the axiom ◻P→♢P, assuming a standard interpretation of the relevant operators. Consider what we might call reverse seriality: that for every world w there is some world v such that vRw.
Is there an axiom to which reverse seriality corresponds, assuming a standard interpretation of the operators? I've observed that any frame satisfying reflexivity (i.e. every w is such that wRw, which corresponds to the axiom ◻P→P) satisfies reverse seriality, but I've gotten no further on the answer.
(P.S. If this is a 'good question' let me relay that I heard it from a student who took a course with Sean Ebel-Duggan who posed it; on the other hand, if this is a 'bad question' then it's likely due to my or the student's misunderstanding of Sean's actual question).
Relevant answer
Answer
For the basic modal signature it is impossible, since this first-order condition is not preserved under taking generated subframes. However, if we allow tense operators, then Hp→Pp should be the axiom, where H is the backward-looking box and P is the backward-looking diamond.
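For anyone who wants to experiment, a brute-force check on small finite frames (a minimal Python sketch; the frame below is made up) confirms the correspondence: Hp→Pp, with H and P read backwards along R, fails exactly at worlds with no predecessor.
# Python sketch: reverse seriality vs. validity of Hp -> Pp on a finite frame
from itertools import product

worlds = [0, 1, 2]
R = {(0, 1), (1, 2), (2, 2)}                 # hypothetical frame; world 0 has no predecessor

def preds(w):
    return [v for v in worlds if (v, w) in R]

def reverse_serial():
    return all(preds(w) for w in worlds)

def valid_H_implies_P():
    # valid on the frame = true at every world under every valuation of p
    for values in product([False, True], repeat=len(worlds)):
        p = dict(zip(worlds, values))
        for w in worlds:
            Hp = all(p[v] for v in preds(w))   # backward-looking box
            Pp = any(p[v] for v in preds(w))   # backward-looking diamond
            if Hp and not Pp:
                return False
    return True

print(reverse_serial(), valid_H_implies_P())   # False False; adding (2, 0) to R makes both True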
  • asked a question related to Logic
Question
12 answers
So A is a further version of B, and people said, "If you did B, why not A"
Example: "If you decide to pay for her meal, why not just pay for her everything?"
It sounds like a straw man or a slippery slope, but I don't think it is either.
Relevant answer
Answer
Because we are dealing with informal logic, fallaciousness will depend on context and is therefore a matter of pragmatics. If you regard the question and accompanying example as a kind of enthymeme then it will be possible to make explicit the tacit assumptions that make the argument valid; that is basically what you're doing with your third option.
I had said "on the face of it, fastening and telling are disjoint properties", which leaves open the possibility that more might be said. It would be interesting if you could repair your original example with a redescription that would make fastening and telling come out as instances of a property such that one of them amounts to more or less of it than the other.
  • asked a question related to Logic
Question
5 answers
Hello, I am new to R so I am a little stuck. I have 164 items as part of a scale development project (n = 1271), and I want to set a cutoff of .40 for factor analysis. I tried using logical operators and used this script:
loload <- tempdf1 %>% filter(Q1.0 < .40), which set up the new data file 'loload' but didn't put any data in there. I then tried using this script with all 164 items separated by commas, which returned an error message.
I'm quite stuck; numerous Google searches don't offer a lot unless I want to do things to one specific variable.
Any help is appreciated.
Relevant answer
Answer
Not sure I understand your question: you have used code from a package without naming it, not shown what things like Q1.0 are, and not included a minimal working example, so you should do this. Say your 164 items are in a matrix X and you want all values less than .4 turned to missing:
X[X < .4] <- NA   # set every value below .40 to NA
But you have thrown the phrase "factor analysis" into your question as if that is relevant. It might make sense to delete this question and start over in order to get more useful responses.
  • asked a question related to Logic
Question
1 answer
Is it time for an update of the terminology, to reflect logically what it simply already is?
Could there be a better term to reflect this?
Booking: arrange for and reserve in advance; engage a service or employment for a limited time period.
Relevant answer
Answer
Is 'Advance registration' or 'Register in Advance' enough?
  • asked a question related to Logic
Question
3 answers
If we somehow transform a Binary Search Tree into a form where no node other than the root may have both a right and a left child, and the nodes in the right sub-tree of the root may only have right children, and vice versa, such a configuration of BST is inherently sorted, with its root approximately in the middle (in the case of nearly complete BSTs). To do this we need to do reverse rotations. Unlike AVL and red-black trees, where rotations are done to make the tree balanced, we would do reversed rotations.
I would like to explain the pseudo-code and logical implementation of the algorithm through the images in the following PDF. The algorithm is to first sort the left subtree with respect to the root and then the right subtree. These two subparts will be opposite to each other, that is, left would interchange with right. For simplicity I have taken a BST whose right subtree, with respect to the root, is already sorted.
To improve the complexity compared to tree sort, we can augment the above algorithm. We can add a flag to each node, where 0 stands for a normal node and 1 for a node that has a non-null right child in the original unsorted BST. The nodes with flag 1 have an entry in a hash table, with the key being their pointer and the value being the rightmost node. For example, node 23's pointer would map to 30.5's pointer. Then we would not have to traverse all the nodes in between during the iteration. If we have 23's pointer and 30.5's pointer, we can do the required operation in O(1). This will bring down the time complexity compared to tree sort.
Please review the algorithm and give suggestions on whether this algorithm is of any use and whether I should write a research paper on it.
Relevant answer
Answer
This reminds me of the idea of quicksort, where we take a pivot element and "move" all smaller elements to the left of the pivot and all other elements to the right (and then do this recursively).
If we take the pivot element as the root node of the tree, we would get a somewhat similar result (they become more and more similar as quicksort advances).
OK, but we can move stuff faster in the tree, as we can move entire subtrees.
Two thoughts on that:
1. not 100% sure, but one might find a worst case where each subtree has to be moved/sorted (I guess)?
2. quicksort works on a plain array; the tree must be "constructed". As inserting takes O(log n) (Wikipedia) and we have to insert all n elements, we would get time O(n log n) to insert them all. Quicksort also takes O(n log n) time on average (its worst case is n^2, but e.g. mergesort's worst case is n log n).
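To make the rotation idea from the question concrete, here is a minimal Python sketch (not the poster's exact algorithm) of the classic "tree to vine" step from the Day-Stout-Warren algorithm: repeated right rotations flatten any BST into a right-leaning chain, which can then be read off in sorted order.
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_to_vine(root):
    """Flatten a BST into a right-leaning chain (a 'vine') by right rotations."""
    pseudo = Node(None, right=root)   # pseudo-root simplifies pointer handling
    tail, rest = pseudo, pseudo.right
    while rest is not None:
        if rest.left is None:         # already vine-shaped here: advance
            tail, rest = rest, rest.right
        else:                         # right-rotate rest around its left child
            temp = rest.left
            rest.left = temp.right
            temp.right = rest
            rest = temp
            tail.right = temp
    return pseudo.right

# Example: an unbalanced BST; after flattening, walking .right gives sorted keys
root = Node(23, Node(10, None, Node(15)), Node(40, Node(30)))
vine, keys = tree_to_vine(root), []
while vine:
    keys.append(vine.key)
    vine = vine.right
print(keys)   # [10, 15, 23, 30, 40]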
  • asked a question related to Logic
Question
4 answers
For various reasons it can sometimes become necessary to change the mindset, to change our attitude to something, e.g. following trauma or illness. We can re-examine our beliefs with reasonable logic and be successful in turning a negative mindset into a positive one. However, how do we do that without our emotions and misinterpretations of the world getting in the way?
Relevant answer
A growth mindset means improving my skills with effort and practice. So I have to think positively; that is to say, I have to say to myself "I can do it".
Best wishes
  • asked a question related to Logic
Question
2 answers
Which beams will yield and form a plastic hinge first: beams located in the lower half of the building, or beams in the upper part of the building? The top storey is subjected to a greater lateral force than the bottom storey. However, logically speaking, it seems that the bending of the structure will start from the bottom (as if treating the whole structure as a cantilever), and the upper portion just moves along with it, hence a lower moment and less likelihood of yielding. Is this correct?
Relevant answer
Answer
Check the demand/capacity ratio of all beams under the hazard level used for the analysis. The beam with the highest demand/capacity ratio will yield first.
  • asked a question related to Logic
Question
1 answer
I ee
Relevant answer
Answer
Hi
If you need to integrate MATLAB code into Simulink, use the MATLAB Function block in the Simulink library. If the code contains elements of object type, use a Level-2 MATLAB S-Function.
  • asked a question related to Logic
Question
4 answers
I wrote this week's blog post with one of our PhD students, Christina Baker. Christina is conducting a study of school nurses to find out the most commonly reported barriers to COVID-19 vaccination for children. Many of them are linked to Intuitive-mind thinking rather than to Narrative logic: https://sites.google.com/view/two-minds/blog
Relevant answer
Answer
Every novel vaccine remains a source of excitement until results appear. For COVID-19, the main barrier is unexpected results, which are nothing compared with the threats of the disease: death and the circumstances of hospitalization. Finally, the excitement diminished in the face of the results obtained.
  • asked a question related to Logic
Question
3 answers
I would like to ask what is the best (simplest) way to implement a deductive system based on non-monotonic logic (for example, can we do this using PROLOG)?
Thanks in advance!
Relevant answer
Answer
Sequent Calculus might work for you.
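If PROLOG is on the table, the usual route is negation as failure; as a language-neutral illustration, here is a minimal Python sketch of the classic default "birds fly unless known to be penguins" (facts and names are made up, and this is only a toy, not a full non-monotonic system).
# Toy default reasoning via negation-as-failure over a closed fact base
facts = {("bird", "tweety"), ("bird", "opus"), ("penguin", "opus")}

def holds(pred, x):
    return (pred, x) in facts

def abnormal(x):                        # penguins are exceptions to the flying default
    return holds("penguin", x)

def flies(x):
    # default rule: bird(x) and NOT provably abnormal(x)  =>  flies(x)
    return holds("bird", x) and not abnormal(x)

print(flies("tweety"), flies("opus"))   # True False

# Non-monotonicity: adding knowledge can retract a previous conclusion
facts.add(("penguin", "tweety"))
print(flies("tweety"))                  # now False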
  • asked a question related to Logic
Question
1 answer
I am
Relevant answer
Answer
If there is enough training data for all fault-type cases to clamp to the input and output, there will not be a need for a fixed threshold.
  • asked a question related to Logic
Question
4 answers
Is there any theory or logic for fixing a 2nd-order construct as formative or reflective? As discriminant validity has been established at the first order, can that be a clue that the 2nd order is formative? If not, what can be the idea for setting the 2nd order as reflective or formative? I have seen many scales, like Organisational Commitment, having three dimensions (AC, CC, NC). Are these causes or reflections of OC at the higher order? Further, if we want to calculate utility, then are expectancy and value causes of utility or reflections of utility at the 2nd order?
Relevant answer
Answer
Dear Charu ... CTA has not got proper authenticity... further, for CTA a minimum of four indicators is required.....
  • asked a question related to Logic
Question
4 answers
How can I find a list of open problems in Homotopy Type Theory and Univalent Foundations ?
Relevant answer
Answer
A list of open problems for homotopy type theory is presented in HISTORIC.
  • asked a question related to Logic
Question
1 answer
Can someone tell me about your experience with this book or a related one, "Introduction to Logic: and to the Methodology of Deductive Sciences"? I will appreciate it...
Relevant answer
Answer
Juan Antonio Baez-Verdin: "Introduction to Logic and to the Methodology of Deductive Sciences" -- in the first half of this traditional undergraduate study, the deductive method is examined, and in the second part, applications of logic and methodology in the construction of mathematical theories are explored. Exercises are sprinkled throughout.
  • asked a question related to Logic
Question
3 answers
How can I find a list of open problems in Homotopy Type Theory and Univalent Foundations ?
Relevant answer
Answer
Regards,
Germán Benitez Monsalve
  • asked a question related to Logic
Question
6 answers
Could you suggest to me some essays that deal, whether in an interpretive, critical, or merely informative manner, with Jacques Lacan's paper "Logical Time and the Assertion of Anticipated Certainty"?
Relevant answer
Answer
Cenk Tan, I have already recommended them, as they strike me as truly useful, especially the first source which, after a first look at its corpus, seems to be well elaborated on the matter.
  • asked a question related to Logic
Question
2 answers
I need to know the logic behind it.
Relevant answer
Answer
Decomposition is a general approach to solving a problem by breaking it up into smaller ones and solving each of the smaller ones separately, either in parallel or sequentially. The logic behind decomposition is to iteratively solve large-scale, complex mixed-integer problems that cannot be efficiently solved all at once by commercial solvers.
It separates the problem into two related problems (a master problem and a subproblem) by classifying the constraints of the problem into "easy constraints" and "difficult constraints". The master problem typically contains the discrete variables with the "easy constraints" and provides an upper bound for the original problem. The "difficult constraints" are moved to the subproblem, which gives a lower bound and generates Benders cuts for the master problem. These two problems are solved iteratively in a delayed-constraint-generation fashion and finally converge to a globally optimal solution.
As a side note, decomposition has some benefits, especially when there is some coupling or interaction between the subproblems and the master problem, such as in the case of market and trading problems. This interaction can be effectively modeled using this decomposition in many problems.
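As a toy illustration of the iterate-and-cut loop described above, here is a minimal sketch (all numbers and names are made up, and no MIP solver is used): minimize f*y + c*x subject to x >= d - M*y, x >= 0, y in {0,1}. For this minimization the master relaxation gives the lower bound and the subproblem evaluation the upper bound; the subproblem in x has a closed-form optimum and dual, so the Benders optimality cuts can be written down directly and the tiny master problem is solved by enumeration.
# Toy Benders decomposition: min f*y + c*x  s.t.  x >= d - M*y, x >= 0, y in {0,1}
f, c, d, M = 6.0, 2.0, 5.0, 4.0

def subproblem(y):
    """LP in x for fixed y: optimal value and dual multiplier lam of x >= d - M*y."""
    slack = d - M * y
    x = max(0.0, slack)
    lam = c if slack > 0 else 0.0      # dual of the coupling constraint
    return c * x, lam

cuts = []                              # each optimality cut: theta >= lam * (d - M*y)
upper, lower = float("inf"), -float("inf")
while upper - lower > 1e-9:
    # Master: enumerate y in {0,1} (trivial here); theta is bounded below by the cuts
    best = None
    for y in (0, 1):
        theta = max([0.0] + [lam * (d - M * y) for lam in cuts])
        obj = f * y + theta
        if best is None or obj < best[0]:
            best = (obj, y)
    lower, y = best
    sub_val, lam = subproblem(y)       # subproblem gives a feasible cost and a new cut
    upper = min(upper, f * y + sub_val)
    cuts.append(lam)

print("optimal y =", y, "cost =", upper)   # with these numbers: y = 1, cost = 8.0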
  • asked a question related to Logic
Question
11 answers
Metamathematics -- the fundamental logical paradigm of maths -- was never fully defined by Hilbert (nor anyone else), causing severe yet commonly ignored consequences for all branches of maths and maths theory ever since. So, it is difficult to find relevant papers and anybody interested in investigating or discussing the subject.
Relevant answer
Answer
What kind of statements are metamathematical ones: logical, mathematical, logical-mathematical, philosophical? In the case of Hilbert's claims, they were aimed at finding a secure foundation for mathematical theories, and he thought that this could be achieved if the consistency of arithmetic were established. But that is only an assumption about "foundation", which, furthermore, results from assuming that logic (because consistency is a formal property studied by theories such as logic) is a kind of guarantee for human knowledge. Yet logical theories have their own problems (such as the inapplicability of the principle of the excluded middle to quantum mechanics), and I believe that if the 20th century has taught us a lesson in terms of analytical philosophy and the "foundation of mathematics", it is that it is not even possible to find a deductive language -- or an axiomatic system -- that is perfect or basic, nor can logic be an unquestionable starting point.
  • asked a question related to Logic
Question
3 answers
At the beginning of the era of digital design, binary logic was used in all industrial applications. Binary logic uses 2 states, i.e. 0 and 1, to represent each digit. Since binary logic uses only 2 states, it requires more digits to represent a number.
Ternary logic uses fewer digits to represent a number compared to binary logic. It also reduces the area and the power of the circuit. Ternary logic uses 3 states, i.e. 0, 1, 2, where logic 0 is considered the low state, logic 1 the middle state, and logic 2 the high state.
Quaternary logic uses 4 states, i.e. 0, 1, 2, 3. Quaternary logic further reduces the number of digits and can also improve the power compared to binary and ternary logic.
Relevant answer
Answer
This is so because, as the radix of the logic gets higher, the number of interconnects becomes smaller. So quaternary is more effective than ternary. The other issue is that quaternary logic can be easily converted into binary logic.
But as the radix of the logic increases, the power consumption may get larger, as the power is proportional to the square of the amplitude.
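The digit-count saving mentioned above is easy to quantify (a small sketch; the radices and the sample value are arbitrary): representing all values up to N needs ceil(log_r(N+1)) digits in radix r.
# Python sketch: digits needed to represent values up to N in radix r
import math
N = 1_000_000
for radix in (2, 3, 4):
    digits = math.ceil(math.log(N + 1, radix))
    print(f"radix {radix}: {digits} digits")
# radix 2: 20 digits, radix 3: 13 digits, radix 4: 10 digits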
Best wishes
  • asked a question related to Logic
Question
1 answer
Re: ARTICLE: "Should Type Theory replace Set Theory as the Foundation of Mathematics?" BY
Thorsten Altenkirch
Type Theory is indicated (by the author) to be a sometimes better alternative and a sometimes-replacement for regular set theory, AND thus a sometimes better replacement for the logical foundations of math (and Science). It seems to allow turning what is qualitative and not amenable to regular set theory into things that can be the clear, particular objects of logical reasoning. Is this the case? (<-- REALLY, I am asking you.)
It is very rarely, if ever, that I have addressed anything that I did not have a good understanding of; BUT here is the exception (and a BIG one). (I have VERY, VERY little understanding of this article -- even from the most crude qualitative standpoint. You would say I should have researched this more, but it is not my bailiwick; only more confusion, on my part, would likely occur, "shedding no light". My sincere apologies.) ANYHOW:
If indeed things are as the author, Thorsten Altenkirch, says: it seems different things (other than those related to standard propositions in regular set theory) could widen the use of set theory itself while retaining (including) all of regular set theory (with all of its virtues, as needed). BUT, in addition, it is indicated that it could be applied to areas (PERHAPS like biological and behavioral science) where present set theory (and the math founded on it) cannot now be applied.
"[ The ] type theoretic axiom of choice hardly corresponds to the axiom of choice as it is used in set theory. Indeed, it is not an axiom but just a derivable fact."
More quoting of the author: "Mathematicians would normally avoid non-structural properties, because they entail that results may not be transferable between different representations of the same concept. However, frequently non-structural properties are exploited to prove structural properties and then it is not clear whether the result is transferable." .... "And because we cannot talk about elements in isolation it is not possible to even state non-structural properties of the natural numbers. Indeed, we cannot distinguish different representations, for example using binary numbers instead." ... "we can actually play the same trick as in set theory and define our number classes as subsets of the largest number class we want to consider and we have indeed the subset relations we may expect. ... Hence Type Theory allows us to do basically the same things as set theory" ... as far as numbers are concerned (modulo the question of constructivity) but in a more disciplined fashion limiting the statements we can express and prove to purely structural ones."
"we cannot talk about elements in isolation. This means that we cannot observe intensional properties of our constructions. This already applies to Intensional Type Theory, so for example we cannot observe any difference between two functions which are pointwise equal." ...
"...Hence in ITT (regular set theory) while we cannot distinguish extensionally equal functions we do not identify them either. This seems to be a rather inconvenient incomplete- ness of ITT, [ (common set theory)] which is overcome by Type Theory (HoTT)"
"[It] reflects mathematical practice to view isomorphic structures as equal. However, this is certainly not supported by set theory which can distinguish isomorphic structures. Yes, indeed all structural properties are preserved but what exactly are those. In HoTT all properties are structural, hence the problem disappears. ..."
"While not all developments can be done constructively it is worthwhile to know the difference and the difference shouldn’t be relegated to prose but should be a mathematical statement." [AND}: ...
"Mathematicians think and they often implicitly assume that isomorphic representations are interchangeable, which at closer inspection isn’t correct when working in set theory. Modern Type Theory goes one step further by stating that isomorphic representations are actually equal, indeed because they are always interchangeable."...
..."The two main features that distinguish set theory and type theory: con- structive reasoning and univalence are not independent of each other. Indeed by being more explicit about choices we have made we can frequently avoid using the axiom of choice which is used to resurrect choices hidden in a proposition. Replacing propositions by types shows that that the axiom of choice in many cases is only needed because conventional logic limits us to think about propositions when we should have used more general types."
Relevant answer
Answer
The answer is simply no. Additionally, considering "realist (platonic)" and "non-realist (non-platonic)" positions doesn't actually help with the answer I am going to provide, and the article also begs the question. It's like asking why you like music: is it because it sounds good, or is it because it makes you feel good? Well, that depends on what you mean! Equally, asking a working mathematician about the independence of math, or the construction of math, will get you very confused looks. The way one treats math is whichever way is the most convenient, or the most sensible to the person. As such, the article in question does not particularly respect nor delineate the historical and functional differences between these two foundations of mathematics very well. Mathematics is a very broad, messy, overlapping subject. In fact, most of the math I regularly use does not really involve calculations, or functions per se. But, as the article is a pre-print, I assume it simply represents a scribbling of his thoughts.
In order to elucidate my answer better, some background in the cartography of mathematics is needed. There are many different universes of mathematics (formally distinct foundations of mathematics as unique fields) that have their own levels of reasoning and their own focuses. To name a few: category theory, abstract group theory, analysis, proof theory, many-valued logic, and the list just keeps going. All of these are employed at different levels to ascertain certain properties of math, or even to articulate certain questions. For instance, if one wants to study the different universes of mathematics, category theory is generally involved, and the object considered is called a topos. Or if one wishes to study how numbers work, one can employ number theory to study them as unique things, or one can employ analysis and study them as functions, or one can study them with group theory and consider them as actions as well. In this view, no field of mathematics has primacy over other mathematics, only advantages for the inquiries at hand.
Here is a simple question that I think illustrates the point I am making: is two an element of four, or not? That is to ask, in the construction of numbers, are they considered logically unique (aka type theory), or as informal primitives so that numbers are just simply numbers (set theory)? It is in fact this very question that helps separate type theory and set theory. This question is akin to asking: is meaning found in words or in what the words represent? Both are true to a certain degree, and from different perspectives. If we are partial to the former, we are essentially asking: does the construction of words form the meaning they express? Yes, but only if we consider meaning as inherent to language alone (intensional); that is, language makes meaning, not the world outside of our minds. If we are partial to the latter, however, then words denote things, they are analogs to events, and they point to common descriptions that we see (extensional). In the same manner, type theory considers numbers as things in themselves; to say "there are two dogs" is to say two dogs, because the number two is different from dogs. Equally, computer scientists often employ type theory because it logically constructs things, whereas mathematicians like set theory because it is very good at describing things and their relationships. It would be very burdensome for a mathematician if we had to logically construct everything from the bottom up: instead of saying "let us consider a sequence of integers", the computer scientist would have to define every part of that sentence.
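To pin down the "2 ∈ 4" example with the usual encodings (a restatement, not a quote from the article): in ZF with the von Neumann encoding, each natural number is the set of its predecessors, so
4 = {0, 1, 2, 3}, hence 2 ∈ 4 is true -- but only as a fact about that particular encoding;
in type theory, 2 : ℕ and 4 : ℕ are terms of the type of natural numbers, and "2 ∈ 4" is not even a well-formed statement, so no such encoding-dependent fact can be expressed.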
I hope this helps clarify the question.
  • asked a question related to Logic
Question
8 answers
Sudoku is not just pleasure/pain, a morning wake-up or a diversion, but also an exercise in logic. When people make the same mistake in logic every time, what can the type of error reveal about the solver's brain?
Relevant answer
Answer
Happy Puzzle Day, everyone!
  • asked a question related to Logic
Question
1 answer
I have built a task scheduling algorithm in MATLAB. I am currently working on task scheduling optimization in fog computing. I have created the physical components and the logical components (application modules), and have also mapped the modules to the physical devices in iFogSim. But how do I actually run these modules and call the scheduling algorithm from MATLAB in iFogSim?
Any help is really appreciated.
Relevant answer
  • asked a question related to Logic
Question
4 answers
As per the attachment: there are three sets of students, and each set is evaluated by an individual judge. Hence the marks vary widely across the three sets. How can I give rational marks to all? I wish to make the highest and lowest marks of all three sets equal, and have the other marks follow. Is this possible?
Relevant answer
Answer
There is no definitive way to do what you want to do because the effect of the raters is conflated with the effect of the students. That is, Rater B gave low ratings, but you don't know if that's because Rater B always rates students low or because the students that Rater B happened to get were poor students. Ideally, you would have each student rated by at least two raters, but preferably have each student rated by all of the raters. ... One thing you could do is assume that the highest-rated student for each rater is really good and the lowest for each rater is really not good. You can re-code the ratings for each rater so that the highest is 10 and the lowest is 1. This may not be fair to all students, though. Or you could modify this, for example, by making sure no student gets a lower score than they did originally.
You can run the following code at this website, without installing R: https://rdrr.io/snippets/ . The lines beginning with # are comments, and don't need to be run.
library(rcompanion)
A = c(5, 4, 9, 7, 8, 10)
blom(A, method="scale", min=1, max=10)
### 2.5 1.0 8.5 5.5 7.0 10.0
B = c(3, 5, 6, 5, 4, 5)
blom(B, method="scale", min=1, max=10)
### 1 7 10 7 4 7
C = c(2, 9, 8, 7, 6, 8)
blom(C, method="scale", min=1, max=10)
### 1.000000 10.000000 8.714286 7.428571 6.142857 8.714286
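For readers who do not use R, the linear rescaling that blom(..., method="scale") performs above can be reproduced in a few lines of Python; this is only an illustrative equivalent (the function name rescale is mine), not part of the rcompanion package.
# Illustrative Python equivalent of the linear rescaling above (not rcompanion):
# map each rater's scores so their minimum becomes 1 and their maximum becomes 10.
# (Assumes the scores within one rater are not all identical.)
def rescale(scores, lo=1.0, hi=10.0):
    mn, mx = min(scores), max(scores)
    return [lo + (hi - lo) * (x - mn) / (mx - mn) for x in scores]

print(rescale([5, 4, 9, 7, 8, 10]))  # [2.5, 1.0, 8.5, 5.5, 7.0, 10.0]
print(rescale([3, 5, 6, 5, 4, 5]))   # [1.0, 7.0, 10.0, 7.0, 4.0, 7.0]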
  • asked a question related to Logic
Question
3 answers
Hello
I want to run energy minimization for my protein structure with Amber.
I decided to run more steps of minimization for the whole system compared to the first round of minimization, in which the solute is frozen. I don't know whether this is logical or not.
Do you have any idea about this?
Regards
Relevant answer
Answer
To minimize the energy you should walk 10,000 steps daily.
  • asked a question related to Logic
Question
37 answers
The original meaning of the word "theory" comes close to "view", or even "world view". As such it has already been used by the ancient Greek philosophers, e.g. Aristoteles or Plato. Over the centuries, its meaning has become more and more precise, culminating in a well-defined logical notion of the correspondence between a part of the (outer) real world and the (inner) symbolic world we use to think about or describe it.
In more popular parlance, Wikipedia summarizes it in the statement: "A theory is a rational type of abstract thinking about a phenomenon or the results of such thinking." *) Of course, what is meant with "phenomenon" (also an ancient Greek word) is typically left unspecified: it may be a very specific class of objects or events, or it may be something as big as our universe (as in "cosmological theory").
Over the years, I have observed a gradual inflation of the technical term "theory" as defined and used in scientific methodology. The (dualistic) notion of a correspondence between the real world on the one hand and the media we use to reflect on the latter (thought, language, ...) on the other seems to have been lost during the rise of empirical research, with its strong emphasis on "phenomena" instead of "thoughts".
The result is that the technical term "theory" appears to have also lost its well-defined meaning of a bridge between our outer world "as we observe it" and our inner world "as we reason about it". For instance:
  • In a recent paper (2021), the author (a well-known expert in a subfield of social science) promises to offer a theory (sic!) of a particular "phenomenon" in his subfield. As I am also much interested in the kind of phenomena he is doing research about, I of course hoped to find - at least - a worked-out theoretical model of those phenomena.
  • Far out! Besides a simple flow-chart of (some of) the processes involved, what he presented was a large collection of more or less confirmed "empirical facts" together with simple "interpretations" (mostly re-wordings) and pointers to possible or plausible relationships.
  • I didn't find any sign of the hallmarks of a good theory: a worked-out theoretical model of those phenomena, on the basis of which I (or someone else) could reason about those phenomena, look for inconsistencies between assumptions and facts, derive crucial hypotheses to be tested, etc.
My questions to you:
  • What are your experiences with this type of inflated use of the word "theory" in scientific research?
  • Do you believe that there is a difference in this respect between social sciences and natural sciences?
  • How can we bring the "empirical approach" and the "theoretical approach" together, again?
________________________________________
Relevant answer
Answer
Dear Paul Hubert Vossen, my area of work is the philosophy of science, but I am a sociologist, so I am familiar with arguments from both disciplinary fields. It is common in the social sciences to use the word "phenomenon" to refer to a social, public and objective fact (it is a common way of referring to a social fact, nothing strange about it) and not to anything mental.
That is, unlike philosophy, where "phenomenon" is often used to refer to something given or that occurs in consciousness, with more or less Kantian meanings, social scientists use the word to refer to a characteristic of social reality, which they consider to be objective and self-existent (independently of any human mind); that is to say, "phenomenon" is not a word that replaces "thought", but rather a "social fact".
On the other hand, while I agree that in the social sciences there is vagueness, imprecision, neologism and inconsistency, it seems that behind the demand for a model lies the idea that the theories of the social sciences should be similar to those of the natural sciences. This position has been called "naturalism", and together with the thesis that the social and natural sciences must use the same methods (called "methodological monism") it is part of a long and deep epistemological debate about the demarcation between the two types of scientific disciplines and about the scientific status of the social sciences.
In the social sciences, too, the word "theory" is used as a synonym for a hypothesis, or a hypothesis accepted as knowledge, about a fact or a type of fact. Although this does not occur in physics, it also happens in the biological sciences: for example, theories about why the dinosaurs or the Mayan civilization became extinct, or why elephants periodically approach their cemeteries, which are very far from the conceptions that regard theories as interpreted calculi.
  • asked a question related to Logic
Question
10 answers
Russell’s paradox is the most famous of the logical or set-theoretical paradoxes. Also known as the Russell-Zermelo paradox, the paradox arises within naïve set theory by considering the set of all sets that are not members of themselves. Such a set appears to be a member of itself if and only if it is not a member of itself. Hence the paradox.
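In symbols, the paradox can be stated in one line (a standard formalisation, added here for clarity):
$$ R = \{\, x \mid x \notin x \,\} \;\Longrightarrow\; \bigl( R \in R \iff R \notin R \bigr) $$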
Relevant answer
Answer
The question about whether there is a paradox can be ambiguous if we do not take into account that we are talking about theories, about texts. As such, the said "paradox" is a genuine contradiction (sometimes also called an antinomy) that arises from the ontological problem of assuming, within a certain version of set theory, that there are sets that contain themselves as elements and others that do not. Russell's solution is, in my opinion, an ad hoc "solution", because it avoids the contradiction at the price of exchanging the ontology for one that cannot be independently justified.
I think the problem with any set theory - a genuine philosophical and not a logical problem - is that some authors take "set" to be an undefined term, while others identify it with "collection" (which cannot explain the existence of empty sets without changing the meaning of the word "collection").
I understand that the problem behind the paradox is one already addressed by the nominalist tradition and by Wittgenstein in the Tractatus, with respect to the presumed relation of identity of a thing with itself: according to the theory there are sets that include themselves as elements, but does it make sense to say that a thing is related to itself, or must there be at least two relata, or terms of the relation, for there to be a relation at all?
"Roughly speaking, to say of two things that they are identical is nonsense, and to say of one thing that it is identical with itself is to say nothing at all" (Wittgenstein, 1921, 5.5323)
  • asked a question related to Logic
Question
5 answers
I have been working with ontologies (RDF/OWL) for a long time, mostly using them as an engineer, essentially because they permit SPARQL and rules.
It is only recently, this year, that I started to really pay attention to the theoretical grounding of OWL. This led me to dive into the zoo of Description Logics (DLs) and their desirable or undesirable properties.
I think there are some serious issues in the multiplication of work on DLs, which is almost never considered from the perspective of actual usefulness, i.e. their ability to describe the specific structures that are at the core of many domains (law, clinical science, computer science...).
Quite a lot of the theoretical work in DL and logic seems to formally study and prove properties about languages (DLs are languages) that nobody speaks or will ever speak. This is quite salient when considering the very small number of working reasoners (which cover only a small fragment of the DLs described formally).
It seems to me that, after the incredibly fecund period that started with Frege, Russell, Tarski, Hilbert, Gödel, Carnap..., the theoretical work was somewhat considered to be done, and less attention was focused on formal languages for domain description.
On the other hand, questions related to problem solving (planners) came to be treated only as SAT problems needing optimisation, with almost no reference to first-order logic and thus a poor link with DL.
Finally, on the third hand, modal logic, which clearly has deep links with first-order logic (the box and diamond operators and the universal and existential quantifiers in particular), has been abandoned by computer scientists and has become, more or less explicitly, a field of philosophy.
I think this state of affairs isn't satisfactory, and that there is work to be done on conceptual clarification and on a revision of the foundations of mathematics that would integrate these developments.
To that end, something that seems absolutely essential is to give each other easy access to reasoners. By easy access, I don't mean a program written in some obscure language whose source must be compiled on a specific Linux distribution.
I mean access to the reasoning service through a (loosely standardized) REST API. These services should be accompanied by websites giving relevant examples of using the reasoner, with an "online playground".
I think this could be done for classic DLs such as EL or SHOIQ, but also for modal logic in its various kinds (epistemic, deontic), and it could also be done for planning based on first-order logic.
I am currently thinking about the engineering questions that would arise from such a logic zoo, and about a grammar that would be usable for describing every reasoning problem involving these kinds of logic.
If you are interested in the question and/or have skills in modern full-stack architecture and Dockerisation, I'd be interested to have your opinion about the current situation and the feasibility of such a logic zoo, which would be a useful tool for clarifying the domain.
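As a rough illustration of the kind of "reasoner behind a REST API" service described above, a minimal wrapper could look like the sketch below. Flask and Owlready2 are real Python libraries, but the /classify endpoint, its JSON payload, and the overall design are invented here purely for illustration.
# Minimal illustrative sketch: exposing an OWL reasoner behind a small REST API.
# Flask and Owlready2 are real libraries; the /classify endpoint and its JSON
# payload are invented for this example.
from flask import Flask, request, jsonify
from owlready2 import get_ontology, sync_reasoner

app = Flask(__name__)

@app.route("/classify", methods=["POST"])
def classify():
    # Expect a JSON body such as {"ontology_url": "http://example.org/my.owl"}
    ontology_url = request.get_json()["ontology_url"]
    onto = get_ontology(ontology_url).load()
    with onto:
        sync_reasoner()  # runs the HermiT reasoner bundled with Owlready2
    # Return the (possibly inferred) superclass names for every class
    return jsonify({c.name: [p.name for p in c.is_a if hasattr(p, "name")]
                    for c in onto.classes()})

if __name__ == "__main__":
    app.run(port=8080)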
Relevant answer
Answer
I suggest that you make contact with the automated theorem proving community. One starting point is the Wikipedia entry;
another is the long-running series of Conferences on Automated Deduction (CADE).
  • asked a question related to Logic
Question
6 answers
I have a paper for which minor revisions were recommended, and the reviewers asked me to provide control variables for my model. But I no longer have the model's data, so I can't add control variables to my results. How can I respond to the reviewers' comments on this issue? Any ideas? Any good, logical response?
Relevant answer
Answer
I don't know what kind of data/model you have, but you can still argue that adding firm/industry-specific characteristics would not change your results significantly. Find some studies that used this logic and cite them to support your argument.
  • asked a question related to Logic
Question
68 answers
I am interested in the incompleteness of boolean logic. Logic consistently constructed from 1 and 0 is always incomplete with respect to infinity but it is also incomplete with respect to i. If I plot y against x then there is nowhere to plot √-1 so I must add a third dimension to my graph. We can invent as many dimensions as we like but I am not aware of any well-defined value which cannot be plotted in three dimensions. Is there one? If so how is it defined in terms of 0 and 1?
Relevant answer
Answer
I didn't mention a publication that will provide some hints or add more confusion... The file is added.
  • asked a question related to Logic
Question
4 answers
Hello.
I have recently been doing research on the argument-based validation model of Kane (1990, 2011) and Bachman and Palmer (2010).
In one of Kane's articles (2004, p. 145) it is stated that argument-based validation relies on informal logic and practical reasoning rather than on scientific theories. My question is: what is the advantage of this approach? Also, I would be thankful if you could provide examples, since I cannot understand how it is really practiced.
Thank you
Relevant answer
Read the views of al-Ghazali and Ibn Sina on the stages of logical thinking.
  • asked a question related to Logic
Question
2 answers
Logical explanation required with publications.
Relevant answer
Answer
Thank you Thomas W Kelsey for your kind information.
  • asked a question related to Logic
Question
3 answers
Logical suggestions required with publications.
Relevant answer
Answer
Please refer to the article below. It will be helpful.
Best.
  • asked a question related to Logic
Question
2 answers
Statistically, "absolute" means all positive values, but here I am confused about whether absolute discretionary accruals are only the positive values of DA. Is there any difference between ADA and DA? If not, then what is the logic of using the keyword "absolute" with discretionary accruals?
Relevant answer
Answer
Absolute value just means that you are ignoring the sign (minus or plus) in front of a numerical value; i.e. you are just considering how far your value is from zero. I don't know what implications this might have for accounting practices but perhaps this article can help you:
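As a trivial, made-up illustration: absolute discretionary accruals are simply the magnitudes of the signed accruals, ADA = |DA|, so income-increasing and income-decreasing accruals of the same size are treated alike.
# Tiny illustration with invented numbers: ADA keeps the magnitude and drops the sign.
discretionary_accruals = [-0.04, 0.04, -0.01, 0.02]   # signed DA
absolute_da = [abs(da) for da in discretionary_accruals]
print(absolute_da)   # [0.04, 0.04, 0.01, 0.02]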
  • asked a question related to Logic
Question
18 answers
Please don't simply provide formulae. Provide the logic behind the origination of differential and integral calculus.
Relevant answer
Answer
I have a feeling that a century ago people were subjected to fear of death by the world wars. Most inventions are due to military funding. Somehow people were accustomed to deep thinking and conscious speech which currently people lack.
  • asked a question related to Logic
Question
14 answers
I "heard" people saying that Feynman had the opinion that nobody understands QM. But what does that mean?
Does it mean that we are not smart enough? Or, alternatively, does it mean that QM disobeys the rules of human logic?
Relevant answer
Answer
So QM is a law of nature then?
  • asked a question related to Logic
Question
3 answers
Profound biochemical analysis of the structure of a certain protein can explain and even predict the mechanistic properties of that protein's function, and even of many more proteins that have a similar fold. What other type of biological molecule (not a protein) is an example of that same logic?
I was thinking that maybe a phospholipid's structure or phosphorylation state can say something about its function; does that sound right?
I would appreciate it if you have another example
Relevant answer
Answer
For small molecules, the chemical reactivity can be predicted from which functional groups are present and how they are connected (the covalent structure). This is not sufficient for proteins: they are composed of only 20 different amino acids, so they all contain the same functional groups. You have to know the 3-dimensional arrangement of the peptide chain, which brings the correct functional groups into a very specific arrangement, to explain their activity, and unfolding (denaturation) of the protein leads to loss of function even though it does not alter the covalent structure. The next most similar system is the ribozymes, where the sequence of nucleotides determines the folding of the chain, which in turn produces the exact 3D structure required for function. The most prominent ribozyme, of course, is the ribosome, where not only the tertiary but also the quaternary structure is essential for proper function and regulation.
  • asked a question related to Logic
Question
5 answers
I have in mind that logic is mainly about thinking, abstract thinking, particularly reasoning. Reasoning is a process, structured in steps, where one conclusion is usually based on a previous one, and can at the same time be the base, the foundation, of further conclusions. Despite the mostly intuitive character of the algorithm as a concept (even without taking into account the Turing and Markov theories/machines), it has a step-by-step structure, and the steps are connected, one could even say logically connected (when the algorithm is correct). The difference is, of course, the formal character of the logical proof.
Relevant answer
Answer
Trainable neural networks are currently the main direction in the development of artificial intelligence (AI). These systems belong to the category of inductive, anthropomorphic systems, which is associated with their well-known advantages, disadvantages and areas of application. It is generally accepted that anthropomorphism is the exclusive source of AI's advantage; however, there are actual areas of AI application in which the use of deductive systems is more effective. These use less data and fewer computational resources, but are still capable of learning. Examples of such systems: knowledge generators, expert systems, active marketing systems, and many others.
The basis of deductive, imitation-based AI systems is extremely general models and algorithms that imitate the objects under consideration. The limit of generality is determined both by the amount of knowledge about the object and by the computational capabilities. General models (metamodels) are usually formed as systems of equations that link together the variables describing the object under consideration, as well as parameters that determine the form of the partial equations produced when transforming the equations of the general model. Such systems of equations are solved by methods of global optimization: stochastic (Monte Carlo methods) or heuristic (evolutionary algorithms and others). Imitation AI presupposes the solution of two basic problems: the inverse problem (implementing the system's training options), i.e. the formation of partial equations (models) from the values of certain variable characteristics of objects; and the direct problem, i.e. determining the values of variables for particular configurations of models. The use of simulation metamodels in combination with the randomization of variables and parameters provides high flexibility and adaptability of intelligent systems.
  • asked a question related to Logic
Question
7 answers
Hello,
I am searching for automated tools that can be used to calculate the number of actually consumed cycles, or the number of logical and mathematical operations, for a code base of thousands of lines.
Regards
Relevant answer
Answer
As long as you do not need a cycle-accurate measurement, you can use profiling tools (such as Intel VTune, GPROF, or Valgrind, as also suggested earlier). With VTune you can find the number of cycles per function/class, the number of operations, hotspots, memory usage, and several other factors.
  • asked a question related to Logic
Question
5 answers
What would be a good journal to submit the paper above to?
Relevant answer
Answer
History and Philosophy of Logic
  • asked a question related to Logic
Question
10 answers
I have found 5 point Likert scale items for my mediator and 3 of my IVs, while I am unable to find 5-point Likert scale items for my DV and one IV. Can I change the scale from 7-point to 5-point if I am unable to find the same scale for my DV and one of my IVs?
Is there any particular technique to convert the scale from 7-point to 5-point? Does it require any rationale/logic?
Relevant answer
Answer
Mosharop Hossian Thank you for your response, I will look into it in detail.
  • asked a question related to Logic
Question
5 answers
I have designed a digital circuit using an all-optical logic gate in the R-Soft CAD layout, and the output achieved for logic 1 is only 6% of the maximum normalized power. I need to improve it.
Is there any possibility in the R-Soft CAD layout to design an amplifier?
Can anyone say how to design an amplifier?
Relevant answer
Answer
If you have only 6% of the input signal at the output port, it means that a lot of your intensity was absorbed by your structure or leaked into the other waveguides. As a solution, you can double the power of the input signals, or guide the signals to the output by embedding some point defects. Using ring resonators or nonlinear effects can be useful, too.
  • asked a question related to Logic
Question
3 answers
I've built a multi-valued logic model containing 64 nodes. Does someone here know a tool able to run 1,000 initial states for this simulation with synchronous updates?
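For what it is worth, the "many random initial states, synchronous update" loop itself is small enough to sketch outside any dedicated tool. The Python fragment below is purely illustrative: the three-valued update rule is invented, and a real model would substitute its own per-node logic functions.
# Illustrative sketch only: run many random initial states of a multi-valued
# network under synchronous update. The update rule is invented (a 3-valued
# "max of the two neighbours" rule); plug in the real per-node logic instead.
import random

N_NODES, N_STATES, N_RUNS, N_STEPS = 64, 3, 1000, 50

def update(state):
    # Synchronous update: every node reads the *old* state vector.
    return tuple(max(state[(i - 1) % N_NODES], state[(i + 1) % N_NODES])
                 for i in range(N_NODES))

end_states = {}
for _ in range(N_RUNS):
    state = tuple(random.randrange(N_STATES) for _ in range(N_NODES))
    for _ in range(N_STEPS):
        state = update(state)
    end_states[state] = end_states.get(state, 0) + 1

print(len(end_states), "distinct end states after", N_RUNS, "runs")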
Relevant answer
Answer
This is what those big HPC systems are designed to do: hundreds or thousands of multi-core nodes connected by InfiniBand, capable of memory-to-memory transfers of halo or ghost cells on the edges of sub-grids, executing in global timesteps and occasionally checkpointing the grid for fault tolerance and recovery. Dynamic mesh grids are required for the most complex problems (probably not for yours). The top ten HPC systems are described here: https://www.datacenterdynamics.com/en/news/fugaku-retains-top-spot-worlds-most-powerful-supercomputer-top500-ranking/#:~:text=Fugaku%20is%20still%20the%20world's,the%20project%20began%20in%201993.
  • asked a question related to Logic
Question
32 answers
I have always wondered what the Trinity is,
but got no good answers. Today a thought snapped into my mind: why does the logical problem of the Trinity exist, and why is there no such thing as the Logical Problem of the Unity of God?
Let's see what specific defenders reply.
Relevant answer
Answer
If you take the relation of identity literally there appears to be a contradiction: three individuals who are not identical (à la 1) are identical (à la 2)
  1. (Father ≠ Son) & (Father ≠ Holy Spirit) & (Son ≠ Holy Spirit)
  2. (Father = God) & (Son = God) & (Holy Spirit = God)
  • asked a question related to Logic
Question
1 answer
Dear,
I have a dataset of observed temperature data for a small region that I want to use in order to bias correct the RCM data provided by the website "esgf-data.dkrz".
However, in the experiment tab I found "evaluation / historical/ RCP4.5/RCP8.5”.
  • I know these RCPs are for future scenarios (not bias-corrected), but what are the other two used for?
  • In addition, I don't know which software to use to correct the data.
Logically, I should correct the "historical" experiment data based on my observed data, and then do a projection for RCP4.5.
In case I have no observed temperature data:
  • Are there any available sources providing reliable historical temperature data?
Any guidance or explanation is appreciated.
Regards
Relevant answer
Answer
For bias correction of future modelled data (different RCPs), I suggest using the qmap library in R: https://cran.r-project.org/web/packages/qmap/qmap.pdf
There are examples provided inside the document (above) that may help you.
There are different methods for bias correction, such as the methods compared in works like doi:10.5194/hess-16-3383-2012.
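If you prefer to work outside R, the core idea of empirical quantile mapping is compact enough to sketch directly. The snippet below is a generic illustration with NumPy and made-up data, not a substitute for the qmap package or for the methods compared in the paper cited above.
# Generic illustration of empirical quantile mapping (not the qmap package):
# map each modelled value to the observed value with the same quantile rank.
import numpy as np

def quantile_map(obs_hist, mod_hist, mod_future):
    """Bias-correct mod_future using the obs/model relationship over the
    historical period (empirical CDF matching)."""
    quantiles = np.linspace(0.01, 0.99, 99)
    obs_q = np.quantile(obs_hist, quantiles)
    mod_q = np.quantile(mod_hist, quantiles)
    # For each future model value: find where it sits in the model CDF,
    # then read off the observed value at that quantile.
    return np.interp(mod_future, mod_q, obs_q)

# Made-up example data (daily temperatures in degC)
rng = np.random.default_rng(0)
obs = rng.normal(15, 5, 3650)          # observations, historical period
mod_hist = rng.normal(17, 6, 3650)     # model, historical period (biased warm)
mod_rcp45 = rng.normal(19, 6, 3650)    # model, future scenario
corrected = quantile_map(obs, mod_hist, mod_rcp45)
print(round(mod_rcp45.mean(), 1), "->", round(corrected.mean(), 1))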
  • asked a question related to Logic
Question
5 answers
I want to simulate an air-source heat pump with a thermal storage tank, but I get the following message:
''350 : Unable to open the file associated with an
ASSIGNED logical unit number.
Please check the ASSIGN statement and make sure
that the file exists at the specified location''
and I also want to know how I can create an external file, if necessary.
thank you
Relevant answer
Answer
Dear Munsannif
Regarding the problem of the external file: I had a problem in the data file because the component I had was empty, so I created my own data file based on the documentation available in the TRNSYS installation files. After that, I verified the correct data from manufacturer X for my sizing calculation.
Afterwards, I imported the new data file into the heat pump component.
  • asked a question related to Logic
Question
4 answers
The idea is to introduce students to the topic from a more general subject: introduce the different structures of, say, propositional logic as a formal system, and from them deduce the Boolean algebra (while introducing the interpretation of formulas) and further logical laws.
Relevant answer
Answer
I believe it is. I have taught logic myself to computer science students for several years.
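One concrete classroom device for the "interpretation of formulas" step mentioned in the question is a tiny truth-table enumerator, which makes the link between the formal system and its Boolean-algebra semantics tangible. The sketch below is my own illustration (any language would do) and is not tied to a particular textbook.
# Classroom-style sketch: enumerate all interpretations (truth assignments)
# of a propositional formula and print its truth table.
from itertools import product

def truth_table(variables, formula):
    """formula is a function taking one Boolean argument per variable."""
    print(*variables, "| value")
    for values in product([False, True], repeat=len(variables)):
        print(*[int(v) for v in values], "|", int(formula(*values)))

# Example: (p -> q) is interpreted as (not p) or q
truth_table(["p", "q"], lambda p, q: (not p) or q)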
  • asked a question related to Logic
Question
1 answer
Hi guys, I want to test a wind turbine's performance with different rotation speeds and a fixed wind velocity. But when I used TSRs of 6, 7, 8 and 9, I found a high torque coefficient, so, multiplied by the high rotation speed, a high power coefficient resulted (0.78?!). What must I do to get logical results? Thank you.
Relevant answer
Answer
Can you give us your detailed calculation steps?
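While waiting for the detailed steps, one sanity check is worth spelling out: the power coefficient satisfies Cp = λ · Cq and cannot exceed the Betz limit of 16/27 ≈ 0.593 for a conventional turbine, so Cp = 0.78 almost certainly points to an error in the swept area, the wind speed, the density, or a unit conversion. A minimal check with invented numbers:
# Quick sanity check with made-up numbers: Cp = tsr * Cq and must stay
# below the Betz limit of 16/27 (about 0.593) for a conventional turbine.
BETZ_LIMIT = 16 / 27

def power_coefficient(tsr, torque_coefficient):
    cp = tsr * torque_coefficient
    if cp > BETZ_LIMIT:
        print(f"Cp = {cp:.3f} exceeds the Betz limit ({BETZ_LIMIT:.3f}): "
              "check the swept area, wind speed, density and units.")
    return cp

power_coefficient(7, 0.11)   # 0.77 -> flagged as non-physical
power_coefficient(7, 0.06)   # 0.42 -> plausible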
  • asked a question related to Logic
Question
8 answers
I am working on some bactericide coatings which contain various amounts of copper. The goal is to observe the bacteria-killing efficiency of the coatings as a function of copper content and exposure time for different gram-negative and gram-positive bacteria.
For some bacteria we see logical behavior, but for some others the fluctuation of the data is very large and no clear trend is observed (with copper content or exposure time). Do you think the different roughness of the coatings could influence the results a great deal? I have read that roughness can definitely influence the antibacterial properties of the surface, but the roughness Ra of my samples differs by just a few nanometers: 8, 10, 15 nm, and so on.
I appreciate any helpful answer in advance.
Relevant answer
Answer
A potential synergy between bacterial growth and surface roughness could be observed if the peak-to-peak distance of the surface is comparable with the size of the bacteria or cells that need to attach and grow on the surface. I recommend characterising the topography accurately with 3D profilometry or AFM and benchmarking it against the typical size of the bacteria you are interested in. This effect is used beneficially to tailor the surface roughness of metal prostheses in order to speed up the growth of bone cells. BR
  • asked a question related to Logic
Question
16 answers
Conditions to be accepted as an editor require the ability to generate new concepts and a certain level of experience, measured by the H-level. Journals are also expected to arrange for subject matter experts (SMEs) to review their articles. In light of the collaborative processes that now cross several fields, is the H-level still the best measure of an editor? Is the SME still the best peer reviewer, when we know there is a limit to the reviewing burden SMEs can bear? Arguably, there are some topics that require specific knowledge (e.g., molecular biology, neurosurgery). Is the current trend towards multi-collaborative projects making it better to focus on the editor's ability to read the paper for understanding, adherence to protocols, structure, content, and the logical progression of supported thoughts leading to findings, recommendations, and conclusions?
Ultimately, authors have been blind to their reviewers. But if the present course is changing to open review, then perhaps the foundation and credentials of an author/editor-peer review process may have to change, too. I welcome your thoughts on the necessity of Author/Editor same subject matter expertise.
Relevant answer
Answer
Yes, SMEs are still the best peer reviewers, as the peer review process involves subjecting the author's scholarly work and research to the scrutiny of other experts in the same field to check its validity and evaluate its suitability for publication.
Reference:
  • asked a question related to Logic
Question
4 answers
Hello everyone,
I hope you are all well.
I am currently working on the simulation of flexible MOFs for carbon capture using Aspen Adsorption. I am trying to insert the LJMY-Langmuir isotherm using the user-model approach, as shown in the image below. However, the code could not compile because the ERF function is not available in the flowsheet environment. I would be grateful if someone could help with this issue or guide me to a better approach. Also, I am not sure if the code is logically correct.
Regards,
Yarub,
Relevant answer
Answer
Thank you for your reply. I actually followed this article when I wrote the isotherm.
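For anyone facing the same missing-ERF problem: a common workaround is to code a polynomial approximation of the error function by hand. The sketch below shows the Abramowitz & Stegun formula 7.1.26 in Python (maximum error about 1.5e-7); the same expression can, in principle, be transcribed into whatever expression language the simulator accepts, though I have not verified this inside Aspen Adsorption.
# Illustrative workaround when no built-in ERF is available: the
# Abramowitz & Stegun 7.1.26 rational approximation (max error ~1.5e-7).
import math

def erf_approx(x):
    sign = -1.0 if x < 0 else 1.0
    x = abs(x)
    t = 1.0 / (1.0 + 0.3275911 * x)
    poly = t * (0.254829592 + t * (-0.284496736 + t * (1.421413741
            + t * (-1.453152027 + t * 1.061405429))))
    return sign * (1.0 - poly * math.exp(-x * x))

for x in (0.5, 1.0, 2.0):
    print(x, erf_approx(x), math.erf(x))  # approximation vs. library value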
  • asked a question related to Logic
Question
2 answers
- Breaker interlock scheme
- Under-frequency relay (81) scheme
- Programmable logic controller-based load shedding
- Fast intelligent load shedding (ILS)
Relevant answer
Answer
In my opinion, microprocessor-based load shedding controls are easier and more reliable to use; you could refer to my papers.
  • asked a question related to Logic
Question
6 answers
Asking a circular question about asking questions and providing answers, as well as further questions: the process could go on ad infinitum, meaning that the process itself could circumvent (and not circumnavigate) the discovery of the genuine knowledge that we need more than ever, now that the issue is apocalyptic global warming.
Relevant answer
Answer
I think that not all questions are of this type. Yes, cluster questions that revolve around a base point can take this course. The more a teacher is able to ask questions, the better; it is an important and necessary thing.
  • asked a question related to Logic
Question
11 answers
Hi,
I would like to know how I can calculate the shielding effectiveness (SE) and split it into its different components (SER and SEA) from the S-parameters (S11 and S21). I am using a VNA to obtain these two S-parameters (in dB), and now I would like to calculate the shielding effectiveness (SE).
I came across some equations that calculate R, T and A:
R = |S11|²; T = |S21|²; A = 1 - R - T; SER = -10 log10(1-R); SEA = -10 log10(1-A); SE = SER + SEA;
However, I am not obtaining logical results by doing this, and I don't know if these equations are correct.
Thank you so much for your attention and help!
Ana
Relevant answer
Answer
I did. The equations are correct; I was just using the wrong value of S11. That's why I wasn't obtaining the right answer and the sum SER + SEA wasn't equal to the total SE.
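For anyone hitting the same issue, the step that most often goes wrong is converting the VNA's dB readings back to linear magnitudes before squaring. The sketch below uses the common textbook forms SER = -10 log10(1-R), SEA = -10 log10(T/(1-R)) and SEtotal = -10 log10(T) = SER + SEA, with invented example values; some papers instead write SEA through an "effective absorbance", which is algebraically the same quantity.
# Generic sketch with invented example values. The usual pitfall is forgetting
# to convert the VNA's dB readings to linear magnitudes before squaring.
import math

def shielding_effectiveness(s11_db, s21_db):
    s11 = 10 ** (s11_db / 20.0)           # dB -> linear magnitude
    s21 = 10 ** (s21_db / 20.0)
    R = s11 ** 2                          # reflected power fraction
    T = s21 ** 2                          # transmitted power fraction
    se_r = -10 * math.log10(1 - R)        # reflection loss (dB)
    se_a = -10 * math.log10(T / (1 - R))  # absorption loss (dB)
    se_total = -10 * math.log10(T)        # total SE = se_r + se_a
    return se_r, se_a, se_total

print(shielding_effectiveness(-3.0, -30.0))  # roughly (3.0, 27.0, 30.0) dB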
  • asked a question related to Logic