Science topics: Mathematics

Science topic

# Mathematics - Science topic

Mathematics, Pure and Applied Math

Questions related to Mathematics

In mathematics, many authors work in the area of integer sequences, Fibonacci polynomials, Perrin sequences, and so on.

What are the current research topics in this subject?

Please suggest some research topics related to the Fibonacci sequence.

I am doing a project on plastic biodegradation by *G. mellonella* larvae. I am just getting into this field and I want to know how I can determine the biodegradation. Do I have to use some mathematical formula? Thank you very much.

**From Newton's Metaphysics to Einstein's Theology!**

The crisis in modern theoretical physics and cosmology has its roots in the use of mathematics, along with theology, as a ruling-class tool since medieval Europe. The Copernican revolution overthrowing the geocentric cosmology of theology led to unprecedented social and scientific developments in history. But Isaac Newton's mathematical-idealism-based and one-sided theory of universal gravitational attraction, in essence, restored the idealist geocentric cosmology, undermining the Copernican revolution. Albert Einstein's theories of relativity, proposed since the turn of the 20th century, reinforced Newtonian mathematical idealism in modern theoretical physics and cosmology, exacerbating the crisis and hampering further progress. Moreover, the recognition of the quantum world - a fundamentally unintuitive new realm of objective reality, which is in conflict with the prevailing causality-based epistemology - requires a rethink of the philosophical foundation of theoretical physics and cosmology in particular and of natural science in general.

Should we calculate it by an experimental test on the target organism, or should we find it mathematically?

co-toxicity factor = (O − E) × 100 / E

where

O is the observed % mortality of the combined plant extracts

E is the expected % mortality.
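As a quick numerical check, the formula above is straightforward to compute. A minimal sketch (the function name is mine; the usual convention takes E as the sum of the individual mortalities of the component extracts, and interprets clearly positive values as synergism and clearly negative values as antagonism):

```python
def co_toxicity_factor(observed_pct: float, expected_pct: float) -> float:
    """Co-toxicity factor = (O - E) * 100 / E.

    observed_pct: observed % mortality of the combined extracts (O).
    expected_pct: expected % mortality, commonly the sum of the
                  individual mortalities (E).
    """
    if expected_pct == 0:
        raise ValueError("expected mortality must be non-zero")
    return (observed_pct - expected_pct) * 100.0 / expected_pct

# Example: 80% observed vs 60% expected mortality
print(co_toxicity_factor(80.0, 60.0))
```

A positive result here (about 33) would conventionally be read as synergism between the two extracts; values near zero indicate a simply additive effect.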

Hello everyone,

I would like to know the relationship between the temperature of the glass (viscosity) and the radius of the glass fibre.

Thank you.

- What is the relationship between the scientific understanding of the world and the reality in nature? It may be said that the real world is much richer in structure than the physical and mathematical models developed for it. These models take one or a few limited angles of view on the natural phenomenon in question; inventing a complete theory whose results are correct from any angle may be the dream theory, a "Theory of Everything"!

- Is there an unknown form of mathematics that has not yet been found to solve all the problems of a theory of everything?

- Is it necessary to change physicists' conceptual view of the theory of everything, so that this new outlook can include new concepts for problem solving?

- Is there a mathematical system, which has a distinct ability to represent the maximum possible states of the world!?

- Is it possible to imagine that the world is like a carpet with infinite texture, whose colours and patterns are determined by scientists with their theories about the world? And are we looking for the most realistic pattern and design for the world's carpet?

Can three grade teachers' responses help a researcher measure students' creativity? The students are aged 8–11 years.

Our department is offering an elective on fluid mechanics in daily life. The course is supposed to be more of a physical treatment of fluid phenomena than a mathematical one. I would like some recommendations for books on the subject which are light and discuss fluid physics from a physical, application-based perspective.

Hello,

How can I determine the tortuosity factor of a porous material with a simple mathematical formula?
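The answer depends on the microstructure, but one widely quoted closed-form estimate is the Bruggeman correlation, τ = ε^(−1/2), which assumes a packing of roughly spherical grains. A minimal sketch, with the function name my own:

```python
def bruggeman_tortuosity(porosity: float) -> float:
    """Bruggeman estimate of the tortuosity factor: tau = porosity ** -0.5.

    Assumes a bed of roughly spherical grains; real materials often
    require a fitted exponent rather than the ideal 0.5.
    """
    if not 0.0 < porosity <= 1.0:
        raise ValueError("porosity must lie in (0, 1]")
    return porosity ** -0.5

# Example: a material with 25% porosity gives tau close to 2
print(bruggeman_tortuosity(0.25))
```

The tortuosity factor obtained this way enters the effective diffusivity as D_eff = D₀·ε/τ, which for the Bruggeman exponent reduces to the familiar D_eff = D₀·ε^1.5.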

I know that δ(f(x)) = ∑ᵢ δ(x − xᵢ)/|f′(xᵢ)|, where the xᵢ are the simple zeros of f. What will the expression be if f is a function of two variables, i.e. δ(f(x,y)) = ?
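For reference, the standard generalization (assuming the zero set of f is a smooth curve on which ∇f ≠ 0) replaces the sum over simple zeros by a line integral over the zero-level curve:

```latex
\iint g(x,y)\,\delta\bigl(f(x,y)\bigr)\,dx\,dy
  \;=\; \int_{\{f(x,y)=0\}} \frac{g(x,y)}{\lvert \nabla f(x,y)\rvert}\, d\ell ,
```

so symbolically δ(f(x,y)) "lives" on the curve f = 0 with weight 1/|∇f|, just as 1/|f′(xᵢ)| weights each zero in the one-variable case.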

Why do prime numbers have such great importance in mathematics, relative to the rest of the numbers?

The vehicle routing problem is a classical application of operations research.

I need to implement the same for the electric vehicle routing problem, with different constraints.

I want to understand the mathematics behind this. The available journal articles discuss different applications without saying much about the mathematics.

Any book, basic research paper, or PhD/M.Tech thesis would be very helpful.

Thanks in advance.
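For intuition about the underlying mathematics, the core of any VRP is the combinatorial objective min Σ c(i,j) over the arcs used. A toy brute-force solver for the single-vehicle case (plain TSP, on a small hypothetical distance matrix of my own) makes that objective concrete before any MILP machinery is added:

```python
from itertools import permutations

def tsp_bruteforce(dist):
    """Exact TSP by enumeration: minimize the tour cost sum of dist[a][b].

    Feasible only for a handful of nodes; MILP formulations (e.g. the
    Miller-Tucker-Zemlin model) scale the same objective with subtour-
    elimination constraints instead of enumeration.
    """
    n = len(dist)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):          # fix the depot at node 0
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

# Hypothetical 4-node instance (node 0 = depot)
dist = [[0, 1, 2, 1],
        [1, 0, 1, 2],
        [2, 1, 0, 1],
        [1, 2, 1, 0]]
tour, cost = tsp_bruteforce(dist)
print(cost)  # optimal cost is 4, e.g. tour 0-1-2-3-0
```

Electric-VRP variants keep this objective and add constraints (battery state of charge, recharging stops, load), which is why the MILP literature focuses on the constraint sets rather than the objective.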

QM is the ultimate realist's utilization of the powerful differential equations, because the integer options and necessities of solutions correspond to nature's quanta.

The same can be said for GR, whose differential manifolds, an advanced concept or branch in mathematics, have a realistic implementation in nature-compatible motional geodesics.

A century later, no new such feats have been possible, making one wonder whether the limit of heuristic mathematical supplementation in powerful ways towards realist results in physics has been reached.

**Applying mathematical knowledge in research models:** This question has been on my mind for a long time. Can advanced mathematics and applied mathematics solve all the problems in modeling research? In particular, for the formula derivations in the theoretical part of a model, can the analytical conclusions be obtained through multiple derivations or other methods? I have also read some mathematics-related publications myself, and one has to admire the mystery of mathematics.

Which areas in mathematics education are currently trending?

As it is not possible to show mathematical expressions here I am attaching link to the question.

Your expertise in determining and comprehending the boundaries of integration within the Delta function's tantalizing grip will be treasured beyond measure.

An attempt to extrapolate reality

Our answer is a competitive YES. However, universities face the laissez-faire of old staff.

This reference must be included:

Gerck, E. “Algorithms for Quantum Computation: Derivatives of Discontinuous Functions.” Mathematics 2023, 11, 68. https://doi.org/10.3390/math1101006, 2023.

announcing quantum computing on a physical basis, deprecating infinitesimals, epsilon-deltas, continuity, limits, mathematical real numbers, imaginary numbers, and more, making calculus middle-school easy and with the same formulas.

Otherwise, difficulties and obsolescence follow. A hopeless scenario: no argument is possible against facts.

What is your qualified opinion? Must one self-study? A free PDF is currently available at my profile at RG.

Hello,

in an article I found the following sentence in the abstract:

"The results suggested that (1) Time 1 mathematics self-concept had significant effects on Time 2 mathematics school engagement at between-group and within-group levels; and (2) Time 2 mathematics school engagement played a partial mediating role between Time 1 mathematics self-concept and Time 2 mathematics achievement at the within-group level."

What is the meaning of the within-group-level and between-group-level in this context?

The article I am referring to is:

Xia, Z., Yang, F., Praschan, K., & Xu, Q. (2021). The formation and influence mechanism of mathematics self-concept of left-behind children in mainland China. Current Psychology, 40(11), 5567–5586. https://doi.org/10.1007/s12144-019-00495-4

An attempt to extrapolate reality


I'm using target encoding in my work, and I'd like to understand why it's effective from a mathematical point of view.

Intuitively, my understanding is that it allows you to encode the past with the future. I can see why that's effective, and also why it could cause target leakage. However, I can't find a good mathematical explanation for its effectiveness/issues.

Does anyone know the answer, or have a link to a resource they'd be willing to share?
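One way to see both the effectiveness and the leakage risk mathematically: target encoding replaces a category c with a (smoothed) estimate of E[y | c], so the feature becomes directly informative about the target; leakage arises when that estimate is computed from the same rows later used for training. A minimal sketch (the names and this particular smoothing form are one common variant, not a fixed standard):

```python
from collections import defaultdict

def target_encode(categories, targets, smoothing=10.0):
    """Map each category c to a smoothed estimate of E[y | c].

    encoding(c) = (sum_y(c) + m * global_mean) / (count(c) + m),
    where m is the smoothing weight: rare categories are shrunk
    toward the global mean, which regularizes the estimate.
    Fitting these statistics on the training rows themselves is
    the source of target leakage; in practice they are computed
    out-of-fold.
    """
    global_mean = sum(targets) / len(targets)
    sums, counts = defaultdict(float), defaultdict(int)
    for c, y in zip(categories, targets):
        sums[c] += y
        counts[c] += 1
    return {c: (sums[c] + smoothing * global_mean) / (counts[c] + smoothing)
            for c in counts}

# With no smoothing, the encoding is just the raw per-category target mean.
print(target_encode(["a", "a", "b"], [1, 0, 1], smoothing=0.0))
```

The smoothing term is what makes the estimator a biased-but-lower-variance compromise between the per-category mean and the global mean, which is the usual mathematical argument for its effectiveness on high-cardinality features.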

We assume that the Lagrange multipliers originally introduced in the Boltzmann-Einstein model to derive the Gaussian distribution are just a mathematical trick to compensate for the lack of true definition of probability in unified 4D space.

The derivation of the Boltzmann distribution for the energy distribution of identical but distinguishable classical particles can be obtained in a mathematical approach [1] or equivalently via a statistical approach [2] where the Lagrange multipliers are completely ignored.

1. "The Boltzmann factor: a simplified derivation", Rainer Muller.

2. "Statistical integration", I. Abbas.
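For comparison, the textbook Lagrange-multiplier route that the question proposes to bypass takes only a few lines: maximize the entropy of the distribution {p_i} subject to normalization and a fixed mean energy,

```latex
\mathcal{L}(\{p_i\},\alpha,\beta)
  = -\sum_i p_i \ln p_i
    - \alpha\Big(\sum_i p_i - 1\Big)
    - \beta\Big(\sum_i p_i E_i - U\Big),
\qquad
\frac{\partial \mathcal{L}}{\partial p_i}
  = -\ln p_i - 1 - \alpha - \beta E_i = 0
\;\Longrightarrow\;
p_i = \frac{e^{-\beta E_i}}{Z},
\quad Z = \sum_i e^{-\beta E_i},
```

with β identified as 1/(k_B T). The derivations cited above reach the same exponential form without introducing α and β explicitly, which is the point at issue.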

Hello!

I am curious; can anyone guide me on how to calculate the amount of hydrogen stored in a metal hydride during the absorption process, both in wt.% and in grams, and how much energy is released during absorption?
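If the absorbed hydrogen mass and the host-alloy mass are known, the gravimetric capacity in wt.% follows directly, and the heat released scales with the moles of H₂ absorbed times the reaction enthalpy. A minimal sketch (function names and the default ΔH ≈ 30 kJ/mol H₂, typical of AB₅-type hydrides, are my assumptions; note also that some authors normalize wt.% by the host mass alone rather than the total):

```python
def hydrogen_wt_percent(m_h2_g: float, m_host_g: float) -> float:
    """Gravimetric H2 capacity in wt.%, normalized by total sample mass."""
    return 100.0 * m_h2_g / (m_h2_g + m_host_g)

def absorption_heat_kj(m_h2_g: float, delta_h_kj_per_mol: float = 30.0) -> float:
    """Heat released on absorption: moles of H2 (M ~ 2.016 g/mol) x |dH|."""
    return (m_h2_g / 2.016) * delta_h_kj_per_mol

# Example: 1.5 g H2 absorbed into 98.5 g of alloy
print(hydrogen_wt_percent(1.5, 98.5))  # 1.5 wt.%
print(absorption_heat_kj(1.5))
```

The actual ΔH should be taken from PCT (pressure-composition-temperature) measurements or a van 't Hoff plot for the specific hydride, not from the placeholder default used here.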

Mathematics and theoretical physics are currently searching for answers to this particular question and two other related questions that make up three of the most persistent questions:

i- Do probabilities and statistics belong to physics or mathematics?

ii- The related question: does nature operate in 3D geometry plus time as an external controller or, more specifically, does it operate in the inseparable 4D unit space into which time is woven?

iii- Lagrange multipliers: are they just a classic mathematical trick that we can do without?

We assume the answers to these questions are all interconnected, but how?

Can someone please provide more insight into the mathematical formulation of a frequency-constrained UC (unit commitment) model comprising both synchronous and non-synchronous sources? Also, what would be the associated MILP code in GAMS?

The error of building a physical world on the basic intuitions of fundamental concepts such as space and time occurred during Newton's creation of Newtonian mechanics. Of course, this mistake had to be made, so that man would not be deprived of the numerous gifts of technology resulting from this science! But when the world showed another face of itself at very small and very large scales, this theory, carrying that error, could do nothing.

When Newton had those ideas about space and time (perhaps he knew, and had no choice), he built a mathematical system for his thoughts: differential and integral calculus! The mathematics resulting from his thoughts was a systematic continuation of them, with the same assumptions about space and time. That mathematics could not show him the right way to know the real world, because the world was not Newtonian! Today, many pages of modern physics are created based on new assumptions about space and time and other seemingly obvious variables!

Now, why do we think that these pages of current mathematics necessarily lead to correct knowledge of the world? Can we finally identify the world, as it is, by adopting appropriate and correct assumptions?

Are there certain methods, for instance t-tests or ANOVAs, suited to the particular way a survey question is asked?

Actually, I am working on modeling the path loss between the coordinator and the sensor nodes of a BAN network. My objective is to make a performance comparison between the CM3A model of the **IEEE 802.15.6** standard and a loss model that I have implemented mathematically. So, according to your respected experience, how can I implement these two path loss models? Do I have to define both path loss equations under the Wireless Channel model? Or do I create and implement for each path loss model a specific module under Castalia **(like the wireless channel module)** and then call it from the omnet.ini file (configuration file)? You will find the two models attached in a figure.

Thanks in advance

Many people believe that x-t spacetime is separable and that describing x-t as an inseparable unit block is only essential when the speed of the object concerned approaches that of light.

This is the most common error in mathematics as I understand it.

The universe has been expanding since the time of the Big Bang at almost the speed of light, and this may be the reason why the results of classical mathematics fail and become less accurate than those of the stochastic B-matrix (or any other suitable matrix), even in the simplest situations such as double and triple integration.

The congruent number problem has been a fascinating topic in number theory for centuries, and it continues to inspire research and exploration today. The problem asks whether a given positive integer can be the area of a right-angled triangle with rational sides. While this problem has been extensively studied, it is not yet fully understood, and mathematicians continue to search for new insights and solutions.

In recent years, there has been increasing interest in generalizing the congruent number problem to other mathematical objects. Some examples of such generalizations include the elliptic curve congruent number problem, which asks for the existence of rational points on certain elliptic curves related to congruent numbers, and the theta-congruent number problem as a variant, which considers the possibility of finding fixed-angled triangles with rational sides.

However, it is worth noting that not all generalizations of the congruent number problem are equally fruitful or meaningful. For example, one might consider generalizing the problem to arbitrary objects, but such a generalization would likely be too broad to be useful in practice.

Therefore, the natural question arises: what is the most fruitful and meaningful generalization of the congruent number problem to other mathematical objects? Any ideas are welcome.

Here are some articles:

M. Fujiwara, "θ-congruent numbers", in: Number Theory (Eger, 1996), de Gruyter, Berlin, 1998, pp. 235–241.

Tsubasa Ochiai, "New generalizations of congruent numbers", DOI: 10.1016/j.jnt.2018.05.003.

Larry Rolen, "A generalization of the congruent number problem".

Is the Arabic book about the congruent number problem cited correctly in the references? If anyone has any idea where I can find the Arabic version, it will be helpful. The link to the book is https://www.qdl.qa/العربية/archive/81055/vdc_100025652531.0x000005.

EDIT1:

I will present a family of elliptic curves in the same spirit as the congruent number elliptic curves.

This family exhibits similar patterns as the congruent number elliptic curves, including the property that the integer is still "congruent" if we take its square-free part, and there is evidence for a connection between congruence and positive rank (as seen in the congruent cases of $n=5,6,7$).
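A concrete way to experiment with such generalizations is to verify candidate rational triangles directly with exact arithmetic. A small sketch using Python's `fractions` (the function name is mine; the witness triangles are the classical ones for n = 6 and n = 5):

```python
from fractions import Fraction as F

def is_congruent_witness(a, b, c, n):
    """True if (a, b, c) is a right triangle with rational sides and area n."""
    return a * a + b * b == c * c and F(1, 2) * a * b == n

# n = 6 is congruent: the (3, 4, 5) triangle has area 6.
assert is_congruent_witness(F(3), F(4), F(5), 6)
# n = 5 is congruent: a classical witness is (3/2, 20/3, 41/6).
assert is_congruent_witness(F(3, 2), F(20, 3), F(41, 6), 5)
print("both witnesses check out")
```

Exact rationals matter here: floating-point arithmetic would silently accept near-misses, whereas `Fraction` comparisons test the Pythagorean and area conditions exactly, which is what the number-theoretic definition requires.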

**MATHEMATICS VS. CAUSALITY:**

**A SYSTEMIC RECONCILIATION**

Raphael Neelamkavil, Ph.D., Dr. phil.

1. Preface on the Use of Complex Language

2. Prelude on the Pre-Scientific Principle of Causality

3. Mathematical “Continuity and Discreteness” Vs. Causal Continuity

4. Mathematics and Logic within Causal Metaphysics

5. Mathematics, Causality, and Contemporary Philosophical Schools

**1. Preface on the Use of Complex Language**

First of all, a cautious justification is in place about the complexity one may experience in the formulations below: When I publish anything, the readers have the right to ask me constantly for further justifications of my arguments and claims. And if I have the right to anticipate some such possible questions and arguments, I will naturally attempt to be as detailed and systemic as possible in my formulation of each sentence here and now. A sentence is merely a part of the formulated text. After reading each sentence, you may pose me questions, which certainly cannot all be answered well within the sentences or soon after the sentences in question, because justification is a long process.

Hence, my sentences may tend to be systemically complex. A serious reader will not find these arguments getting too complex, because such a person has further unanswered questions. We do not purposely make anything complex. Our characterizations of meanings in mathematics, physics, philosophy, and logic can be complex and prohibitive for some. But would we all accuse these disciplines or the readers if the readers find them all complex and difficult? In that case, I could be excused too. I do not intentionally create a complex state of affairs in these few pages; but there are complexities here too. I express my helplessness in case any one finds these statements complex.

The languages of both science and philosophy tend to be complex and exact. This, nevertheless, should be tolerated provided the purpose is understood and practiced by both the authors and the readers. Ordinary language has its own worth and power. If I give a lecture, I do not always use so formal a language as when I write, because I am there to re-clarify.

But the Wittgensteinian obsession with “ordinary” language does not make him use an ordinary language in his own works. Nor does the Fregean phobia about it save him from falling into the same ordinary-language naïveté of choosing concrete and denotative equivalence between terms and their reference-objects without a complex ontology behind them. I attempt to explain the complex ontology behind the notions that I use.

**2. Prelude on the Pre-Scientific Principle of Causality**

Which are the ultimate conditions implied by the notion of existence (To Be), without which conditions implied nothing exists, and without which sort of existents nothing can be discoursed? Anything exists non-vacuously. This implies that existents are inevitably in Extension (having parts, each of which is further extended and not vacuous). The parts will naturally have some contact with a finite number of others. That is, everything is in Change (impacting some other extended existents).

Anything without these two characteristics cannot exist. If not in Change, how can something exist in the state of Extension alone? And if not in Extension, how can something exist in the state of Change alone? Hence, Extension-Change are two fundamental ontological categories of all existence and the only two exhaustive implications of To Be. Any unit of causation with one causal aspect and one effect aspect is termed a process.

These conditions are ultimate in the sense that they are implied by To Be, not as the secondary conditions for anything to fulfil after its existence. Thus, “To Be” is not merely of one specific existent, but of all existents. Hence, Extension-Change are the implications of the To Be of Reality-in-total. Physical entities obey these implications. Hence, they must be the foundations of physics and all other sciences. Theoretical foundations, procedures, and conclusions based on these implications in the sciences and philosophy, I hold, are wise enough.

Extension-Change-wise existence is what we understand as Causality: extended existents and their parts exert impacts on other extended existents. Every part of existents does it. That is, if anything exists, it is in Causation. This is the principle of Universal Causality. In short, Causality is not a matter to be decided in science – whether there is Causality or not in any process under experiment and in all existents is a matter for philosophy to decide, because philosophy tends to study all existents. Science can ask only whether there occurs any specific sort of causation or not, because each science has its own restricted viewpoint of questions and experiments and in some cases also restrictions in the object set.

Thus, statistically mathematical causality is not a decision as to whether there is causation or not in the object set. It is not a different sort of causation, but a measure of the extent of determination of special causes that we have made at a given time.

*Even the allegedly “non-causal” quantum-mechanical constituent processes* are mathematically and statistically circumscribed measuremental concepts from the results of Extended-Changing existents and, *ipso facto*, the realities behind these statistical measurements are in Extension-Change if they are physically existent. Space is the measured shape of Extension; time is that of Change. Therefore, space and time are epistemic categories. How then can statistical causality based only on measuremental data be causality at all, if the causes are all in Extension-Change and if Universal Causality is already the pre-scientific Law under which all other laws appear? No part of an existent is non-extended and non-changing. One unit of cause and effect may be called a process. Every existent and its parts are processual.

And how can a so-called random cause be a cause, except when the randomness is the extent of our measuremental reach of the cause, which already is causal because of its Extension-Change-wise existence? Extension and Change are the very exhaustive meanings of To Be, and hence I call them the highest Categories of metaphysics, physical ontology, physics, and all science. Not merely philosophy but also science must obey these two Categories.

In short, everything existent is causal. Hence, Universal Causality is the highest pre-scientific Law, second conceptually only to Extension-Change and third to Existence / To Be. Natural laws are merely derivative. Since Extension-Change-wise existence is the same as Universal Causality, scientific laws are derived from Universal Causality, and not *vice versa*. *The relevance of metaphysics / physical ontology for the sciences is clear from the above.* **Today the sciences attempt to derive causality from the various scientific laws!**

Existents have some Activity and Stability. This is a fully physical fact. These two Categories may be shown to be subservient to Extension-Change and Causality. Pure vacuum (non-existence) is absence of Activity and Stability. Thus, entities, irreducibly, are active-stable processes in Extension-Change. Physical entities / processes possess finite Activity and Stability. Activity and Stability together belong to Extension; and Activity and Stability together belong to Change too.

That is, Stability is neither merely about space nor about Extension. Activity is neither merely about time nor about Change. There is a unique reason for this. There is no absolute stability nor absolute activity in the physical world. Hence, Activity is finite, which is by Extended-Changing processes; and Stability is finite, which is also by Extended-Changing processes. But the tradition still seems to parallelise Stability and Activity with space and time respectively. We consider Activity and Stability as sub-Categories, because they are based on Extension-Change, which together add up to Universal Causality; and each unit of cause and effect is a process.

These are not Categories that belong to merely imaginary counterfactual situations. The Categories of Extension-Change and their sub-formulations are all about existents. There can be counterfactuals that signify cases that appertain existent processes. But separating these cases from some of the useless logical talk as in linguistic-analytically tending logic, philosophy, and philosophy of science is near to impossible.

Today physics and the various sciences do at times something like the said absence of separation of counterfactual cases from actual in that they indulge in particularistically defined terms and procedures, by blindly thinking that counterfactuals can directly represent the physical processes under inquiry. Concerning mathematical applications too, the majority attitude among scientists is that they are somehow free from the physical world.

Hence, without a very general physical ontology of Categories that are applicable to all existent processes and without deriving the mathematical foundations from these Categories, the sciences and mathematics are in gross handicap. Mathematics is no exception in its applicability to physical sciences. Moreover, pure mathematics too needs the hand of Extension and Change, since these are part of the ontological universals, form their reflections in mind and language, etc., thus giving rise to mathematics.

The exactness within complexity that could be expected of any discourse based on the Categorial implications of To Be can only be such that (1) the denotative terms ‘Extension’ and ‘Change’ may or may not remain the same, (2) but the two dimensions of Extension and Change – that are their aspects in ontological universals – would be safeguarded both physical-ontologically and scientifically.

That is, definitional flexibility and openness towards re-deepening, re-generalizing, re-sharpening, etc. may even change the very denotative terms, but the essential Categorial features within the definitions (1) will differ only meagrely, and (2) will normally be completely the same.

**3. Mathematical “Continuity and Discreteness” Vs. Causal “Continuity”**

The best examples for the above are mathematical continuity and discreteness that are being attributed blindly to physical processes due to the physical absolutization of mathematical requirements. But physical processes are continuous and discrete only in their Causality. This is nothing but Extension-Change-wise discrete causal continuity. At any time, causation is present in anything, hence there is causal continuity. This is finite causation and hence effects finite continuity and finite discreteness. But this is different from absolute mathematical continuity and discreteness.

I believe that it is common knowledge that mathematics and its applications cannot prove Causality directly. What are the bases of the problem of incompatibility of physical causality within mathematics and its applications in the sciences and in philosophy? The main but general explanation could be that mathematical explanations are not directly about the world but are applicable to the world to a great extent.

It is good to note that *mathematics is a separate science as if its “objects” were existent, but in fact as non-existent and different from those of any other science – thus creating mathematics into an abstract science in its theoretical aspects of rational effectiveness*. Hence, mathematical explanations can at the most only show the ways of movement of the processes and not demonstrate whether the ways of the cosmos are by causation.

*Moreover, the basic notions of mathematics* **(number, number systems, points, shapes, operations, structures, etc.) are all universals / universal qualities / ontological universals that belong to groups of existent things that are irreducibly Extension-Change-type processes. (See below.)** Thus, mathematical notions have their origin in ontological universals and their reflections in mind (connotative universals) and in language (denotative universals). The basic nature of these universals is ‘quantitatively qualitative’. We shall not discuss this aspect here at length.

No science and philosophy can start without admitting that the cosmos exists. If it exists, it is not nothing, not non-entity, not vacuum.

*Non-vacuous existence means that the existents are non-vacuously extended*. This means they have parts. Every part has parts too, *ad libitum*, because each part is extended. None of the parts is an infinitesimal. They can be near-infinitesimal. This character of existents is Extension, a Category directly implied by To Be. Similarly, any extended being’s parts are active, moving.

*This implies that every part has impact on some others, not on infinite others*. This character of existents is Change. No other implication of To Be is so primary as these. Hence, they are exhaustive of the concept of To Be, which belongs to Reality-in-total. These arguments show us the way to conceive the meaning of causal continuity.

Existence in Extension-Change is what we call Causality. If anything is existent, it is causal – hence Universal Causality is the trans-science physical-ontological Law of all existents. By the very concept of finite Extension-Change-wise existence, it becomes clear that no finite space-time is absolutely dense with existents. In fact, space-time is no ontological affair, but only epistemological, and existent processes need measurementally accessible finite space for Change. Hence, *existents cannot be mathematically continuous*. Since there is Change and transfer of impact, no existent can be absolutely discrete in its parts or in connection with others.

Can logic show the necessity of all existents to be causal? We have already discussed how, ontologically, the very concept of To Be implies Extension-Change and thus also Universal Causality. Logic can only be instrumental in this.

What about the ability or not of logic to conclude to Universal Causality? In my arguments above and elsewhere showing Extension-Change as the very exhaustive meaning of To Be, I have used mostly only the first principles of ordinary logic, namely, Identity, Contradiction, and Excluded Middle, and then argued that *Extension-Change-wise existence is nothing but Universal Causality if everything existing is non-vacuous in existence*.

For example, does everything exist or not? If yes, let us call it non-vacuous existence. Hence, Extension is the first major implication of To Be. Non-vacuous means extended, because if not extended the existent is vacuous. If extended, everything has parts.

*Having parts implies distances, however minute, between all the near-infinitesimal parts of any existent process*. In this sense, the basic logical laws do help conclude the causal nature of existents. A point of addition now has been Change. It is, so to say, from experience. But this need not exactly mean an addition.

*If existents have parts (i.e., if they are in Extension), the parts’ mutual difference already implies the possibility of contact between parts*. Thus, I am empowered to move to the meaning of Change basically as motion or impact. Naturally, everything in Extension must effect impacts. Everything has further parts. Hence, *by implication from Change and the need for there to be contacts between every near-infinitesimal set of parts of existents, everything causes changes by impacts*. In the physical world this is by finite impact formation. Hence, *nothing can exist as an infinitesimal*. Leibniz’s monads have no significance in the real world.

Thus, we conclude that Extension-Change-wise existence is Universal Causality, and every actor in causation is a real existent, not a non-extended existent, as energy particles seem to have been considered and are even today thought to be, due to their unit-shape yielded merely for the sake of mathematical applications. It is thus natural to claim that Causality is a pre-scientific Law of Existence, where *existents are all inwardly and outwardly in Change, i.e., in impact formation – otherwise, the concept of Change would lose meaning*.

In such foundational questions like To Be and its implications, the first principles of logic must be used, because these are the foundational notions of all science and no other derivative logical procedure comes in as handy. In short, logic with its fundamental principles can help derive Universal Causality. Thus, *Causality (Extension-Change) is more primary to experience than the primitive notions of mathematics*. But the applicability of these three logical Laws is not guaranteed so well in arguments using derivative, less categorial, sorts of concepts.

I suggest that the crux of the problem of mathematics and causality is the dichotomy between mathematical continuity and mathematical discreteness on the one hand and the incompatibility of applying any of them directly on the data collected / collectible / interpretable from some layers of the phenomena which are from some layers of the object-process in question. Not recognizing the presence of such *stratificational debilitation of epistemic directness* is an epistemological foolishness. Science and philosophy, in my opinion, are victims of this. Thus, for example, the *Bayesian statistical theory recognizes only a statistical membrane between reality and data*!

Here I point at the avoidance of the problem of stratificational debilitation of epistemic directness, by the centuries of epistemological foolishness, by reason of the forgetfulness of the ontological and epistemological relevance of expressions like ‘from some layers of data from some layers of phenomena from some layers of the reality’.

This is the point at which it is time to recognize the gross violence against natural reason behind phrases and statements involving ‘data from observation’, ‘data from phenomena’, ‘data from nature / reality’, etc., *without epistemological and ontological sharpness in both science and philosophy to accept these basic facts of nature*. As we all know, this state of affairs has become irredeemable in the sciences today. The whole of what we used to call space is not filled with matter-energy. Hence, if causal continuity between partially discrete “processual” objects is the case, then the data collected / collectible cannot be the very processual objects and hence cannot provide all knowledge about the processual objects. But mathematics and all other research methodologies are based on human experience and thought based on experience.

This theoretical attitude facilitates and accepts in a highly generalized manner the following three points:

(1) Mathematical continuity (in any theory, and under any amount of axiomatization of logical, mathematical, physical, biological, social, and linguistic theories) is totally non-realizable in nature as a whole and in its parts: because (a) the necessity of mathematical approval of any sort of causality in the sciences, by means of its systemic physical ontology, falls miserably short in actuality, and (b) logical continuity of any kind does not automatically make linguistically or mathematically symbolized representational activity adequate to represent the processual nature of entities as derived from data.

(2) The concept of absolute discreteness in nature, which, as of today, is ultimately of the quantum-mechanical type based on Planck’s constant, continues to be a mathematical and partial misfit in the physical cosmos and its parts, (a) if there exist other universes that may causally determine the constant differently at their specific expansion and/or contraction phases, and (b) if there are an infinite number of such finite-content universes.

The case may not of course be so problematic in non-quantifiable “possible worlds” due to their absolute causal disconnection or their predominant tendency to causal disconnection, but this is a mere common-sense, merely mathematical, compartmentalization: because (a) the aspect of the causally processual connection between any two quanta is logically and mathematically alienated in the physical theory of Planck’s constant, and (b) the possible worlds have only a non-causal existence, and hence, anything may be determined in this world as a constant, and an infinite number of possible universes may be posited without any causal objection!

It is usually not kept in mind here by physicists that the epistemology of unit-based thinking – of course, based on quantum physics or not – is implied by the almost unconscious tendency of symbolic activity of body-minds. This need not have anything to do with a physics that produces laws for all existent universes.

(3) The only viable and thus the most reasonably generalizable manner of being of the physical cosmos and of biological entities is that of existence in an Extended (having parts) and Changing manner (extended entities and their parts impacting a finite number of other existents and their parts in a finite quantity and in a finite duration). Existence in the Extension-Change-wise manner is nothing but causal activity.

Thus, insofar as everything is existent, every existent is causal. There is no time (i.e., no minute measuremental iota of Change) wherein such causal manner of existing ceases in any existent. This is

*causal continuity between partially discrete processual objects*. This is not mathematizable in a discrete manner. The concept of geometrical and number-theoretic continuity may apply. But if there are other universes, the Planck constant of proportionality that determines the proportion of content of discreteness may change in the others. This is not previsioned in terrestrially planned physics. The attitude of treating everything as causal may also be characterized by self-aware symbolic activity itself, in which certain instances of causation are avoided or enhanced, decrementally or incrementally as the case may be, but not absolutely.

*This, at the most, is what may be called freedom*. It is fully causal – it need not be sensed as causal within a specific set of parameters, but it is causal within the context of Reality-in-total. But the whole three millennia of the psychological and religious (contemplative) tradition of basing freedom merely on awareness intensity, and not on love – this is a despicable state of affairs, on which a book-length treatise would be necessary.

Physics and cosmology even today tend to make the cosmos either (1) mathematically presupposedly continuous, or (2) discrete with defectively ideal mathematical status for causal continuity and with perfectly geometrical ideal status for specific beings, or (3) statistically indeterministic, thus being compelled to consider everything as partially causal, or even non-causal in the interpretation of statistics’ orientation to epistemically logical decisions and determinations based on data. If this has not been the case, can anyone suggest proofs for an alleged existence of a different sort of physics and cosmology until today?

The statistician does not even realize (1) that Universal Causality is already granted by the very existence of anything, and (2) that what they call non-causality is merely not being the cause, or not having been discovered as the cause, of a specific set of selected data or processes. Such non-causality is not non-causality with respect to all existents. Quantum physics, statistical physics, and cosmology are replete with examples of this empirical and technocratic treachery of the notion of science.

A topologically and mereologically clean physical ontology of *causal continuity between partially discrete processual objects*, fully free of absolutely continuity-oriented or absolutely discreteness-oriented category theory, geometry, topology, functional analysis, set theory, and logic, is yet to be born. Hence, the fundamentality of Universal Causality in its deep roots in the very concept of the To Be (namely, in the physical-ontological Categories of Extension and Change) of all physically and non-vacuously existent processes is alien to physics and cosmology until today.

Non-integer rational numbers are not the direct notion of anything existent. Even a part of a unit process has the attribute ‘unity’ in all the senses in which any other object possesses it. For this reason, natural numbers have Categorial priority over rational numbers, because natural numbers are more directly related to ontological universals than other sorts of numbers are. Complex numbers, for example, form the most general number system with respect to their mathematically defined sub-systems, but this does not mean that they are more primary in the metaphysics of ontological universals, since the primary mode of numerically quantitative qualities / universals is that of natural numbers.

**4. Mathematics and Logic within Causal Metaphysics**

Hence, it is important to define the limits of applicability of mathematics to a physics that uses physical data (under the species of the various layers of their origin). This is the only way to reach beyond the data and beyond the methodologically derived conclusions from the data. How and on what levels this is to be done is a matter to be discussed separately.

The same may be said also about logic and language. Logic is the broader rational picture of mathematics. Language is the symbolic manner of application of both logic and its quantitatively qualitative version, namely, mathematics, with respect to specific fields of inquiry. Here I do not explicitly discuss ordinary conversation, literature, etc.

We may do well to instantiate logic as the formulated picture of reason. But human reason is limited to the procedures of reasoning by brains. What exactly is the reason that existent physical processes constantly undergo? How to get at conclusions based on this reason of nature – by using our brain’s reasoning – and thus transcend at least to some extent the limitations set by data and methods in our brain’s reasoning?

If we may call the universal reason of Reality-in-total by a name, it is nothing but Universal Causality. It is possible to demonstrate that Universal Causality is a trans-physical, trans-scientific Law of Existence. This argument needs clarity. How to demonstrate this as the case? This has been done in an elementary fashion in the above, but more of it is not to be part of this discussion.

Insistence on mathematical continuity in nature is a mere idealization. It expects nature to obey our merely epistemic sort of idealizations, that is, in ideal cases based mostly on the brain-interpreted concepts from some layers of data, which are from some layers of phenomena, which are from some layers of the reality under observation. Some of the best examples in science are the suppositions that virtual worlds are existent worlds, dark energy is a kind of propagative energy, zero-value cosmic vacuum can create an infinite number of universes, etc.

The processes outside are vaguely presented primarily by the processes themselves, but highly indirectly, in a natural manner. This is represented by the epistemic / cognitive activity within the brain in a natural manner (by the connotative universals in the mind as reflections of the ontological universals in groups of object processes), and then idealized via concepts expressed in words, connectives, and sentences (not merely linguistic but also mathematical, computerized, etc.) by the symbolizing human tendency (thus creating denotative universals in words) to capture the whole of the object by use of a part of the human body-mind.

The symbolizing activity is based on data, but the data are not all we have as end results. We can mentally recreate the idealized results behind the multitude of ontological, connotative, and denotative universals as existents.

As the procedural aftermath of this, virtual worlds begin to “exist”, dark energy begins to “propagate”, and zero-value cosmic vacuum “creates” universes. Even kinetic and potential energies are treated as propagative energies existent outside of material bodies and supposed to be totally different from material bodies. These are mere theoretically interim arrangements in the absence of direct certainty for the existence or not of unobservables.

Insistence on mathematical continuity in nature as a natural conclusion by the application of mathematics to nature is what happens in all physical and cosmological (and of course other) sciences insofar as they use mathematical idealizations to represent existent objects and processes and extrapolate further beyond them. Mathematical idealizations are another version of linguistic symbolization and idealization.

Logic and its direct quantitatively qualitative expression as found in mathematics are, of course, powerful tools. But, as being part of the denotative function of symbolic language, they are tendentially idealizational. By use of the same symbolizing tendency, it is perhaps possible to a certain extent to

*de-idealize* the side-effects of the same symbols in the language, logic, and mathematics being used in order to symbolically idealize representations. Merely mathematically following physical nature in whatever it is in its part-processes is a debilitating procedure in science and philosophy (and even in the arts and humanities), if this procedure is not de-idealized effectively.

*If this is possible at least to a small and humble extent, why not do it? Why not de-idealize the side-effects of mathematics too?* Our language, logic, and mathematics too do their functions well, although they are equally unable to capture the whole of Reality in whatever it is, wholly or in parts, far beyond the data and their interpretations! This theoretical attitude of partially de-idealizing the effects of human symbolizing activity by use of the same symbolic activity accepts the existence of processual entities as whatever they are.

*This is what I call ontological commitment* – of course, different from and more generalized than those of Quine and others. Perhaps such a generalization can give a slightly better concept of reality than is possible by the normally non-self-aware symbolic activity in language, logic, and mathematics.

**5. Mathematics, Causality, and Contemporary Philosophical Schools**

With respect to what we have been discussing,

*linguistic philosophy*, and even its more recent causalist child, namely *dispositionalist causal ontology*, have even today the following characteristics:

(1) They attribute an even now overly discrete nature to “entities” in the extent of their causal separateness from others while considering them as entities. The ontological notion of an object, or even of an event, in its unity in analytic philosophy, and in particular in modal ontology, forecloses consideration of the process nature of each such unity within, on par with interactions of such units with one another. (David Lewis,

*Parts of Classes*, p. vii) This is done without ever attempting to touch the deeply Platonic (better, geometrically atomistic) shades of common-sense Aristotelianism, Thomism, Newtonianism, Modernism, Quantum Physics, etc., and without reconciling the diametrically opposite geometrical tendency to make every physical representation continuous.

(2) They are logically comatose about the impossibility of the exactly referential definitional approach to the processual demands of existent physical objects without first analyzing and resolving the metaphysical implications of existent objects, namely, being irreducibly in finite Extension and Change and thus in continuous Universal Causality in finite extents at any given moment.

(3) They are unable to get at the

*causally fully continuous* (neither mathematically continuous nor geometrically discontinuous) nature of the physical-ontologically “partially discrete” processual objects in the physical world, also because they have misunderstood the discreteness of processual objects (including quanta) within stipulated periods as typically universalizable, due to their pragmatic approach in physics and the involvement of the notion of continuity of time.

*Phenomenology* has done a lot to show the conceptual structures of ordinary reasoning, physical reasoning, mathematical and logical thinking, and reasoning in the human sciences. But due to its lack of commitment to building a physical ontology of the cosmos, and due to its purpose as a research methodology, phenomenology has failed to an extent to show the nature of causal continuity (instead of mathematical continuity) in physically existent, processually discrete objects in nature.

*Hermeneutics* has just followed the human-scientific interpretative aspect of Husserlian phenomenology and projected it as a method. Hence, it was no contender to accomplish the said feat.

*Postmodern philosophies* qualified all science and philosophy as being perniciously cursed to be “modernistic” – thus monsterizing all compartmentalization, rules, laws, axiomatization, discovery of regularities in nature, logical rigidity, and even metaphysical grounding as insurmountable curses of the human project of knowing and as synonyms for all that is unapproachable in science and thought. The linguistic-analytic philosophy of the later Wittgenstein too was no exception to this nature of postmodern philosophies – a matter that many Wittgenstein followers do not notice. Take a look at the first few pages of Wittgenstein’s *Philosophical Investigations*, and the matter will be more than clear.

*The philosophies of the sciences* seem today to follow the beaten paths of extreme pragmatism in linguistic-analytic philosophy, physics, mathematics, and logic, which lack a *foundational concept of causally concrete and processual physical existence*.

Hence, it is useful for the growth of science, philosophy, and the humanities alike to research into the *causal continuity between partially discrete “processual” objects* and to forget about absolute mathematical continuity or discontinuity in nature. Mathematics and the physical universe are to be reconciled in order to mutually delimit them in terms of the causal continuity between partially discrete processual objects.

Bibliography

*(1) Gravitational Coalescence Paradox and Cosmogenetic Causality in Quantum Astrophysical Cosmology*, 647 pp., Berlin, 2018.

*(2) Physics without Metaphysics? Categories of Second Generation Scientific Ontology*, 386 pp., Frankfurt, 2015.

*(3) Causal Ubiquity in Quantum Physics: A Superluminal and Local-Causal Physical Ontology*, 361 pp., Frankfurt, 2014.

*(4) Essential Cosmology and Philosophy for All: Gravitational Coalescence Cosmology*, 92 pp., KDP Amazon, 2022, 2nd Edition.

*(5) Essenzielle Kosmologie und Philosophie für alle: Gravitational-Koaleszenz-Kosmologie*, 104 pp., KDP Amazon, 2022, 1st Edition.

Insistence on mathematical continuity in nature is a mere idealization. It expects nature to obey our idealization. This is what happens in all physical and cosmological (and of course other) sciences as long as they use mathematical idealizations to represent existent objects / processes.

But mathematically following nature in whatever it is in its part-processes is a different procedure in science and philosophy (and even in the arts and humanities). This theoretical attitude accepts the existence of processual entities as what they are.

This theoretical attitude accepts in a highly generalized manner that

(1) mathematical continuity (in any theory and in terms of any amount of axiomatization of physical theories) is not totally realizable in nature as a whole and in its parts: because the necessity of mathematical approval in such a cosmology falls short miserably,

(2) absolute discreteness (even QM type, based on the Planck constant) in the physical cosmos (not in non-quantifiable “possible worlds”) and its parts is a mere commonsense compartmentalization from the "epistemology of piecemeal thinking": because the aspect of the causally processual connection between any two quanta is logically and mathematically alienated in the physical theory of Planck’s constant, and

(3) hence, the only viable and thus the most reasonably generalizable manner of being of the physical cosmos and of biological entities is that of CAUSAL CONTINUITY BETWEEN PARTIALLY DISCRETE PROCESSUAL OBJECTS.

PHYSICS and COSMOLOGY even today tend to make the cosmos mathematically either continuous or defectively discrete or statistically oriented to merely epistemically probabilistic decisions and determinations.

Can anyone suggest here the existence of a different sort of physics and cosmology that one may have witnessed until today? A topology and mereology of CAUSAL CONTINUITY BETWEEN PARTIALLY DISCRETE PROCESSUAL OBJECTS, fully free of discreteness-oriented category theory and functional analysis, is yet to be born.

Hence, causality in its deep roots in the very concept of To Be is alien to physics and cosmology till today.


I read in a maths popularization book by Steven Strogatz that 1+3=4, 1+3+5=9, 1+3+5+7=16, and so on; what would be the hypothesis when trying to demonstrate this striking 'fact'?
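The hypothesis to prove is the classical identity that the sum of the first n odd numbers equals n²: 1 + 3 + 5 + … + (2n − 1) = n². Induction does it in one line (adding the next odd number, 2n + 1, to n² gives (n + 1)²), and a quick numerical check is easy:

```python
# check that 1 + 3 + 5 + ... + (2n - 1) = n^2 for the first hundred n
for n in range(1, 101):
    s = sum(2 * k - 1 for k in range(1, n + 1))  # sum of the first n odd numbers
    assert s == n * n
print("verified for n = 1..100")
```

Geometrically, the same fact is visible by wrapping an L-shaped strip of 2n + 1 dots around an n × n square of dots to form an (n + 1) × (n + 1) square.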

Physics

The physicist betting that space-time isn't quantum after all

Most experts think we have to tweak general relativity to fit with quantum theory. Physicist Jonathan Oppenheim isn't so sure, which is why he’s made a 5000:1 bet that gravity isn’t a quantum force

By Joshua Howgego

13 March 2023

[Illustration: Nabil NEZZAR]

JONATHAN OPPENHEIM likes the occasional flutter, but the object of his interest is a little more rarefied than horse racing or the one-armed bandit. A quantum physicist at University College London, Oppenheim likes to make bets on the fundamental nature of reality – and his latest concerns space-time itself.

The two great theories of physics are fundamentally at odds. In one corner, you have general relativity, which says that gravity is the result of mass warping space-time, envisaged as a kind of stretchy sheet. In the other, there is quantum theory, which explains the subatomic world and holds that all matter and energy comes in tiny, discrete chunks. Put them together and you could describe much of reality. The only problem is that you can’t put them together: the grainy mathematics of quantum theory and the smooth description of space-time don’t mesh.

Most physicists reckon the solution is to “quantise” gravity, or to show how space-time comes in tiny quanta, like the three other forces of nature. In effect, that means tweaking general relativity so it fits into the quantum mould, a task that has occupied researchers for almost a century already. But Oppenheim wonders if this assumption might be mistaken, which is why he made a 5000:1 bet that space-time isn’t ultimately quantum.


This is actually a trivial question and I'm just being mischievous.

It turns on the shades of meaning of both "idea" and "exist."

Mathematically, a concept exists whether anyone has happened upon it or not. (A meaningless attempt at a concept is not a concept).

When first thought about by an actual brain of any kind, a concept acquires its first glimmer of existence in the real world.

Is it possible to mathematically calculate K-40 from total K determined by ICP-MS in sediment samples?
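Since K-40 occurs at a fixed natural isotopic abundance (about 0.0117 atom %), its mass and activity can indeed be estimated arithmetically from total K. A sketch; the abundance, half-life, and atomic-mass constants below are standard values that should be verified against current nuclear-data tables:

```python
import math

# standard nuclear-data constants (verify against current tables)
ABUNDANCE_K40 = 1.17e-4            # atom fraction of K-40 in natural potassium
M_K = 39.0983                      # g/mol, natural potassium
M_K40 = 39.964                     # g/mol, K-40
HALF_LIFE_S = 1.248e9 * 3.156e7    # K-40 half-life (~1.248e9 yr) in seconds
N_A = 6.022e23                     # Avogadro's number

def k40_from_total_k(mass_k_g):
    """Mass of K-40 (g) and its activity (Bq) in a given mass of natural K."""
    atoms_k40 = mass_k_g / M_K * N_A * ABUNDANCE_K40
    mass_k40 = atoms_k40 * M_K40 / N_A
    activity = math.log(2) / HALF_LIFE_S * atoms_k40   # A = lambda * N
    return mass_k40, activity

m, a = k40_from_total_k(1.0)   # per gram of natural potassium
```

For one gram of natural potassium this gives roughly 1.2 × 10⁻⁴ g of K-40 and an activity on the order of 31 Bq. Note that ICP-MS reports elemental K, so the conversion assumes a natural, non-fractionated isotopic composition.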

Physics is a science of representations with mathematical aspects in them, foremost, and not of naked correlations and parameter analysis.

It also has competent conceptualizations and ingenious principles.

Even the innocent-seeming uniform motion is a representational scheme for motions under the theory of kinematics. (Representations are separate from reality but are an invaluable part of scientific inferring, predicting, explaining, etc.) For example, heat is represented as a flow between subsystems. Representations change: e.g., Einstein found the curved-spacetime representation for gravitational phenomena.

Physics is also the science of cosmology. It has no meaning if it bypasses the universe, i.e., the sum of subsystems. This discipline has problems because we cannot take ourselves out of it and study it, but physics has tools for this (QM) or theoretical approximations (a more cognitively open consideration of the concept of boundary conditions).

We assume that this statement is false, but one of the most common mathematical errors.

So a question arises: what is the importance of the LHS diagonal?

I have two networks, and wish to get them to dynamically interact with one another, yet retain modularity.

Category theory is a branch of mathematics that deals with the abstract structure of mathematical concepts and their relationships. While category theory has been applied to various areas of physics, such as quantum mechanics and general relativity, it is currently not clear whether it could serve as the language of a metatheory unifying the description of the laws of physics.

There are several challenges to using category theory as the language of a metatheory for physics. One challenge is that category theory is a highly abstract and general framework, and it is not yet clear how to connect it to the specific details of physical systems and their behaviour. Another challenge is that category theory is still an active area of research, and there are many open questions and debates about how to apply it to different areas of mathematics and science.

Despite these challenges, there are some researchers who believe that category theory could play a role in developing a metatheory for physics. For example, some have proposed that category theory could be used to describe the relationships between different physical theories and to unify them into a single framework. Others have suggested that category theory could be used to study the relationship between space and time in a more unified and conceptual way.

I am very interested in your experiences, opinions and ideas.
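At toy scale, the abstract structure described above can be made concrete: a category needs objects, morphisms, associative composition, and identity morphisms. A minimal sketch in Python, with types as objects and functions as morphisms (purely illustrative, with no claim about physics):

```python
# objects: Python types; morphisms: functions; composition of morphisms
def compose(g, f):
    """(g . f)(x) = g(f(x)) -- composition in the category of types and functions."""
    return lambda x: g(f(x))

identity = lambda x: x

f = lambda n: n + 1          # a morphism int -> int
g = lambda n: 2 * n          # a morphism int -> int
h = lambda n: n ** 2         # a morphism int -> int

# the category laws, checked pointwise on a sample value:
x = 3
assert compose(identity, f)(x) == f(x) == compose(f, identity)(x)      # identity laws
assert compose(h, compose(g, f))(x) == compose(compose(h, g), f)(x)    # associativity
print("category laws hold on the sample")
```

The gap the post describes is visible even here: the code fixes one concrete category, whereas the physical proposals require functorial relationships *between* whole theories, which is where the open research questions live.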

Is it mathematically justified to place negative and positive numbers on the same plane?

This question discusses the YES answer. We don't need the **√-1**. The complex numbers, using rational numbers (i.e., the Gauss set G) or mathematical real numbers (the set R), are artificial. Can they be avoided?

Math cannot be in one's head, as [1] explains.

To realize the YES answer, one must advance over current knowledge, and this may sound strange. But every path in a complex space must begin and end in a rational number -- anything that can be measured, or produced, must be a rational number. Complex numbers **are not** needed physically, as a number. But in algebra they **are** useful.

The YES answer can improve the efficiency of using numbers in calculations, although it **is** less advantageous in algebraic calculations, as in the well-known Gauss identity.

For example, in the FFT [2], there is no need to compute complex functions, or trigonometric functions.

This may lead to further improvement in computation time over the FFT, which already provides orders-of-magnitude improvement in computation time over the FT with mathematical real numbers. Both the FT and the FFT are revealed to be equivalent -- see [2].

I detail this in [3] for comments. Maybe one can build a faster FFT (or, FFFT)?

The answer may also consider further advances into quantum computing?

[2]

Preprint FT = FFT

[3]

Preprint The quantum set Q*
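One representational point adjacent to this thread can be illustrated concretely: a DFT of a real signal can be computed carrying each complex value as a pair of real numbers, so no complex datatype is needed. A sketch (note it still uses trigonometric functions, which the preprints above aim to avoid, so this shows only the representational half of the claim):

```python
import math

def real_dft(x):
    """DFT of a real signal carried as two real lists (no complex datatype)."""
    N = len(x)
    re, im = [], []
    for k in range(N):
        re.append(sum(x[n] * math.cos(2 * math.pi * k * n / N) for n in range(N)))
        im.append(-sum(x[n] * math.sin(2 * math.pi * k * n / N) for n in range(N)))
    return re, im

re, im = real_dft([1.0, 2.0, 0.0, -1.0])
```

The two returned lists agree component-by-component with the real and imaginary parts of the usual complex-valued DFT; the complex number here is bookkeeping, not an extra physical ingredient.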

I noticed that in some very bad models of neural networks, the value of R² (coefficient of determination) can be negative. That is, the model is so bad that the mean of the data is better than the model.

In linear regression models, the multiple correlation coefficient (R) can be calculated using the root of R². However, this is not possible for a model of neural networks that presents a negative R². In that case, is R mathematically undefined?

I tried calculating the correlation between y and y_pred (Pearson), but it is mathematically undefined (division by zero). I am attaching the values.

Obs.: The question is about artificial neural networks.
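Both observations can be reproduced in a few lines: a sufficiently biased model gives R² < 0, and a constant prediction makes the Pearson denominator zero, so r is undefined (0/0) rather than merely negative. A minimal sketch with made-up numbers:

```python
y = [1.0, 2.0, 3.0, 4.0]
y_pred = [10.0] * len(y)       # a (deliberately bad) constant prediction

mean_y = sum(y) / len(y)

# coefficient of determination: R^2 = 1 - SS_res / SS_tot
ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, y_pred))
ss_tot = sum((yi - mean_y) ** 2 for yi in y)
r2 = 1.0 - ss_res / ss_tot     # negative here: worse than predicting the mean

# Pearson r needs nonzero variance in both series; for a constant y_pred
# the variance below is zero, so r's denominator vanishes and r is undefined
mean_p = sum(y_pred) / len(y_pred)
var_p = sum((pi - mean_p) ** 2 for pi in y_pred)
```

So taking R = √(R²) is only meaningful when R² ≥ 0; once a model is worse than the mean baseline, the multiple correlation coefficient simply has no defined value for it.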

1 - Prof. Tegmark of MIT hypothesizes that the universe is not merely described by mathematics but IS mathematics.

2 - The Riemann hypothesis applies to the mathematical universe’s space-time, and says its infinite "nontrivial zeros" lie on the vertical line of the complex number plane (on the y-axis of Wick rotation).

3 - Implying infinity=zero, there's no distance in time or space - making superluminal and time travel feasible.

4 - Besides Mobius strips, topological propulsion uses holographic-universe theory to delete the 3rd dimension (and thus distance).

5 - Relationships between living organisms can be explained with scientifically applied mathematics instead of origin of species by biological evolution.

6 - Wick rotation - represented by a circle where the x- and y-axes intersect at its centre, and where real and imaginary numbers rotate counterclockwise between 4 quadrants - introduces the possibility of interaction of the x-axis' ordinary matter and energy with the y-axis' dark matter and dark energy.

Theoretical and computational physics provide the vision and the mathematical and computational framework for understanding and extending the knowledge of particles, forces, space-time, and the universe. A thriving theory program is essential to support current experiments and to identify new directions for high energy physics. Theoretical physicists provide a great deal of assistance to the Energy, Intensity, and Cosmic Frontiers with the in-depth understanding of the underlying theory behind experiments and interpreting the outcomes in context of the theory. Advanced computing tools are necessary for designing, operating, and interpreting experiments and to perform sophisticated scientific simulations that enable discovery in the science drivers and the three experimental frontiers.

source: HEP Theoretical and Computationa... | U.S. DOE Office of Science (SC) (osti.gov)

What does this mean (± 0.06), and how can I calculate it mathematically?
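In most papers, "± 0.06" attaches an uncertainty to a reported value, usually the standard deviation or the standard error of the mean (SEM) of replicate measurements; which one is meant should be stated in the paper's methods. Assuming SEM, a sketch with hypothetical replicate data:

```python
import math

# hypothetical replicate measurements (stand-ins; substitute your own data)
values = [2.31, 2.45, 2.38, 2.27, 2.44]

n = len(values)
mean = sum(values) / n
# sample standard deviation (n - 1 in the denominator)
sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
sem = sd / math.sqrt(n)            # standard error of the mean

print(f"{mean:.2f} ± {sem:.2f}")   # reported as mean ± SEM
```

If the source instead reports mean ± SD, drop the final division by √n; the figure caption or methods section normally says which convention is used.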

All attempts at a proof of the Riemann Hypothesis on the zeta function must fail.

There are no zeros of the so-called function of a complex argument.

A function of two different quantities, f(x, y), has values for a third quantity z [z = f(x, y)] only if the variables x and y are combined by an algebraic rule.

So it should be for the complex argument Riemann used.

But there is no such combination. So Riemann only did a 'scaling', where both parts of the complex number stay separate.

The second part of the refutation consists in showing a wrong expert opinion in mathematics, concerning the false use of 'imaginary' and 'prefixed multiplication'.

What are the properties of transversal risks in networks? Happy for applied examples and diffusion properties.

Project Name - Improving Achievement and Attitude through Co-operative learning in F. Y. B. Sc. Mathematics Class

What is missing is an exact definition of probability that would contain time as a dimensionless quantity woven into a 3D geometric physical space.

It should be mentioned that the current definition of probability as the relative frequency of successful trials is primitive and contains no time.

On the other hand, the quantum mechanical definition of the probability density as,

p(r,t) = ψ*(r,t)·ψ(r,t),

which introduces time via the system's destination time and not from its start time is of limited usefulness and leads to unnecessary complications.

It's just a sarcastic definition.

It should be mentioned that a preliminary definition of the probability function of space and time proposed in the Cairo technique led to revolutionary solutions of time-dependent partial differential equations, integration and differentiation, special functions such as the Gamma function, etc. without the use of mathematics.
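For concreteness, the Born-rule density quoted above can be checked numerically on a simple stationary state; the infinite-square-well ground state below is my own choice of example, not from the post:

```python
import math

# ground state of an infinite square well on [0, L]: psi(x) = sqrt(2/L)*sin(pi*x/L)
L = 1.0
N = 10000                      # number of grid intervals
dx = L / N
xs = [i * dx for i in range(N + 1)]
psi = [math.sqrt(2.0 / L) * math.sin(math.pi * xi / L) for xi in xs]

# Born-rule density p(x,t) = psi*(x,t) psi(x,t); psi is real here, so just psi^2
p = [a * a for a in psi]

# trapezoid-rule integral of p over [0, L]; normalization demands it equal 1
total = sum((p[i] + p[i + 1]) * dx / 2 for i in range(N))
```

For a stationary state the time-dependent phase cancels in ψ*ψ, which is exactly the feature the post objects to: the density carries the destination-time label t while being independent of how the state was prepared.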

Hello

I have an Excel file containing weather data for Missouri in the U.S. The data start on 25th July and end on 9th September 2014. For each day, data were recorded about 21 times (within 6 hours of solar noon).

How can I make a Type99 source file using this Excel file? I have already studied the mathematical reference in the TRNSYS help, but it was not very helpful. Thanks

The Gamma function,

G(n) = Integral from 0 to infinity [x^(n-1)*Exp(-x)]dx,

is of great mathematical and physical importance.

It can be calculated without numerical integration (for practical purposes) via its mathematical and physical properties:

i-minimum of Gamma occurs at x = 1.4616321 and the corresponding value of Gamma(x) is 0.8856032.

ii-Gamma(1.)=Gamma(2.)=1.

iii-Gamma(x) = (x-1)!

A simple preliminary approach that gives the value of Gamma(x) with an error less than 0.001 is the second-order polynomial expression for the factorial x,

(1 - 0.46163*x + 0.46163*x*x), x element of [0,1].

For example, this gives:

10.5! = Gamma(11.5) ≈ 11877478,

vs. the value 11899423.084 given by numerical integration,

and Gamma(1.4616) ≈ 0.88527 vs. 0.8856032.
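The scheme above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation: the function names are my own, the polynomial coefficients are the ones quoted above, and arguments larger than 1 are reduced to [0, 1] with the recurrence x! = x·(x - 1)!.

```python
def factorial_approx(x):
    """Approximate x! using the quoted second-order polynomial on [0, 1]
    and the recurrence x! = x * (x - 1)!."""
    result = 1.0
    # Reduce the argument until it lies in [0, 1].
    while x > 1.0:
        result *= x
        x -= 1.0
    # Second-order polynomial approximation of x! on [0, 1].
    return result * (1.0 - 0.46163 * x + 0.46163 * x * x)

def gamma_approx(x):
    """Gamma(x) = (x - 1)!, per property iii above."""
    return factorial_approx(x - 1.0)

print(factorial_approx(10.5))   # close to the quoted 11877478
print(gamma_approx(1.4616))     # close to the quoted 0.88527
```

Note that the error stays within the claimed 0.001 only on the reduced interval; for large arguments the absolute error grows with the recurrence factor, as the 10.5! example shows.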

There are a few points to consider in this issue.

Points in favour of the current emphasis

1. Math is the backbone of a physical theory. Good representation and good quantities in a theory, but bad math, make for a bad theory.

2. There is general skepticism about reconsidering the role of the mathematized approach in physics Masters syllabi, or about upgrading the role of literature reviews and essays.

3. Humans communicate, learn, think, and develop constructs via language.

Arguments against

1. Math is the element in a theory and in the "physics product" that is responsible for precision and prediction. Though indispensable, it exists in the minds of individuals and functions in parallel with conception and physical arguments.

2. Not all models in physics are mathematical; some are conceptual.

3. Formulating solutions to physics problems via mathematical techniques and methods is the definition of mathematical physics. However, this is only a certain percentage of the domain of skills, yet the syllabus focuses 100% on it.

Dear professors and students, greetings and courtesy. I would like to know whether the real numbers form the largest and final set of numbers, or whether there are larger sets of numbers that perhaps have not yet been discovered. Which is true? If the real numbers are the last set of numbers, what theorem proves that no larger set of numbers exists? And if there is a larger set, then, in terms of the history of mathematics, the answer to which mathematical problem proved that the solution obtained is not closed with respect to the set of complex numbers and belongs to a larger set? Thank you very much.

As the concept arises from the Bernoulli numbers and different branches of mathematics, I have recently been considering the importance of introducing the same concept, 'the unity of mathematics', within the context of the Bernoulli numbers and some special series (the Flint Hills and Cookson Hills series). I believe in the scenario of defining a balanced relationship between the effect of the Bernoulli numbers and series of hard convergence.

I am pointing out this potential link.

For a general conclusion about what I consider the concept of 'unity' between the Bernoulli numbers and the Flint Hills series should be, see the screenshot in:

DOI: 10.13140/RG.2.2.16745.98402

Which software is better for making high-quality graphs: Origin or Excel? Thank you.

Mathematics Teacher Educators (MTEs) best practices.

I'm interested in research literature about Mathematics Teacher Educators' (MTEs') best practices, especially MTEs' practices for teaching problem solving.

Thank you.

Good day, Dear Colleagues!

Anyone interested in discussing this topic?

How can I define histogram bins with a well-defined mathematical expression, derived from the data points x_i, i = 1, ..., n, and the range or other well-defined measures of the dataset?
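One standard closed-form answer is the Freedman–Diaconis rule, which sets the bin width from the interquartile range: h = 2·IQR·n^(-1/3), with the bin count k = ceil((max - min)/h). A minimal sketch in Python (the function name and the returned (count, edges) format are my own choices; other rules such as Sturges' k = ceil(log2 n) + 1 would slot in the same way):

```python
import math
import statistics

def freedman_diaconis_bins(data):
    """Bin count and edges from the Freedman-Diaconis rule:
    bin width h = 2 * IQR * n^(-1/3)."""
    n = len(data)
    q1, _, q3 = statistics.quantiles(data, n=4)  # quartiles of the sample
    iqr = q3 - q1
    h = 2.0 * iqr * n ** (-1.0 / 3.0)
    lo, hi = min(data), max(data)
    # Guard against zero IQR (e.g. heavily tied data): fall back to one bin.
    k = max(1, math.ceil((hi - lo) / h)) if h > 0 else 1
    edges = [lo + i * (hi - lo) / k for i in range(k + 1)]
    return k, edges
```

Because it uses the IQR rather than the full range, this rule is robust to outliers, which is usually the deciding factor against range-based rules.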

Kindly share with me any details of Scopus-indexed Mathematics conferences in India.

Physics continues a tradition of assessment in graduate programs based on final exams consisting of mathematized exercises with no conceptual questions or essays.

This fulfils the aim of mastering the demanding nomenclature of the domain. Given the slow progress in the field in recent decades this might be a good alternative, but there are also pedagogical reasons.

This form of assessment is extreme and outdated. It has further disadvantages:

** Students do not develop critical research skills such as literature analysis and research.

** Certain skills needed by future researchers are not tested, e.g. the ability to combine research from different sources, to think critically about competing theses or theories, and to discern gaps in current research.

** A mixed approach should ensure all aims are met.