Science topic

Uncertainty - Science topic

Uncertainty is the condition in which reasonable knowledge regarding risks, benefits, or the future is not available.
Questions related to Uncertainty
  • asked a question related to Uncertainty
Question
3 answers
Advancing Supplier Selection: Evaluating Fuzzy MCDM vs. Quantum-Inspired Optimization in High-Dimensional, Uncertain Decision Spaces
Supplier selection remains a highly complex multi-criteria decision-making (MCDM) challenge, particularly in environments characterized by uncertainty, incomplete information, and dynamic market fluctuations. Traditional fuzzy MCDM methods (e.g., Fuzzy AHP, Fuzzy TOPSIS, Fuzzy ITARA) have been widely adopted for their ability to handle linguistic imprecision and expert-driven evaluations, making them well-suited for procurement decision-making. However, the advent of quantum-inspired optimization (QIO) algorithms, which exploit principles of quantum superposition, entanglement, and probabilistic heuristics, presents an alternative paradigm that claims superior efficiency in solving high-dimensional combinatorial optimization problems.
This raises several critical academic and methodological challenges that remain underexplored:
1. Computational Complexity vs. Decision Interpretability
While QIOs claim exponential speedup in solving large-scale supplier selection problems, their lack of explainability and interpretability could hinder practical adoption in procurement. To what extent can QIOs outperform traditional fuzzy MCDM methods in real-world procurement applications, considering the trade-offs between computational complexity, solution quality, and decision transparency?
2. Uncertainty Representation in Supplier Selection Models
Fuzzy logic excels at modeling qualitative uncertainty and linguistic vagueness, whereas quantum-inspired approaches rely on probabilistic distributions and non-classical optimization heuristics. Can QIOs effectively model vague, preference-driven supplier evaluation criteria, or do they require hybridization with fuzzy-based uncertainty modeling to enhance robustness?
3. Empirical Benchmarking and Industrial Feasibility
Despite theoretical claims, empirical studies comparing QIOs with fuzzy MCDM methods in procurement remain scarce. What are the empirical performance benchmarks in terms of solution convergence, computational efficiency, and procurement cost optimization, particularly in real-world datasets? Given the limitations of Noisy Intermediate-Scale Quantum (NISQ) computing, how practical is QIO for procurement optimization, and what are the short-term and long-term adoption barriers?
4. Hybrid Fuzzy-Quantum Decision Frameworks
Could a hybrid fuzzy-quantum model bridge the interpretability-efficiency gap by leveraging the strengths of fuzzy logic in linguistic modeling while incorporating the computational advantages of quantum-inspired heuristics? What hybrid methodologies could be explored to enhance procurement decision robustness in dynamic, multi-stakeholder supply chain environments?
Given these open research questions, I invite scholars and practitioners specializing in quantum computing, fuzzy logic, procurement science, uncertainty modeling, and AI-driven supply chain optimization to share insights, empirical findings, and theoretical advancements on:
  • The fundamental trade-offs between fuzzy logic-based MCDM and QIO in procurement decision-making.
  • Comparative studies or case studies benchmarking these methodologies in real-world supplier selection problems.
  • The feasibility of integrating QIOs into procurement decision systems given current technological constraints.
  • Hybrid approaches that combine fuzzy MCDM with quantum-inspired heuristics for superior decision-making under uncertainty.
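For readers less familiar with the fuzzy MCDM side of this comparison, here is a minimal, illustrative sketch of a fuzzy TOPSIS ranking; the linguistic scale, criterion weights, and supplier ratings below are invented assumptions, not data from any study:

# Minimal fuzzy TOPSIS sketch (illustrative only; supplier ratings, criteria,
# and the linguistic-to-triangular-fuzzy-number mapping are assumptions).
import numpy as np

# Linguistic ratings mapped to triangular fuzzy numbers (l, m, u) on a 0-10 scale.
SCALE = {"poor": (0, 1, 3), "fair": (3, 5, 7), "good": (7, 9, 10)}

# Rows: suppliers, columns: benefit criteria (e.g. quality, delivery, flexibility).
ratings = [["good", "fair", "good"],
           ["fair", "good", "poor"],
           ["good", "good", "fair"]]
weights = np.array([0.5, 0.3, 0.2])          # criterion weights (assumed)

X = np.array([[SCALE[r] for r in row] for row in ratings], dtype=float)  # (m, n, 3)

# Normalize benefit criteria by the column-wise maximum upper bound.
u_max = X[:, :, 2].max(axis=0)
R = X / u_max[None, :, None]

# Apply crisp weights to each fuzzy rating.
V = R * weights[None, :, None]

# Fuzzy positive/negative ideal solutions and vertex distances.
fpis = V.max(axis=0)
fnis = V.min(axis=0)

def vertex_dist(a, b):
    # vertex distance between triangular fuzzy numbers
    return np.sqrt(((a - b) ** 2).mean(axis=-1))

d_plus = vertex_dist(V, fpis[None]).sum(axis=1)
d_minus = vertex_dist(V, fnis[None]).sum(axis=1)
cc = d_minus / (d_plus + d_minus)            # closeness coefficient, higher is better

for i, score in enumerate(cc, start=1):
    print(f"Supplier {i}: closeness coefficient = {score:.3f}")

The closeness coefficient simply ranks suppliers by how close their weighted fuzzy ratings sit to the fuzzy ideal; a quantum-inspired optimizer would instead search a combinatorial space of supplier portfolios, which is where the interpretability trade-off raised above arises.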
Relevant answer
Answer
Dear Doctor
Go To
A Quantum-inspired Ant Colony Optimization for solving a sustainable four-dimensional traveling salesman problem under type-2 fuzzy variable
  • January 2023
  • Advanced Engineering Informatics 55(9):101816
  • DOI: 10.1016/j.aei.2022.101816
  • Madhushree Das, Arindam Roy, Samir Maity, Samarjit Kar
Abstract: In this paper, a Quantum-inspired Ant Colony Optimization (Qi-ACO) is proposed to solve a sustainable four-dimensional traveling salesman problem (4DTSP). In the 4DTSP, various paths with different numbers of conveyances are available for travel between any two cities. In this model, a sustainable 4DTSP is considered with emission as a constraint. Since travel costs and emissions are uncertain/imprecise in nature, type-2 fuzzy variables are considered here. Sustainable development in the traveling salesman problem (TSP) sector can be divided into two major sections: economic and environmental. Sustainable TSP development requires balancing these two sectors to achieve the maximum benefits. Increasing sustainability in transportation requires strategies such as improving route and vehicle selection, routing plans, and vehicle speed. The novelties of the proposed Qi-ACO algorithm are (i) qubits generated based on the amount of emission of the vehicle as well as the travel cost between two cities, (ii) pheromone initialized and updated depending on the qubits, and (iii) a quantum-inspired technique that makes computation fast. The proposed sustainable 4DTSP is illustrated with numerical data. The defuzzification of the type-2 fuzzy variables is based on the critical value (CV) method. The supremacy of the proposed method is established through statistical tests. The proposed algorithm and its modified form can easily be adapted to ship routing, supply chain problems, and other fields.
  • asked a question related to Uncertainty
Question
1 answer
Climate Extremes
A wide variety of natural weather and climate extremes would still occur even in the absence of anthropogenic changes in climate, since many extremes remain the result of natural climate variability, and natural decadal or multi-decadal variations in the climate provide the backdrop for anthropogenic climate change. Given this, what is the physical basis for attributing changes in the frequency, intensity, spatial extent, duration and timing of weather and climate extremes, and in turn the resulting unprecedented extremes, only to 'anthropogenic changes in climate'?
Are we scientifically 'clear and distinct' about the exact factors that influence the observed changes, and in turn the projected changes, in climate, if changes in extremes of a climate or weather variable are not always related in a simple way to changes in the mean of that variable (and given the uncertainty in the natural variability of climate, uncertainties in climate model parameters and structure, and uncertainty in projections of future emissions)?
Suresh Kumar Govindarajan, Professor [HAG]
IIT Madras, 24-Dec-2024
Relevant answer
Answer
It appears the international forum was established in vain for this type of question.
ADVICE: Google first before going to ResearchGate.
  • asked a question related to Uncertainty
Question
1 answer
I recently added an AI-generated summary to my article "Intra-industry diversification effects under firm-specific contingencies on the demand side" (2020, Florian Smeritschnig, Jakob Muellner, Phillip C. Nell, Martin Weiss) by including it as a supplementary resource. I did this to give those who do not have open access to the full text a brief understanding of the main points of my article.
The problem is that the title is still shown on the article's subpage under "Linked data", but there is actually no file available and no button to delete it. Could you please delete the title?
The second question is whether it's possible to upload an additional file - e.g. the AI-generated summary of the article, but not the public or private full text - to the same article page, without it appearing as a completely separate entry in the list of my other research items. For example, I have the paper "Electoral uncertainty and the multinational corporation a conceptualization, firm-level effects and strategies" (2024) by Puck, Muellner & Reinprecht. I added an AI-generated summary to this publication as a supplementary resource, and this summary now appears among my other research items, along with the original article itself. I don't want two entries with the same title, as it creates unnecessary confusion, but I do want the AI-generated summary to appear only within the original article's entry. Is there a way to do this?
Best,
Jakob
Relevant answer
Answer
Please note that you wrote to the ResearchGate community, not to the RG team. The following help pages may be useful:
"How do I edit my research item's details?" in https://help.researchgate.net/hc/en-us/articles/14293081125777-Reviewing-editing-and-featuring-your-research (you may edit entries of the type "Data" the same way as publication entries)
You may add more than one "full-text" to a research item.
  • asked a question related to Uncertainty
Question
2 answers
How can large-scale phylogenomic datasets be improved to reduce uncertainty in tree resolution?
Relevant answer
Answer
Hi David,
I haven't run many whole-genome phylogenies, but the following may be of some help:
Errors in genomic phylogenies may be caused by alignment mistakes, introgression/incomplete lineage sorting (ILS) affecting some genes, and the search methods used for phylogenetic reconstruction.
In the first case, it may be wise to apply strong trimming to the sequence alignment, or to use alignment-free methods. I have heard of an approach developed at Zhejiang University - a phylogenetic strategy that builds trees from clusters of gene blocks rather than from whole sequences. Otherwise, you may map your dataset to a reference genome, extract the SNP sites of each gene, and use sequences constructed directly from the SNPs for the tree search. In that way you also lighten the computational burden, because each chromosome can be reduced to fewer than about 30,000 bp of SNP sites before tree reconstruction.
In the second case, running coalescent-based phylogenies, or keeping only highly conserved regions, can be a good idea. You can select those CDS regions in advance by analysing reference genomes and use RNA-seq to extract them.
In the third case, apply rapid and accurate tree-search methods, or switch to more advanced hardware, so that you can run more search replicates to find the best tree.
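As a small, hedged illustration of quantifying how much uncertainty remains in tree resolution, the sketch below summarizes node-support values with Biopython (the filename species_tree.nwk and the support threshold of 70 are assumptions):

# Minimal sketch: summarize node-support values in an existing tree.
# Assumes "species_tree.nwk" is a Newick file whose internal nodes carry
# bootstrap or posterior support values; filename and threshold are assumptions.
from Bio import Phylo
import numpy as np

tree = Phylo.read("species_tree.nwk", "newick")
support = [c.confidence for c in tree.get_nonterminals() if c.confidence is not None]
support = np.array(support, dtype=float)

print(f"internal nodes with support values: {support.size}")
print(f"mean support: {support.mean():.1f}")
print(f"poorly supported nodes (<70): {(support < 70).sum()}")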
  • asked a question related to Uncertainty
Question
1 answer
Gold prices declined in November 2024 due to several economic and geopolitical factors. A significant driver was the strength of the U.S. dollar, which made gold more expensive for international investors holding other currencies. Additionally, investors anticipated a potential interest rate cut by the Federal Reserve, which created a sense of uncertainty in the gold market.
This trend was also influenced by caution among investors ahead of the U.S. presidential elections, as some temporarily shifted away from gold as a safe-haven asset. Moreover, high U.S. bond yields continued to add pressure on gold prices by providing investors with alternative returns.
In summary, the primary reasons behind gold's price decline in November included the strong dollar, possible changes in interest rates, and shifts in investor sentiment amidst global economic stability
Relevant answer
Answer
The main reason for the fall in the price of gold at the end of 2024 was the election of President Trump, which led to a rise in the value of the dollar as investors anticipated US economic growth, given Trump's aggressive and protectionist economic policy, but also a rise in interest rates via the Fed's inflation-control policy. There is therefore an arbitrage opportunity in terms of yield for aggressive or moderate investors, who will tend to abandon gold in the short term in favour of these other promising niches.
  • asked a question related to Uncertainty
Question
4 answers
Aviation MRO handles both planned and unscheduled aircraft maintenance. An independent MRO often relies on information coming from its clients or customers, consisting of maintenance requests, parts demands, etc. The MRO needs to ensure that aircraft spare parts are always available whenever needed, while still evaluating the inventory to avoid any understock or overstock situation. Forecasting spare-parts demand is the common method used; however, there is still room for improvement due to several constraints, such as limited access to data, collaboration complexity, and demand uncertainty.
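Since spare-parts demand is typically intermittent, one commonly cited forecasting baseline is Croston's method; below is a minimal, illustrative implementation (the demand history and the smoothing constant are invented):

# Minimal sketch of Croston's method for intermittent spare-parts demand.
# The demand history is invented; alpha is a typical smoothing constant,
# not a recommended value.
import numpy as np

def croston(demand, alpha=0.1):
    """Return a one-step-ahead demand-rate forecast for an intermittent series."""
    demand = np.asarray(demand, dtype=float)
    z = None      # smoothed demand size
    p = None      # smoothed inter-demand interval
    q = 1         # periods since last non-zero demand
    for d in demand:
        if d > 0:
            if z is None:           # initialize on first non-zero demand
                z, p = d, q
            else:
                z = z + alpha * (d - z)
                p = p + alpha * (q - p)
            q = 1
        else:
            q += 1
    return z / p if z is not None else 0.0

history = [0, 0, 5, 0, 0, 0, 3, 0, 4, 0, 0, 6]   # units demanded per period
print(f"forecast demand per period: {croston(history):.2f}")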
Relevant answer
Thank you, Dave Price.
As a component MRO trader, I believe that if the MRO has access to customer data, including fleet sizes (both present and future), operational data (to examine seasonal factors and utilisation), and other data, they can optimise their inventory stocks. What do you think about this?
  • asked a question related to Uncertainty
Question
14 answers
In a paper I read:
"In addition to the point estimate, a confidence interval is commonly used to account for the uncertainty in the ML estimator."
Would the scholars attending the Research Gate agree or comment on that?
Thank you
MS
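For concreteness only (not an endorsement of the quoted interpretation), here is a minimal sketch of the kind of interval the paper refers to - a Wald-type confidence interval built from the observed Fisher information of a maximum-likelihood estimate, using simulated exponential data:

# Minimal sketch: Wald-type 95% CI for the MLE of an exponential rate.
# Data are simulated; this only illustrates the practice quoted above.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200)   # true rate = 0.5

n = x.size
rate_hat = 1.0 / x.mean()                  # MLE of the rate
se = rate_hat / np.sqrt(n)                 # from Fisher information I(rate) = n / rate^2
ci = (rate_hat - 1.96 * se, rate_hat + 1.96 * se)
print(f"MLE = {rate_hat:.3f}, 95% Wald CI = ({ci[0]:.3f}, {ci[1]:.3f})")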
Relevant answer
Answer
@ Pablo Constantino
You say:
  • This interval gives a likely range where the true value lies, acknowledging that the estimator fluctuates across different samples.
It is wrong.
  • asked a question related to Uncertainty
Question
3 answers
What personal psychological characteristics does intolerance of uncertainty affect?
An analysis of strategies where intolerance of uncertainty is beneficial for the individual.
I appreciate any contributions and ideas for my study.
Thanks for your answers.
Relevant answer
Thank you very much for your answer. My research studies this construct, and I would be interested in any information about what you mention. I have no background in Buddhist psychology; I will look into the topic and would be grateful for anything you can point me to in order to broaden my knowledge.
Thanks again.
Kind regards.
  • asked a question related to Uncertainty
Question
2 answers
Dear all, When modeling uncertainties that affect measurements (e.g., a continuous variable), how can we also account for the uncertainty associated with the Gold Standard (e.g., reference measurements designed by experts) for the item being measured?
Thank you so much for your help and for fetching some reference papers/books
Cordially,
Hubert
Relevant answer
Answer
Thank you very much Tom for your nice answer and these very interesting references.
To clarify a bit.
I designed a Gold Standard by asking experts to measure a set of N tumors. I therefore obtained the mean value of the Gold Standard and its associated variability.
On the other hand, I ran multiple algorithms to measure the same set of N tumors to obtain an estimate of the mean value of each tumor along with the associated variability of these measurements.
My final aim is to compute a (linear) transformation able to map a measurement (with its previously estimated CI) to an estimate of the truth with an associated CI.
I hope this helps and thank you again,
Cordially,
Hubert
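Since both the algorithm measurements and the Gold Standard carry error, one standard option for estimating such a linear mapping is errors-in-variables (Deming) regression. The sketch below is illustrative only: the data are simulated, and the error-variance ratio delta is assumed known (in practice it would come from the estimated variabilities described above):

# Minimal sketch: Deming (errors-in-variables) regression mapping algorithm
# measurements x to gold-standard values y, with a bootstrap CI on the slope.
# Data are simulated; delta = var(error in y) / var(error in x) is assumed known.
import numpy as np

rng = np.random.default_rng(1)
truth = rng.uniform(10, 50, size=60)              # true tumor sizes (simulated)
x = truth + rng.normal(0, 2.0, truth.size)        # algorithm measurement
y = truth + rng.normal(0, 1.0, truth.size)        # expert gold standard

def deming(x, y, delta):
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2
             + 4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

delta = (1.0 / 2.0) ** 2                          # assumed error-variance ratio
slope, intercept = deming(x, y, delta)

# Bootstrap CI for the slope.
boot = []
for _ in range(2000):
    idx = rng.integers(0, x.size, x.size)
    boot.append(deming(x[idx], y[idx], delta)[0])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"y ~ {intercept:.2f} + {slope:.3f} x, slope 95% CI = ({lo:.3f}, {hi:.3f})")

The fitted line, together with its bootstrap interval, can then be applied to a new measurement (with its own CI) to obtain an estimate of the truth with an associated CI.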
  • asked a question related to Uncertainty
Question
3 answers
The determination of our destiny depends on the decisions we make at each moment, which in a sense are conditioned by the Uncertainty Principle.
So... the determination of our destiny depends on the decisions we make at any given moment, which in a certain sense are conditioned by the Uncertainty Principle because God does love to play dice.
In order to determine the state of our situation we must take into account that we have many alternatives, which create several possibilities for our future, precisely because of that Uncertainty Principle.
But the fact of the many possibilities should not lead us to think of “other universes” but of the alternatives we have at every moment to create or destroy...our own universe.
This is one of the reasons we have as mankind to make a real paradigm shift in all fields of our existence.
Edgar
Relevant answer
Answer
Dear all,
We know well where the Uncertainty Principle comes from. Few of us know why it is brilliant. Projected onto our dimension, in principle we should not be talking about the Uncertainty Principle at all. In our dimension, taking into account the relevant knowledge, adaptation, behavioural form or other factors, it is not necessary to apply this principle.
So, by striving for sincere, pure knowledge and behavior that brings harmony to the individual and at the same time to society, our existence will, in principle, not depend on any Uncertainty Principle.
Regards,
Laszlo
  • asked a question related to Uncertainty
Question
2 answers
In general, the greater the environmental uncertainty, the more attention management in organizations must direct towards the external environment.
Relevant answer
Answer
Dear Taakea Tebotua,
Environmental uncertainty refers to the degree of unpredictability of changes in an organization's environment that may affect its operations. Two key dimensions of environmental uncertainty are complexity and volatility. Complexity refers to the number of factors affecting an organization and their interrelationships, while volatility refers to the speed and unpredictability of changes in these factors.
In the context of corporate social responsibility (CSR), a company's implementation of climate and environmental goals directly improves its image in the eyes of customers and clients. A company that emphasizes sustainability in its strategy becomes more attractive to consumers, who increasingly prefer products and services from companies that care about the environment. Communicating that a company is “green” and promoting its environmental efforts not only builds its reputation, but also attracts investors and contractors interested in working with responsible partners.
As part of the plan for the green transformation of the economy, the company's implementation of measures to reduce CO2 emissions and adapt to new technologies related to renewable energy sources is becoming a necessity. Sustainability in this context contributes to saving the planet's climate and biosphere, which further strengthens the company's positive image in the eyes of society. The correlation between running a responsible business and its market success is based on the synergy between environmental activities and the long-term sustainability and competitiveness of the company. This also shows that companies contributing to the energy transition, including the change to greener processes, become leaders in their industry and influence the creation of standards in sustainable business.
I pointed out various aspects of this important issue for the future of the planet, the future of the planet's climate and biosphere, and for the future of future generations of people in my article:
IMPLEMENTATION OF THE PRINCIPLES OF SUSTAINABLE ECONOMY DEVELOPMENT AS A KEY ELEMENT OF THE PRO-ECOLOGICAL TRANSFORMATION OF THE ECONOMY TOWARDS GREEN ECONOMY AND CIRCULAR ECONOMY
I invite you to join me in scientific cooperation,
Kind regards,
Dariusz Prokopowicz
  • asked a question related to Uncertainty
Question
6 answers
How does uncertainty differ from model performance evaluation using statistical metrics such as MSE, RMSE, MAE, MAPE, and R² in the field of real estate appraisal?
Relevant answer
Answer
ALL of the measures cited are just mathematical calculations that indicate a particular kind of variation, or uncertainty, in a data set. There is no specific rule on which one you should use apart from the constraints (e.g. linearity) of some of the methods.
Uncertainty in extrapolations grows quickly with future time. These calculations may be part of the assessment of a real estate appraisal methodology, but they are not directly comparable to the complete "Model Performance Evaluation" (unless you define it as the sole criterion).
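To make the distinction concrete, the sketch below computes the metrics named in the question and, separately, a simple 95% prediction interval derived from the residual spread; the metrics summarize average error over a data set, while the interval expresses uncertainty about an individual prediction (all numbers are synthetic, and roughly normal residuals are assumed):

# Minimal sketch: error metrics (MSE, RMSE, MAE, MAPE, R^2) vs. a simple
# prediction interval. Data are synthetic; normal residuals are assumed.
import numpy as np

rng = np.random.default_rng(2)
y_true = rng.uniform(100_000, 500_000, size=200)           # "observed" prices
y_pred = y_true * (1 + rng.normal(0, 0.08, y_true.size))   # model predictions

resid = y_true - y_pred
mse = np.mean(resid ** 2)
rmse = np.sqrt(mse)
mae = np.mean(np.abs(resid))
mape = np.mean(np.abs(resid / y_true)) * 100
r2 = 1 - np.sum(resid ** 2) / np.sum((y_true - y_true.mean()) ** 2)
print(f"MSE={mse:.0f}  RMSE={rmse:.0f}  MAE={mae:.0f}  MAPE={mape:.1f}%  R2={r2:.3f}")

# A crude 95% prediction interval for a new appraisal: point prediction
# +/- 1.96 * residual standard deviation.
sigma = resid.std(ddof=1)
new_pred = 300_000.0
print(f"prediction {new_pred:.0f}, 95% PI = "
      f"({new_pred - 1.96 * sigma:.0f}, {new_pred + 1.96 * sigma:.0f})")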
  • asked a question related to Uncertainty
Question
4 answers
In this rapidly changing world, where cause and consequence are intertwined through complex psychological layers, effective decision-making becomes crucial. The fear of career uncertainties, lack of intellectual confidence, or facing immense barriers affects educational, academic, or professional choices. When certain principles are applied, crucial decisions may result in better outcomes. Since decisions are made under the influence of objective and subjective perspectives, their sufficiency and reliability can greatly vary. There are multiple frameworks used in effective decision-making. One of them is the VIPS framework: Values, Interests, Personality, and Skills. In your opinion, how can this framework be used to foster effective decision-making?
Relevant answer
Answer
Thank you, Stephen I. Ternyik.
  • asked a question related to Uncertainty
Question
3 answers
2) How did the universe form?
The universe, at its most fundamental level, appears to operate according to the principles of quantum mechanics, where uncertainty and indeterminacy play key roles in shaping its evolution. In classical computational theory, Turing’s Halting Problem demonstrates that it is impossible to predict whether a system will reach a final state or run indefinitely. This raises profound questions about the nature of the universe: could it, too, one day halt, reaching a state where no further evolution is possible? However, the inherent unpredictability of quantum mechanics—through phenomena like superposition, quantum fluctuations, and entanglement—may offer a safeguard against such a scenario. This paper explores the intersection of quantum mechanics and the Halting Problem, suggesting that quantum uncertainty prevents the universe from settling into a static, final state. By continuously introducing randomness and variation into the fabric of reality, quantum processes ensure the universe remains in perpetual motion, avoiding a halting condition. We will examine the scientific and philosophical implications of this theory and its potential to reshape our understanding of cosmology.
Relevant answer
Answer
The evolution of the universe, from the inflationary epoch onwards, is described by classical, not quantum, gravity.
  • asked a question related to Uncertainty
Question
3 answers
How do new uncertainty relations impact our understanding of quantum mechanics?
Relevant answer
Answer
Dear Prof. Alwielland Q. Bello, adding a few sentences after the previous right answer, uncertainty relations are a powerful tool in the elastic scattering analysis in non-relativistic quantum mechanics in cases where the difficult task of a self-consistent field prevails.
To show a concrete example, low-energy excitations in unconventional superconductors have an imaginary term, where this imaginary part involves an independent scattering lifetime due to dressed fermion quasiparticles exposed to an atomic potential (due to impurities or stoichiometric because both cases produce disorder).
This represents a certain type of non-magnetic disorder in compounds such as strontium ruthenate or High Tc superconductors with strontium.
The use of the uncertainty relation in this case allows a simple and elegant evaluation of the non-equilibrium kinetic coefficients at very low frequencies in the so-called universal limit.
Please check the following work and references therein to see who was the first researcher to propose this elegant technique.
Equations 10, 11, 12, 13, and 14 serve as an example.
Best Regards.
  • asked a question related to Uncertainty
Question
30 answers
Paradox 1 - The Laws of Physics Invalidate Themselves, When They Enter the Singularity Controlled by Themselves.
Paradox 2 - The Collapse of Matter Caused by the Law of Gravity Will Eventually Destroy the Law of Gravity.
The laws of physics dominate the structure and behavior of matter. Different levels of material structure correspond to different laws of physics. According to reductionism, when the structure of matter is reduced, the corresponding laws of physics are reduced as well. Different levels of physical laws correspond to different physical equations, many of which have singularities. Higher-level equations may enter singularities when forced by strong external conditions (pressure, temperature, etc.), resulting in phase transitions in which, for example, lattice and magnetic properties are destroyed. Essentially, the higher-level physics equations have failed and given way to lower-level physics equations. Obviously there should exist a lowest-level physics equation which cannot be reduced further; it would be the last line of defense after all the higher-level equations have failed, and it is not allowed to enter the singularity. This equation is the ultimate equation. The equation corresponding to the Hawking-Penrose spacetime singularity [1] should be such an equation.
We can think of the physical equations as a description of a dynamical system because they are all direct or indirect expressions of energy-momentum quantities, and we have no evidence that it is possible to completely detach any physical parameter, macroscopic or microscopic, from the Lagrangian and Hamiltonian.
Gravitational collapse causes black holes, which have singularities [2]. What characterizes a singularity? Any finite parameter before entering a spacetime singularity becomes infinite after entering the singularity. Information becomes infinite, energy-momentum becomes infinite, yet all material properties disappear completely. A dynamical equation transitioning from finite to infinite is impossible, because there is no infinite source of dynamics, and the Uncertainty Principle would also prevent this singularity from being reached*. Therefore, while there must be a singularity according to the singularity theorem, this singularity must be inaccessible, or will not be entered. Before entering this singularity, a sufficiently long period of time must elapse, waiting for the conditions that would destroy it, such as the collision of two black holes.
"Most of these singularities, however, can usually be resolved by pointing out that the equations are missing some factor, or noting the physical impossibility of ever reaching the singularity point. In other words, they are probably not 'real'." [3] We believe this statement is correct. Nature will not destroy by itself the causality it has established.
-----------------------------------------------
Notes
* According to the uncertainty principle, finite energy and momentum cannot be concentrated at a single point in space-time.
-----------------------------------------------
References
[1] Hawking, S. (1966). "Singularities and the geometry of spacetime." The European Physical Journal H 39(4): 413-503.
[2] Hawking, S. W. and R. Penrose (1970). "The singularities of gravitational collapse and cosmology." Proceedings of the Royal Society of London. A. Mathematical and Physical Sciences 314(1519): 529-548.
==================================================
Addendum 2023-1-14
Structural Logic Paradox
Russell once wrote a letter to Ludwig Wittgenstein while visiting China (1920 - 1921) in which he said "I am living in a Chinese house built around a courtyard *......" [1]. The phrase would probably mean to a Western reader, "I live in a house built around the back of a yard." Russell was a logician, but there is clearly a logical problem with this expression, since the yard is determined by the house built around it, not vice versa. The same expression is reflected in a very famous poem, "A Moonlit Night On The Spring River", from the Tang Dynasty (618-907 AD) in China. One of the lines is: "We do not know tonight for whom she sheds her ray, But hear the river say to its water adieu." The problem here is that the river exists because of the water, and without the water there would be no river. Therefore, there is no logic in the river saying goodbye to its water. There are, I believe, many more examples of this kind, and perhaps we can reduce these problems to a structural logic paradox †.
Ignoring the above logical problems will not have any effect on literature, but it should become a serious issue in physics. The biggest obstacle in current physics is that we do not know the structure of elementary particles and black holes. Renormalization is an effective technique, but it offers an alternative result that masks the internal structure and can only be considered a stopgap tool. Hawking and Penrose proved the Singularity Theorem, but no clear view has been developed on how to treat singularities. It seems to us that this scenario poses the same problem as the structural logic described above. Without black holes (and perhaps elementary particles) there would be no singularities, and (virtual) singularities accompany black holes. Since there is a black hole and there is a singularity, how can a black hole that does not collapse today because of the singularity collapse tomorrow because of the same singularity? Do yards make houses disappear? Does a river make water disappear? This is the realistic explanation of the "paradox" in the subtitle of this question. The laws of physics do not destroy themselves.
-------------------------------------------------
Notes
* One of the typical architectural patterns in Beijing, China, is the "quadrangle", which is usually a square open space with houses built along the perimeter, and when the houses are built, a courtyard is formed in the center. Thus, before the houses were built, it was the field, not the courtyard. The yard must have been formed after the house was built, even though that center open space did not substantially change before or after the building, but the concept changed.
† I hope some logician or philosopher will point out the impropriety.
-------------------------------------------------
References
[1] Monk, R. (1990). Ludwig Wittgenstein: the duty of genius. London: J. Cape. Morgan, G. (Chinese version @2011)
Relevant answer
Answer
Agree. It is a math problem, not a real problem in the universe. Anything infinite destroys all conservation laws in the universe. The center of a black hole should be totally hollow instead of a singularity, because the angular momentum has zero probability of being exactly zero. When any matter has angular momentum, it cannot settle at a single point.
  • asked a question related to Uncertainty
Question
1 answer
In discrete systems, such as Markov chains, Shannon entropy can be used to explain the uncertainty and complexity of the system. In continuous systems, such as pure jump Markov processes, does the corresponding differential entropy have a clear physical meaning? If so, how can differential entropy be used to interpret continuous systems?
Relevant answer
Answer
Yes, differential entropy measures the uncertainty or randomness in a continuous probability distribution. It has physical meanings in:
  1. Information Theory: Quantifies average uncertainty or information per unit of the variable.
  2. Thermodynamics: Related to thermodynamic entropy, which measures disorder in physical systems.
  3. Signal Processing: Indicates the bandwidth or information content of signals.
  4. Statistical Mechanics: Connects to the volume of phase space occupied by a system.
Yes, differential entropy in continuous systems, such as pure jump Markov processes, does have physical and interpretative significance, though it differs from Shannon entropy in discrete systems: for example, it can be negative and it changes under a rescaling of the variable.
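As a small illustration of the difference (hypothetical numbers, not tied to any particular system), the sketch below computes the Shannon entropy of one row of a discrete transition matrix and the differential entropy of the exponential holding time of a pure jump process:

# Minimal sketch: Shannon entropy of a discrete transition distribution vs.
# differential entropy of an exponential holding time. Numbers are illustrative.
import numpy as np

# Discrete case: one row of a Markov chain transition matrix.
p = np.array([0.5, 0.3, 0.2])
shannon = -np.sum(p * np.log(p))          # in nats
print(f"Shannon entropy of transition row: {shannon:.3f} nats")

# Continuous case: holding time ~ Exponential(rate), density f(t) = rate*exp(-rate*t).
rate = 2.0
diff_entropy = 1.0 - np.log(rate)         # analytic differential entropy in nats
print(f"Differential entropy of Exp({rate}) holding time: {diff_entropy:.3f} nats")

# Unlike Shannon entropy, differential entropy can be negative (here, whenever
# rate > e) and it changes if time is rescaled.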
  • asked a question related to Uncertainty
Question
3 answers
Forecasting inherently involves uncertainty, which arises from various sources such as model limitations, data inaccuracies, and unpredictable environmental factors. My question, "Can we forecast uncertainty in predictions?" seeks to explore whether it is possible to quantify and anticipate the degree of uncertainty within a forecast.
Relevant answer
Answer
In my view, the question may have a two-pronged response. It can be about a time series forecast (I tend to call a forecast) or a regression-based forecast (I casually call a prediction). The calculation of confidence limits of a predicted value in regression is widely available and is based on the variance of residuals - the differences between the observed values and the predicted values. This can be considered, using your terminology, as "uncertainty of the predicted value". Usually, we do not predict outside the limits of our independent variable.
But with time series, this is a little more complicated. Our main purpose is to forecast outside the time period for which we have values. As we forecast values further and further away from the known time period, our confidence in the forecast further reduces, i.e. the uncertainty grows. To my knowledge, there is no single method for such calculations. In practice, I could not muster much confidence in such confidence intervals. A crude way to calculate such an interval is based on the standard deviation of the residuals for the known period. Then this can be adjusted by the forecast horizon.
To directly answer your question as a pragmatist, yes, there are tools to "forecast the uncertainty in predictions", in my experience, with some 'uncertainty'! Hope this helps.
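As a rough illustration of the "crude way" described above (the series is simulated, and the sqrt(h) widening rule is one common heuristic for random-walk-like errors, not a universal method):

# Minimal sketch: naive forecast with an interval that widens with horizon.
# The series is simulated; the sqrt(h) widening assumes random-walk-like errors.
import numpy as np

rng = np.random.default_rng(3)
series = np.cumsum(rng.normal(0.2, 1.0, size=120))   # simulated monthly series

# One-step-ahead naive forecasts over the known period, and their residuals.
resid = series[1:] - series[:-1]
sigma = resid.std(ddof=1)

last = series[-1]
for h in range(1, 7):                                # forecast 1..6 steps ahead
    half_width = 1.96 * sigma * np.sqrt(h)           # interval grows with horizon
    print(f"h={h}: forecast={last:.2f}, 95% interval=({last - half_width:.2f}, "
          f"{last + half_width:.2f})")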
  • asked a question related to Uncertainty
Question
5 answers
In QM we have learned the famous Uncertainty Principle, which is probably the most important thing in this branch.
We have also learned that space and time stay together as spacetime in GR.
The measurement problem in QM comes from the Uncertainty Principle and vice versa; why is nothing analogous present in GR, perhaps not in the same form but as an analogue?
Thanks
Relevant answer
Dear Florian Millo
You may care to read my just published article 'How Come the Quantum? A Deeper Principle Behind Quantization', in IJQF (International Journal of Quantum Foundations), also uploaded onto Research Gate. In this article I (arguably) derive quantization from a deeper underlying principle. David Bohm wrote that the "transfer of a quantum is one of the basic events in the universe and cannot be described in terms of other processes". I've described it in terms of other processes. It turns out that the proposed answer (or rather, the theory put forward constituting the answer) in turn suggests answers to many of the interpretative problems of quantum mechanics, including the measurement problem.
Regards,
Mark Kristian van der Pals
  • asked a question related to Uncertainty
Question
6 answers
I am wondering if anyone has come across calibration curves and uncertainty data for the PM2.5, SO2, O3, and NO2 monitors maintained and operated by the CPCB?
Please share link herein for the same if you are aware.
Best regards,
Prashant
Relevant answer
Answer
To access documents containing calibration curves and uncertainty data for PM2.5, SO2, O3, and NO2 monitors as regulated by the Central Pollution Control Board (CPCB) of India, you can follow these steps:
1. CPCB Official Website
The Central Pollution Control Board's official website often contains guidelines, reports, and technical documents related to air quality monitoring. Here's how you can navigate their site:
  • Visit the CPCB Website: Go to CPCB's official website.
  • Search for Relevant Documents: Use the search bar or navigate through sections like 'Publications', 'Reports', and 'Guidelines'.
  • Technical Reports and Guidelines: Look for technical reports and guidelines that might contain calibration curves and uncertainty information. These documents are typically found under the 'Air Quality' section.
2. Contact CPCB Directly
If you cannot find the specific documents on their website, you can contact CPCB directly:
  • Email: Send an email to the CPCB (you can find contact details on their website under 'Contact Us').
  • Phone: Call their office for direct assistance.
  • Request for Information: Formally request the specific documents you need, mentioning calibration curves and uncertainties for PM2.5, SO2, O3, and NO2 monitors.
3. Research Publications and Journals
Many calibration and uncertainty studies are published in scientific journals. You can search for these publications through academic databases such as:
  • Google Scholar: Use search terms like "CPCB calibration curve PM2.5", "CPCB SO2 monitor uncertainty", etc.
  • ResearchGate: Access papers and publications from researchers who might have worked on relevant studies.
  • PubMed and IEEE Xplore: These databases also contain environmental and engineering research articles.
4. National Air Quality Monitoring Program (NAMP) Reports
The NAMP, under the CPCB, releases periodic reports which might include detailed calibration and uncertainty data for air quality monitors:
  • NAMP Reports: These reports can be found on the CPCB website under 'Air Quality Monitoring' or similar sections.
  • Annual and Quarterly Reports: Look for specific sections that discuss the methodologies used for calibration and data quality.
5. Environmental Standards and Guidelines
Refer to the environmental standards and guidelines set by CPCB, which might include calibration protocols:
  • Environmental Standards: These documents are typically listed under the 'Standards' section of the CPCB website.
  • Technical Manuals: Manuals and guidelines for monitoring equipment often contain calibration and uncertainty details.
Example Searches on CPCB Website
  1. CPCB Calibration Curve PM2.5: Use this phrase to search for specific calibration details.
  2. Uncertainty Data for SO2 Monitor CPCB: This can help locate documents focusing on uncertainty measurements.
Specific Documents and Sections
  • Air Quality Monitoring Guidelines: These guidelines often contain detailed methodologies.
  • Technical Reports on Air Quality: Look for technical documentation specific to each pollutant (PM2.5, SO2, O3, NO2).
By exploring these resources and contacting CPCB directly if necessary, you should be able to access the required calibration curves and uncertainty data for the specified air quality monitors.
  • asked a question related to Uncertainty
Question
2 answers
I'm looking for a way to measure the uncertainty (standard deviation) on the quantification (area) of the components of an XPS spectrum using CasaXPS.
I found these options in the software, but they don't satisfy me:
1) From the "Quantify (F7)" window, in the "Regions" tab, clicking on "Calculate Error Bars" but it is independent of the fit and changes with each click.
2) From the "Quantify (F7)" window, in the "Components" tab, by clicking on "Monte Carlo", I obtain only the relative value and not the absolute one. But above all, the values do not follow the goodness of the fit: even with components that are not fitted and clearly incorrect, the value is low.
As I have not found these methods to be reliable, my idea is to use the RMS as an estimation of the error on the sum of the areas of the components and then obtain the percentage error of the individual components.
My goal is to provide the composition of my sample, and also the percentages of the various components for each element, all with their measurement errors.
Does anyone know if there is a more automatic and reliable method?
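Outside of CasaXPS, one generic way to attach an uncertainty to component areas, in the same spirit as the RMS idea above, is to perturb the spectrum at the level of the fit residuals and refit many times. The sketch below does this for a synthetic two-component spectrum (all peak parameters, the energy axis, and the noise level are invented for illustration):

# Minimal sketch: Monte Carlo uncertainty on component areas by refitting
# noise-perturbed spectra. Spectrum, peak shapes, and noise level are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, amp, cen, wid):
    return amp * np.exp(-0.5 * ((x - cen) / wid) ** 2)

def model(x, a1, c1, w1, a2, c2, w2):
    return gauss(x, a1, c1, w1) + gauss(x, a2, c2, w2)

rng = np.random.default_rng(4)
x = np.linspace(280, 295, 300)                       # binding-energy axis (eV)
true = model(x, 1000, 285.0, 0.6, 400, 288.5, 0.8)   # two synthetic components
data = true + rng.normal(0, 15, x.size)              # add measurement noise

p0 = [900, 285.2, 0.5, 350, 288.3, 0.7]              # reasonable starting guesses
popt, _ = curve_fit(model, x, data, p0=p0)
resid_rms = np.std(data - model(x, *popt))           # plays the role of the fit RMS

def areas(p):
    # Gaussian area = amplitude * width * sqrt(2*pi)
    return np.array([p[0] * p[2], p[3] * p[5]]) * np.sqrt(2 * np.pi)

draws = []
for _ in range(200):                                 # Monte Carlo refits
    noisy = model(x, *popt) + rng.normal(0, resid_rms, x.size)
    p, _ = curve_fit(model, x, noisy, p0=popt)
    draws.append(areas(p))
draws = np.array(draws)

for i, (a, s) in enumerate(zip(areas(popt), draws.std(axis=0)), start=1):
    print(f"component {i}: area = {a:.0f} +/- {s:.0f} ({100 * s / a:.1f}%)")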
Relevant answer
Answer
Please watch the videos on this topic on the CasaXPS YouTube channel. There are several which will aid you in this.
  • asked a question related to Uncertainty
Question
2 answers
Relevant answer
Answer
The central idea in Jewish philosophy is that God is a singular, indivisible entity beyond human comprehension, distinct from any creation or being. Understanding God as each being's individualized higher self might not be entirely aligned with this. However, Jewish mysticism does talk about the concept of a Divine spark within every living being, indicating a connection and inherent sacredness. Thus, one could think of seeking alignment with their 'higher self' as trying to live in accordance with God's laws and the spark of divine within them. It's important to note that interpretations can vary widely, and other religions or spiritual traditions might have different understandings of the relationship between God and the self.
  • asked a question related to Uncertainty
Question
3 answers
What is the approach to determine the uncertainty in energy and inflation time vs. inflation pressure of a tyre pressure control unit?
How do you measure the uncertainty of the equipment adopted in the regression models developed to measure the energy and inflation time of a tyre pressure control unit?
How do I ascertain the measurement uncertainty?
(Figure: Energy vs. inflation pressure for three radii of the tyre)
Relevant answer
Answer
To determine uncertainty in energy and inflation time versus inflation pressure control in a tyre control unit, the approach typically involves:
  1. Experimental Design: Conduct controlled experiments or tests where you systematically vary energy input and inflation time while measuring inflation pressure using the tyre control unit.
  2. Data Collection: Collect data on energy consumption (input), inflation time, and resulting inflation pressure for each experiment.
  3. Statistical Analysis: Use statistical methods such as regression analysis or analysis of variance (ANOVA) to quantify the relationship between energy input, inflation time, and the resulting inflation pressure, along with its uncertainty.
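A minimal, illustrative sketch of step 3 (all measurements are simulated; statsmodels is used here for the confidence and prediction intervals, but any equivalent tool would do):

# Minimal sketch: regression of inflation pressure on energy input and inflation
# time, with confidence intervals on coefficients and on a new prediction.
# All data are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 40
energy = rng.uniform(1.0, 5.0, n)            # kJ per inflation cycle (made up)
time_s = rng.uniform(5.0, 20.0, n)           # inflation time in seconds (made up)
pressure = 0.4 * energy + 0.05 * time_s + rng.normal(0, 0.1, n)  # bar

X = sm.add_constant(np.column_stack([energy, time_s]))
fit = sm.OLS(pressure, X).fit()
print(fit.params)                            # intercept and coefficients
print(fit.conf_int(alpha=0.05))              # 95% CIs on the coefficients

# Uncertainty of a prediction at a new operating point.
x_new = sm.add_constant(np.array([[3.0, 10.0]]), has_constant='add')
pred = fit.get_prediction(x_new)
print(pred.summary_frame(alpha=0.05))        # mean CI and prediction interval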
  • asked a question related to Uncertainty
Question
29 answers
In particle accelerators, various subatomic particles are collided at exact points with predetermined momentum. The collision must occur at an exact point; otherwise, detectors cannot register any resulting collision products.
In this case, both the position and momentum of the particles are almost precisely known. This raises questions about the practicality and credibility of the Heisenberg Uncertainty Principle (HUP). Your informed comments would be highly appreciated.
For more critical analysis of the HUP, please see:
Relevant answer
Answer
I do not want to divert the discussion but this is what you said.
  • "I did not say that Heisenberg's considerations are "baseless"."
  • "you should not bother too much about Heisenberg's paper."
  • "The HUP is "correct" as it has a clear and correct mathematical content. The wordings are not that important after all. The "concept" of virtual particles was introduced (as far as I know) as a way to speak of correction terms in QED. Maybe one should not overstretch it, since even the concept of a "real" particle is not unquestionable and self-evident..."
We are told that Heisenberg's initial idea was not well thought out, but its replacement is not based on any experiment or clear theory. Instead, it is a mathematically workable wave function created to fully comply with Heisenberg's initial idea, which we believe should not be taken too seriously. This is why my first question concerns the theoretical or experimental basis of the HUP.
I apologize if you still think my interpretation of your comments is not correct.
  • asked a question related to Uncertainty
Question
23 answers
The origin of Heisenberg's uncertainty principle can be better understood through the lens of complex vector spaces. In my paper " ," I explore how representing complementary variables as complex numbers provides a deeper insight into quantum mechanics.
  1. Position and Momentum: By representing position (x) and momentum (p) as complex variables, the uncertainty principle is expressed as the product of their uncertainties having a lower bound related to Planck's constant. This formulation highlights the intrinsic uncertainties and the probabilistic nature of measurements in quantum mechanics.
  2. Energy and Time: Similarly, energy and time uncertainties are expressed in complex terms, showing the internal vibrations of particles and their states in a complex vector space. This provides a more comprehensive understanding of quantum uncertainties.
  3. Physical Origin of Uncertainty: The physical origin of Heisenberg's uncertainty principle is attributed to the vibrations and interactions of particles in the complex plane. This complex representation provides insight into why there is a lower limit to the precision with which complementary variables can be measured simultaneously.
These points illustrate how the use of complex numbers in quantum mechanics aligns with the holographic principle ( ) and offers a unified framework for understanding quantum phenomena. For a detailed exploration of these ideas, you can refer to my paper available on ResearchGate.
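For reference, the standard textbook relations that any such complex-space reformulation would need to reproduce are (these are the usual statements, not results from the paper under discussion):

% Standard Heisenberg/Robertson uncertainty relations (textbook forms).
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2},
\qquad
\Delta E \, \Delta t \;\gtrsim\; \frac{\hbar}{2},
\qquad
\sigma_A \, \sigma_B \;\ge\; \frac{1}{2}\left|\langle [\hat{A},\hat{B}] \rangle\right|
\quad\text{(Robertson, for any observables } \hat{A},\hat{B}\text{)}.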
Relevant answer
Answer
Thinking simply, a person quickly realizes that the laws of nature occur in the same form in all dimensions... The fact that everything looks different is precisely due to the difference in the scale of the dimensions, and to the way we try to relate to them from our own level.
This leads to the very wrong conclusion that the comparison from our dimension was done incorrectly. Heisenberg's Uncertainty Principle was excellent for eliminating this.
Regards,
Laszlo
  • asked a question related to Uncertainty
Question
7 answers
I am exploring the relationship between the uncertainty in the coordinates of the center of mass of a rigid body and the uncertainty of the corresponding elements of the inertia tensor. Any idea that would be applicable in a more or less general way? Of course, a Monte Carlo simulation would be useful and I am planning to do it, but I am thinking more of an analytical relationship. My idea is not to have to perform the calculations starting from the mass distribution but from the center of mass position (i.e. given a center of mass displacement from the most expected value, the inertia tensor changes in this or that way).
Relevant answer
Answer
The inertia tensor is a function of several geometric and material parameters. If they are assumed as random variables following a certain PDF, it should be possible - in principle - to compute the PDF of the inertia tensor in closed form by applying transformation formulas of random variables
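One concrete, hedged starting point: if the mass distribution itself is held fixed and only the location of the center of mass is uncertain, the parallel-axis theorem, I_ref = I_cm + m(|d|^2 E - d d^T), already gives the analytical dependence of the inertia tensor about a reference point on the COM offset d, so the COM uncertainty can be propagated analytically or by a cheap Monte Carlo without re-integrating the mass distribution. A minimal sketch (all numerical values invented):

# Minimal sketch: propagate center-of-mass position uncertainty into the
# inertia tensor about a fixed point via the parallel-axis theorem.
# Mass, nominal COM, I_cm and the COM covariance are invented values.
import numpy as np

m = 12.0                                        # kg
d0 = np.array([0.10, -0.05, 0.20])              # nominal COM position (m)
I_cm = np.diag([0.8, 1.1, 0.6])                 # inertia tensor about the COM (kg m^2)
cov_d = np.diag([0.002, 0.002, 0.003]) ** 2     # COM position covariance (m^2)

def parallel_axis(d):
    return I_cm + m * (np.dot(d, d) * np.eye(3) - np.outer(d, d))

rng = np.random.default_rng(6)
samples = rng.multivariate_normal(d0, cov_d, size=5000)
tensors = np.array([parallel_axis(d) for d in samples])

print("mean I about reference point:\n", tensors.mean(axis=0))
print("element-wise std of I:\n", tensors.std(axis=0))

Note that this captures only the effect of mis-locating the reference point relative to the true COM, not changes in the underlying mass distribution, which is the stated aim of working from the COM position alone.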
  • asked a question related to Uncertainty
Question
1 answer
For example: Climate Sensitivity
Climate sensitivity refers to the change in surface air temperature resulting from the radiative forcing of a doubling of atmospheric CO2 concentration. Estimates of climate sensitivity range from about 2°C to 5°C per doubling of CO2. This range exists due to uncertainties in atmospheric physics, particularly feedback mechanisms between various atmospheric states and surface air temperature or CO2 concentration. For instance, the impact of changes in radiative forcing on cloud processes is not well understood, leaving uncertainty about whether clouds will mitigate or amplify warming.
Below is a list of the main feedbacks in the climate system:
  • Water vapor feedback
  • Lapse rate feedback
  • Ice albedo feedback
  • Carbon cycle feedback
Climate scientists dedicate their careers to refining our understanding of these feedbacks. They use paleoclimate data, model experiments, and modern measurements to address these uncertainties. However, uncertainties remain in each feedback, and when combined, they create a complex, high-dimensional problem. Depending on the true magnitude of these feedbacks, climate sensitivity could range from 2°C to 5°C.
I am looking for other such aspects, and the most relevant related questions arising among different communities.
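As a small illustration of why feedback uncertainty produces such a spread (a standard linear-feedback toy calculation; the no-feedback warming of about 1.2°C and the feedback-factor range are assumed illustrative values, not taken from the text above):

# Minimal sketch: how uncertainty in the total feedback factor f maps into the
# spread of climate sensitivity via dT = dT0 / (1 - f). Numbers are illustrative.
import numpy as np

dT0 = 1.2                                   # no-feedback warming for 2xCO2 (deg C)
f = np.array([0.40, 0.55, 0.70])            # low / central / high total feedback factor
sensitivity = dT0 / (1.0 - f)

for fi, s in zip(f, sensitivity):
    print(f"feedback factor f = {fi:.2f} -> climate sensitivity ~= {s:.1f} deg C")
# A modest, symmetric uncertainty in f produces an asymmetric, fat upper tail in dT.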
Relevant answer
Answer
What occurs first, the change in carbon concentration or the change in temperature? Temperature rise leads the change in CO2 concentration, indicating that temperature appears to be the driver.
  • asked a question related to Uncertainty
Question
2 answers
Hydrologic studies on different basins, while focused on distinct geographic areas, share several key similarities in terms of methodology, objectives, and challenges. These similarities can be categorized as follows:
1. Core Objectives
2. Data Collection and Analysis
3. Methodological Approaches
4. Challenges and Uncertainties
5. Integrated Management Approaches
6. Applications and Outcomes
By focusing on these common aspects, hydrologic studies can effectively address the unique characteristics and challenges of different basins while leveraging shared methodologies and goals to advance the understanding and management of water resources globally.
What is your opinion ?
Relevant answer
Answer
Hydrologic studies encompass a wide range of investigations aimed at understanding the movement, distribution, and quality of water in the environment. Common aspects of hydrologic studies typically include the following:
1. Precipitation Analysis
  • Measurement: Collecting data on the amount, intensity, and distribution of rainfall and snowfall.
  • Analysis: Understanding precipitation patterns and trends, often using meteorological data, satellite imagery, and rain gauges.
2. Surface Water Hydrology
  • Streamflow Measurement: Gauging the flow of rivers and streams using flow meters, weirs, and flumes.
  • Hydrograph Analysis: Examining the relationship between precipitation and river discharge over time.
  • Watershed Delineation: Defining the geographical boundaries of watersheds and studying their characteristics.
3. Groundwater Hydrology
  • Aquifer Characterization: Assessing the properties and extents of aquifers through drilling, well logging, and geophysical surveys.
  • Groundwater Flow: Modeling the movement of groundwater using Darcy’s law and other hydrogeological principles.
  • Recharge and Discharge: Evaluating the rates and sources of groundwater recharge and discharge.
4. Water Quality Assessment
  • Chemical Analysis: Testing for contaminants, nutrients, and pollutants in water samples.
  • Biological Assessment: Evaluating the presence of microorganisms, algae, and other biological indicators of water quality.
  • Physical Properties: Measuring parameters like temperature, turbidity, and sediment load.
5. Hydrologic Modeling
  • Simulation Models: Developing and using models to simulate hydrological processes and predict future water conditions (e.g., SWAT, HEC-HMS).
  • Calibration and Validation: Adjusting models to fit observed data and validating their accuracy.
6. Flood and Drought Analysis
  • Flood Risk Assessment: Identifying flood-prone areas, estimating flood frequencies, and modeling flood inundation scenarios.
  • Drought Monitoring: Tracking drought conditions using indices like the Standardized Precipitation Index (SPI) and assessing their impacts on water resources.
7. Water Balance Studies
  • Input-Output Analysis: Quantifying the inputs (precipitation, inflow) and outputs (evaporation, outflow) of a hydrological system to determine changes in water storage.
  • Evapotranspiration Measurement: Estimating the amount of water lost to the atmosphere through evaporation and plant transpiration.
8. Hydraulic Studies
  • Flow Dynamics: Studying the movement and distribution of water in channels, pipes, and other conduits.
  • Sediment Transport: Investigating the processes of erosion, transport, and deposition of sediments in water bodies.
9. Climate Change Impact Studies
  • Trend Analysis: Assessing long-term changes in hydrological patterns due to climate change.
  • Scenario Modeling: Projecting future water conditions under various climate change scenarios.
10. Human Impact Studies
  • Land Use Changes: Examining the effects of urbanization, deforestation, and agriculture on hydrological processes.
  • Water Management Practices: Evaluating the impact of water extraction, dam construction, and irrigation on water availability and quality.
11. Integrated Water Resources Management (IWRM)
  • Stakeholder Involvement: Engaging communities, policymakers, and scientists in managing water resources.
  • Sustainable Practices: Developing and promoting practices that ensure the long-term sustainability of water resources.
These aspects provide a comprehensive understanding of water resources, which is crucial for effective management and decision-making in various sectors such as agriculture, urban planning, environmental conservation, and disaster management.
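As a tiny illustration of item 7 above, here is a toy monthly catchment water balance, ΔS = P − ET − Q, with invented numbers:

# Minimal sketch: a toy monthly catchment water balance, dS = P - ET - Q.
# All values are invented; units are mm of water depth over the catchment.
precip = [120, 95, 60, 30, 10, 5]        # monthly precipitation (mm)
et =     [40, 45, 55, 60, 65, 60]        # monthly evapotranspiration (mm)
runoff = [50, 40, 25, 10, 5, 2]          # monthly streamflow as depth (mm)

storage = 0.0
for month, (p, e, q) in enumerate(zip(precip, et, runoff), start=1):
    ds = p - e - q                       # change in storage this month
    storage += ds
    print(f"month {month}: dS = {ds:+.0f} mm, cumulative storage change = {storage:+.0f} mm")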
  • asked a question related to Uncertainty
Question
2 answers
How and to what extent can Total Cost Management evolve towards Systemic Value Management (SVM)? In this regard, what are the “Validating Principles and Actions for a Global Practitioner Movement”?
The persistence of structural crises and the widespread uncertainty and volatility of any forecast on times, costs and resources for complex projects have nowadays become a challenge for managerial disciplines and for Total Cost Management in particular. The value, policy and social aspects seem to be taking on ever greater importance for each project, transforming it into a very complex project. The AICE – The Italian Association for Total Cost Management is engaged in research in this regard (see for example https://www.mdpi.com/2071-1050/14/19/12890 ) and asks experts in these disciplines to participate in an ongoing survey in this regard.
Can you make your contribution by participating in this survey and expressing your point of view? You will find every detail at the following link https://forms.gle/PEpZ8YTn8yTdSg5z6
Relevant answer
Answer
Dmytro Chashyn Thank you Dmytro for your suggestion. We will keep record of your comment in our analysis.
  • asked a question related to Uncertainty
Question
1 answer
In my thesis on reverse logistics, I explore the use of the ANP-TOPSIS hybrid method in handling uncertainty. I'm interested in hearing from researchers about the effectiveness of 'Rough Set,' 'Neutrosophic,' or 'Fuzzy' theories in addressing uncertainty within supplier selection processes. Any insights on these approaches would be greatly appreciated!
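For readers weighing the options named above, here is a minimal, self-contained illustration of the rough-set idea - lower and upper approximations of a decision class over a tiny, invented supplier table (fuzzy and Neutrosophic approaches would instead attach membership, and in the Neutrosophic case indeterminacy, degrees):

# Minimal sketch: rough-set lower/upper approximation of "good supplier"
# over an invented decision table with condition attributes (cost, quality).
suppliers = {
    "S1": ("low",  "high"), "S2": ("low",  "high"),
    "S3": ("high", "low"),  "S4": ("low",  "low"),
    "S5": ("high", "low"),
}
good = {"S1", "S4"}                       # decision class (invented labels)

# Indiscernibility classes: suppliers with identical condition attributes.
classes = {}
for s, attrs in suppliers.items():
    classes.setdefault(attrs, set()).add(s)

lower = set().union(*(c for c in classes.values() if c <= good))
upper = set().union(*(c for c in classes.values() if c & good))

print("lower approximation (certainly good):", sorted(lower))
print("upper approximation (possibly good): ", sorted(upper))
print("boundary (uncertain):                ", sorted(upper - lower))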
Relevant answer
Answer
Some articles on Fuzzy Rough Sets and Rough Fuzzy Sets can be followed before adding the concept of the Neutrosophic set.
  • asked a question related to Uncertainty
Question
3 answers
In the context of machine learning models for healthcare that predominantly handle discrete data and require high interpretability and simplicity, which approach offers more advantages:
Rough Set Theory or Neutrosophic Logic?
I invite experts to share their insights or experiences regarding the effectiveness, challenges, and suitability of these methodologies in managing uncertainties within health applications.
Relevant answer
Answer
I appreciate the resources shared by R.Eugene Veniaminovich Lutsenko.
However, these references seem to focus on a different aspect of healthcare modeling. I'm still interested in gathering insights specifically about the suitability of Rough Set Theory and Neutrosophic Logic for handling discrete data in machine learning healthcare models.
Please feel free to contribute to this discussion if you have expertise in this area. Thank you
  • asked a question related to Uncertainty
Question
1 answer
I encountered challenges while conducting GSEM due to uncertainties in running the analysis with solely observed variables. This was particularly prominent when dealing with categorical variables such as x, y, and z, where all of them served as dependent variables.
Relevant answer
Answer
Running a Generalized Structural Equation Model (GSEM) with only observed variables can indeed present some challenges, as GSEM typically involves both observed and latent variables. However, it's still possible to conduct analyses using observed variables only. Here's how you can approach it:
  1. Define Your Research Questions: Clearly articulate your research questions and hypotheses. Determine what relationships you want to explore among your observed variables.
  2. Choose an Appropriate Analysis Method: With only observed variables, you may opt for simpler statistical techniques such as multiple regression analysis, analysis of variance (ANOVA), or logistic regression, depending on the nature of your data and research questions. These methods allow you to assess the relationships between independent and dependent variables directly.
  3. Consider Mediation and Moderation: While traditional regression analysis can assess direct relationships between variables, you can also explore mediation and moderation effects if your research questions involve understanding the underlying mechanisms or conditional relationships between variables.
  4. Check Assumptions: Ensure that the assumptions of the chosen analysis method are met. This may include checking for normality, linearity, homoscedasticity, and independence of errors.
  5. Interpret Results: Interpret the results of your analysis in line with your research questions and hypotheses. Consider the strengths and limitations of using observed variables only and discuss any implications for your findings.
  6. Explore Alternative Models: If your initial analysis does not fully address your research questions, consider alternative modeling approaches or data collection strategies that may allow for a more comprehensive analysis in the future.
While GSEM is a powerful tool for analyzing complex relationships among observed and latent variables, conducting analyses with only observed variables is a common practice in many research fields. By carefully framing your research questions and selecting appropriate analysis methods, you can still gain valuable insights from your data.
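To make step 2 concrete, here is a minimal sketch in Python (statsmodels) with simulated data; the variable names x, y and the binary outcome z are purely illustrative and not the asker's actual dataset. It shows a direct regression among observed variables and a logistic regression for a categorical dependent variable.
```python
# Minimal sketch with simulated observed variables (no latent constructs).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-(0.8 * x - 0.4 * y)))   # noisy link, avoids perfect separation
z = rng.binomial(1, p)                            # binary (categorical) outcome
df = pd.DataFrame({"x": x, "y": y, "z": z})

ols_fit = smf.ols("y ~ x", data=df).fit()                    # direct relationship
logit_fit = smf.logit("z ~ x + y", data=df).fit(disp=False)  # categorical dependent variable
print(ols_fit.params)
print(logit_fit.params)
```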
  • asked a question related to Uncertainty
Question
1 answer
I've fitted a latent growth mixture model to time series data. It consists of a value (population prevalence) at 11 time points for a sample of 150 areas. Said model was fitted using the lcmm package in R and identified a two class model as optimal - reflecting an hlme model as follows:
```r
gmm2 <- gridsearch(rep = 1500, maxiter = 50, minit = gmm1,
                   hlme(Value ~ jrtime, subject = "ID", random = ~jrtime,
                        ng = 2, data = rec, mixture = ~jrtime, nwg = T))
```
Said model uses the "Value"/prevalence data for each area as the primary variable. However, the original "Value" column within the data also relates to 95% confidence intervals (reflecting different sample sizes which contributed to the observations at each time point and for each area). Under the current approach all observations of "Value" are treated equally. Should I (and is there a good method through which to) account for the differing levels of uncertainty in "Value" as part of my lcmm?
I wondered if such could be coded into the package, but this does not appear to be the case. Therefore I wondered whether a manual account could be taken (such as adding the CI range as a covariate)? However, I also wondered whether there was scope to add such as part of the prior setting if applying an alternative Bayesian approach.
Any advice/links to relevant literature (or shareable code) would be hugely appreciated.
Relevant answer
Answer
Yes, it is important to account for uncertainty in Latent Class Growth Mixture Models (LCGMM). These models are used to identify homogeneous subpopulations within a heterogeneous population, and they help to identify meaningful groups or classes of individuals.
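One simple starting point (assuming the 95% intervals are symmetric and stored in hypothetical columns named "lower" and "upper", which are not necessarily the asker's actual column names) is to convert each interval into an approximate standard error and an inverse-variance weight; the weight or the standard error could then be carried as a covariate, or used as an observation-level scale in a Bayesian refit.
```python
# Sketch only: approximate SE and inverse-variance weight from a symmetric 95% CI.
# Column names "Value", "lower", "upper" are assumptions, not the asker's actual data.
import pandas as pd

rec = pd.DataFrame({
    "Value": [12.0, 15.5, 9.8],
    "lower": [10.0, 14.0, 7.5],
    "upper": [14.0, 17.0, 12.1],
})
rec["se"] = (rec["upper"] - rec["lower"]) / (2 * 1.96)  # CI half-width divided by 1.96
rec["weight"] = 1.0 / rec["se"] ** 2                    # inverse-variance weight
print(rec)
```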
  • asked a question related to Uncertainty
Question
4 answers
The three spatial dimensions (x, y and z) of spacetime can be physically demonstrated. However, what about the time dimension? Mathematically, time is a fundamental part of spacetime.
ds² = c²dt² − dx² − dy² − dz²
However, physically where does the time dimension reside? The physical constants (c, G and ħ) all incorporate time. This is not just time in the abstract. Gravity and relative motion affect the local rate of time. On an absolute scale, these constants (c, G and ħ) require coordination with the local rate of time. How do you visualize the time component of spacetime?
This is a discussion question, so I will give my answer to start the discussion. John Archibald Wheeler proposed that the uncertainty principle and vacuum zero-point energy require that, on the scale of the Planck length, spacetime must be oscillating at the Planck frequency. He designated this "quantum foam". This oscillation (internal clock) would give every volume of spacetime its intrinsic time dimension. This oscillating spacetime model of the universe ultimately generates a fundamental particle's gravity and electrical charge.
Relevant answer
Answer
The term "classical physics" can imply either physics without relativity and wave properties or merely conceptually understandable physics. The model of the universe I am proposing is based entirely on quantized waves (no point excitations of fields). This wave-based model of the universe is classical in the sense that it is conceptually understandable. For example, it achieves the properties of special relativity (length contraction, time dilation, Lorentz transformations) from the wave-based sonic universe model. The electromagnetic and gravitational forces are also obtained from this wave-based model.
Perhaps most important, this model is supported by the correct predictions it makes. In the proposed model, oscillating spacetime is a sonic medium that generates everything in the universe. If this is correct, then everything, including the forces, is predicted to be united. For example, this model predicts that the gravitational and electrostatic forces between particles become equal at the Planck limit. The gravitational force between two Planck mass particles is predicted to equal the electrostatic force magnitude between two particles with Planck charge. Amazingly, this prediction is correct. Both force magnitudes equal ħc/r² at arbitrary separation distance r. This and other predictions emerge from the wave-based model of the universe.
  • asked a question related to Uncertainty
Question
23 answers
Can Physical Constants Which Are Obtained with Combinations of Fundamental Physical Constants Have a More Fundamental Nature?
Planck Scales (Planck's 'units of measurement') are different combinations of the three physical constants h, c, G, Planck Scales=f(c,h,G):
Planck Time: tp=√(ℏG/c^5)=5.39x10^-44 s ......(1)
Planck Length: Lp=√(ℏG/c^3)=1.62x10^-35 m ......(2)
Planck Mass: Mp=√(ℏc/G)=2.18x10^-8 kg ......(3)
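(A quick numerical check of Eqs. (1)-(3) using CODATA values; illustrative only.)
```python
# Numerical check of Eqs. (1)-(3) from CODATA values of hbar, G and c.
import math

hbar = 1.054571817e-34   # J s
G = 6.67430e-11          # m^3 kg^-1 s^-2
c = 2.99792458e8         # m / s

t_p = math.sqrt(hbar * G / c**5)   # ~ 5.39e-44 s
l_p = math.sqrt(hbar * G / c**3)   # ~ 1.62e-35 m
m_p = math.sqrt(hbar * c / G)      # ~ 2.18e-8 kg
print(t_p, l_p, m_p)
```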
“These quantities will retain their natural meaning for as long as the laws of gravity, the propagation of light in vacuum and the two principles of the theory of heat hold, and, even if measured by different intelligences and using different methods, must always remain the same.”[1] And because of the possible relation between Mp and the Schwarzschild radius, and the possible generalized uncertainty principle [2], they are relied upon as a basis for new physics [3]. But what exactly is their natural meaning?
However, the physical constants, the speed of light, c, the Planck constant, h, and the gravitational constant, G, are clear, fundamental, and invariant.
c: bounds the relationship between Space and Time, with c = ΔL/ Δt, and Lorentz invariance [4];
h: bounds the relationship between Energy and Momentum with h=E/ν = Pλ, and energy-momentum conservation [5][6];
G: bounds the relationship between Space-Time and Energy-Momentum, with the Einstein field equation c^4* Gμν = (8πG) * Tμν, and general covariance [7].
The physical constants c, h, G already determine all fundamental physical phenomena‡. So, can the Planck Scales obtained by combining them be even more fundamental than they are? Could it be that the essence of physics is (c, h, G) = f(tp, Lp, Mp), rather than equations (1), (2), (3)? From what physical fact, or what physical imagination, are we supposed to get this notion? Never having seen such an argument, we just take it and use it, while still recognizing the fundamentality of c, h, G. Obviously, the Planck Scales are not fundamental physical constants; they can only be regarded as a kind of 'unit of measurement'.
So are they a kind of parameter? According to Eqs. (1), (2), (3), tp, Lp, Mp can be directly replaced by c, h, G, and the substituted expression loses its meaning.
So are they a principle? Then what do they express? What kind of behavioral pattern is expressed? The theory of quantum gravity takes them as a "baseline", but only in an order-of-magnitude sense, not in an exact numerical value.
Thus, Planck time, length, mass, determined entirely by h, c, G, do they really have unquestionable physical significance?
-----------------------------------------
Notes
‡ Please ignore for the moment the phenomena within the nucleus of the atom, eventually we will understand that they are still determined by these three constants.
-----------------------------------------
References
[1] Robotti, N. and M. Badino (2001). "Max Planck and the 'Constants of Nature'." Annals of Science 58(2): 137-162.
[2] Maggiore, M. (1993). A generalized uncertainty principle in quantum gravity. Physics Letters B, 304(1), 65-69. https://doi.org/https://doi.org/10.1016/0370-2693(93)91401-8
[3] Kiefer, C. (2006). Quantum gravity: general introduction and recent developments. Annalen der Physik, 518(1-2), 129-148.
[4] Einstein, A. (1905). On the electrodynamics of moving bodies. Annalen der Physik, 17(10), 891-921.
[5] Planck, M. (1900). The theory of heat radiation (1914 (Translation) ed., Vol. 144).
[6] Einstein, A. (1917). Physikalische Zeitschrift, xviii, p. 121.
[7] Petruzziello, L. (2020). A dissertation on General Covariance and its application in particle physics. Journal of Physics: Conference Series,
Relevant answer
Answer
The Planck scales, including Planck length, Planck time, Planck mass, Planck temperature, and Planck charge, are a set of physical constants that define scales at which quantum gravitational effects become significant, effectively marking the limits of our current understanding of the universe. These scales arise from fundamental physical constants: the speed of light in a vacuum (c), the gravitational constant (G), and the reduced Planck constant (ħ).
And yes, the gravitational constant G is fundamental as far as our observations and experiments show.
Constants:
In one sense, Planck scales can be considered constants because they are defined through a combination of other fundamental physical constants that do not change. They represent the scales at which gravitational interactions become as strong as quantum effects, leading to a regime where our current theories of physics—quantum mechanics and general relativity—no longer independently suffice.
Parameters:
Planck scales could also be seen as parameters within the broader context of theoretical physics and cosmology. They parameterize the scales at which new physics—potentially including quantum gravity, string theory, or other unified theories—must be invoked to accurately describe phenomena. In theoretical models extending beyond the Standard Model and General Relativity, the exact implications of these scales and their relevance can vary, making them parameters that guide our exploration of the universe at its most fundamental level.
Principles:
Viewing Planck scales as principles is a more abstract approach but equally valid. They embody the principle that there is a fundamental scale of distance, time, mass, and energy beyond which the classical descriptions of space-time and matter cease to apply and a more fundamental theory is required. This perspective invites reflection on the limits of our current theories and the principles that any future theory of quantum gravity must satisfy to seamlessly bridge the gap between quantum mechanics and general relativity.
In summary, Planck scales can be interpreted as constants, parameters, or principles depending on the context of the discussion and the framework within which they are being considered. As constants, they are fixed values derived from fundamental constants of nature. As parameters, they guide theoretical and experimental research into the realms of high energy physics and quantum gravity. As principles, they represent conceptual boundaries that challenge and inspire the development of new physics.
  • asked a question related to Uncertainty
Question
4 answers
What is the TRMM satellite precipitation program? And how can it help humans?
as you know :
GPM can provide worldwide rain and snow data at any time
using microwave and infrared technology. GPM extends the TRMM sensor package and improves the ability to observe precipitation: the GPM Core Observatory carries a dual-frequency radar (Ku and Ka bands), compared with TRMM's single-frequency radar, and a microwave radiometer with additional high-frequency channels; as a result, the ability to observe light and solid precipitation is increased. Monthly in situ gauge data are used in the final products. GPM therefore provides very accurate and detailed rainfall measurements, for example across India, and its data enable researchers to study various hydrological applications such as climate research, drought monitoring, flood forecasting, agricultural planning, etc. Uncertainty in satellite precipitation data is caused by several factors, including the spatial and temporal scales of the study; key factors that have been reported include instrumental uncertainty, sampling uncertainty, retrieval-algorithm uncertainty, regional and topographic effects, and ancillary data, all of which require attention.
Relevant answer
Answer
Dear James Garry, dear doctor, thank you, Dr. James Garry
  • asked a question related to Uncertainty
Question
5 answers
I am keen to have you share your knowledge and experience, as I am working on my thesis about drawing road maps for new shipyards, giving them a systematic path towards lean in their manufacturing processes. However, here I am talking about ship repair or rig repair, not newbuilding, where the difference is huge with respect to uncertainty, scope and the short time frame.
Relevant answer
Answer
ResearchGate Link:
(PDF) Improving Lean engagement through utilising improved communication, recognition and digitalisation during the COVID-19 pandemic in JLR's powertrain machining facility (researchgate.net)
  • asked a question related to Uncertainty
Question
10 answers
Suppose I am making a solution of a particular concentration by stepwise dilution from a stock solution. If I use the same volumetric flask twice for consecutive dilutions, do I have to add the uncertainty of that volumetric flask two (2) times or only one (1) time in the calculation of the total measurement uncertainty?
Relevant answer
Answer
It is not at all implied from my article (and from my answer to your question either) that you can avoid adding the uncertainty arising from repeat use of the volumetric ware (regardless of using it for delivery or not). I wrote in my comment that because of repeated counting the uncertainty derived from the volumetric tolerance, that is, the tolerance divided by the square root of six (see the first line in Table 2(a) in the paper), the resulting uncertainty will be overestimated. (Overestimated as compared to that uncertainty value that would be obtained if the identity of volumetric ware used at several steps would be taken into account.) Also, I wrote that in practice it is reasonable to disregard this redundancy in volumetric uncertainty evaluation. Thus, answering specifically to your question, you should add (by root-sum-squaring) the uncertainty derived from the tolerance for your volumetric flask and your pipette, respectively, two (2) times.
Concerning the uncertainty evaluation procedure described in the QUAM95 Guide and its subsequent editions, I am very familiar with it, as it was criticized in my shared paper and in the earlier paper attached here. The weak points in that procedure are: 1) the joint use of the specified tolerances and the experimentally estimated variability, while only the first one converted into the standard uncertainty is required in the tolerance-based approach; 2) the use of the value of √3 in the denominator to convert the specified tolerance into the standard uncertainty, assuming a rectangular probability distribution instead of a triangular distribution that is much more realistic in this case. Otherwise, what is described in the Guide is not objectionable. Note that the case of the same glassware used in the dilution chain is not considered there, since “in practice, different pipettes and flasks (of the same nominal capacity) will be used” (QUAM95, p. 37, left column above). Also note that the uncertainty arising from temperature effects should additionally be taken into account.
Best,
Rouvim Kadis
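As a numeric illustration of the root-sum-square combination described above (the glassware volumes and tolerances below are assumed example values, and a triangular distribution is assumed for the tolerance-to-uncertainty conversion):
```python
# Illustrative numbers only, not the asker's actual glassware.
import math

V_flask, tol_flask = 100.0, 0.08   # mL nominal volume and tolerance of the flask
V_pip, tol_pip = 10.0, 0.02        # mL nominal volume and tolerance of the pipette

# Tolerance -> standard uncertainty assuming a triangular distribution: a / sqrt(6).
u_flask = tol_flask / math.sqrt(6)
u_pip = tol_pip / math.sqrt(6)

# Two consecutive dilution steps with the same flask and pipette:
# each use is counted again, and relative uncertainties are root-sum-squared.
u_rel = math.sqrt(2 * (u_flask / V_flask) ** 2 + 2 * (u_pip / V_pip) ** 2)
print(f"Relative standard uncertainty of the overall dilution factor: {u_rel:.5f}")
```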
  • asked a question related to Uncertainty
Question
1 answer
Unfortunately, I do not have access to any reliable or relatable databases to complete a Life Cycle Assessment of Bitcoin mining, so I have begun completing the task manually. I was wondering if anyone had any guidance on calculating uncertainties or performing sensitivity analysis so that I can account for errors in my study.
Any tips regarding this or conducting a life cycle assessment manually will be greatly appreciated!
Relevant answer
Answer
  • asked a question related to Uncertainty
Question
3 answers
We resolve the problem of the cosmological constant discrepancy.
In order to do this, we assume a Single Uncertainty Sphere, meaning that only one exists in the whole Universe; in that way the energy density is reduced by 122 orders of magnitude, making the theoretical value of the cosmological constant coherent with observations.
This hypothesis has a much larger impact, not only on astrophysics but also on how we should perceive the way space is constructed.
Feel free to comment and encourage for discussion about this assumption.
Do you think it is too abstract and crazy?
Please refer to this preprint to see more details:
Relevant answer
Answer
Even QM once seemed crazy and radical, verging on nonsensical.
Galileo was imprisoned for his …
Keep at it; the universe does owe us logical and easy explanations.
  • asked a question related to Uncertainty
Question
4 answers
Suppose L_p is the usual Lebesgue space over (0,1) if you wish. Suppose T_j: L_1 --> L_2 defines a sequence of continuous linear operators. Suppose l_1(L_1) is the Banach space of sequences from L_1 with norm (f_j)_j --> ||f_1|| + ||f_2|| + ... . Suppose L_2(l_inf) is the Banach space of sequences (f_j)_j from L_2 with the norm (f_j)_j --> || sup_j |f_j| ||. Finally, suppose T: l_1(L_1) --> L_2(l_inf) is a linear map defined by
(f_j)_j-->(T_j(f_j))_j.
It seems to me that the fact that T is well-defined, i.e. all outputs are in L_2(l_inf), AND each T_j is continuous implies T is continuous by the closed graph theorem. This is because the candidate limit (f_j)_j when arguing T has a closed graph has to satisfy f_j=T_j(x_j) where (x_j^n)_j converges to (x_j)_j in L_1(l_1).
My uncertainty stems from the following example. Fix T_1 and let T_j=log(j+9)T_1 for j>2. Since this sequence (T_j)_j is not uniformly L_1-->L_2 bounded, the corresponding operator T cannot be bounded(continuous). However, the slow growth of the operator norms is slow enough so that for (f_j)_j in L_1(l_1),
||sup_{j\le N}|T_j(f_j)|||<=||T_1||(sum_j (log(j+9))^2||f_j||^2)^{1/2}.
I'm just estimating by replacing maximal function on left with square function within the L_2 norm. In other words, since (f_j) in L^1(l_1), the right side of the inequality is finite and independent of N. Does this not imply T is well-defined from L_1(l_1) into L_2(l_inf) and thus contradicting the closed graph theorem argument above.
What am I missing? What dumb oversight am I not seeing?
Relevant answer
Answer
Thanks, don't know what I was thinking.
  • asked a question related to Uncertainty
Question
2 answers
How to Reasonably Weight the Uncertainty of Laser Tracker and the Mean Square Error of Level to Obtain Accurate H(Z)-value?
Relevant answer
Answer
Received, thank you for your answer.
  • asked a question related to Uncertainty
Question
1 answer
In an era defined by the digitization of measurement processes and the increasing use of artificial intelligence, do you believe that focusing on the digitization of high-precision measurements through AI-based applications is advantageous? For instance, imagine an AI interpreter for analog instruments using optical vision.
This question arises considering the questionable reliability inherent in AI, which is based on probabilistic algorithms that can generate precise but not necessarily infallible measurements. Additionally, there is complexity in evaluating the uncertainty of automatic measurements, considering environmental factors such as lighting and the quality of the optical viewer that could affect the reliability of results. How can we balance the promise of AI precision with the need for absolute reliability in high-precision metrology, especially concerning traceability to primary standards?
Relevant answer
Answer
This seems like the last place to use unverified AI conclusions. Here we need conventional, provable fact and derivation, not a cauldron of probabilistic beliefs and correlations.
  • asked a question related to Uncertainty
Question
1 answer
WHAT DOES THE AGNOSTIC STATEMENT, postulated by Dawkins:
„GOD ALMOST CERTAINLY DOES NOT EXIST”,
actually imply?
  • Is this statement in line with logic and reason, or does it represent Dawkins' individual view on the existence of god, intended to flatter both believers and non-believers?
  • Is this agnostic statement correct just because it has many followers, among whom there are even well-known scientists?
  • Anyway, is it appropriate for a scientist like Richard Dawkins, nota bene from OXFORD - the world-renowned university - to declare himself a religious agnostic?
Answering these questions based on argumentum ad rem, you'll come to the right answer.
_________________________________________
* More detailed info in my video lecture on YouTube:
>THE FICTION OF AGNOSTICISM: CANI vs HUXLEY. TIME FOR NEW DEFINITIONS!
> INTERMEDIALISM vs AGNOSTICISM: CANI'S COMPLEX PLANE OF RELIGIOSITY vs DAWKINS’ LINEAR SCALE
Relevant answer
Answer
Well, he raises a really good question.
About God's own existence: God always was, is and will be.
Dawkins questions how it is possible that someone is infinitely full of life.
  • asked a question related to Uncertainty
Question
4 answers
How do humans handle the anxiety of uncertainty about entering eternal salvation? How? Why?
Relevant answer
Answer
I suppose deathbed conversions or confessions are one way.
  • asked a question related to Uncertainty
Question
1 answer
For example, there is no doubt that global sea level is rising, and based on global mean sea level (GMSL) data we can calculate the trend of the GMSL. However, we all know that there must be interannual/decadal variations in the GMSL, and even aliasing errors in our data. We can get the linear trend of the GMSL time series with the least-squares method. However, how can we estimate the uncertainty range of this trend? 1) GMSL time series have autocorrelation; 2) the variations of the GMSL time series are not white noise, and the standard deviation of the GMSL anomalies is not 1.
Relevant answer
Answer
I suggest that you employ an ARCH/ARMAX method where the ARCH component models the conditional variance of the error term, the ARMA component models the autoregressive nature of your data, and the X component models the effects of the exogenous variables.
Here is a link to a recent application of the method:
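As a minimal sketch of that idea (synthetic data, not the linked application): fit the trend as an exogenous regressor with AR(1) errors so the reported standard error of the trend reflects the autocorrelation.
```python
# Synthetic example: linear trend + AR(1) noise, fitted with statsmodels SARIMAX
# so that the trend's standard error accounts for autocorrelated residuals.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 360                                # e.g. 30 years of monthly values
t = np.arange(n)
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.7 * noise[i - 1] + rng.normal(scale=2.0)
gmsl = 0.3 * t + noise                 # known trend of 0.3 per time step

fit = sm.tsa.SARIMAX(gmsl, exog=sm.add_constant(t), order=(1, 0, 0)).fit(disp=False)
print(fit.params)   # includes the estimated trend coefficient
print(fit.bse)      # its standard error reflects the AR(1) error structure
```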
  • asked a question related to Uncertainty
Question
2 answers
Dear colleagues
I just noticed a ResearchGate announcement about the readership of a document I am one of the authors of: « Risk, Uncertainty and Agricultural Development ». This book is described as having been edited by Frank S. Conklin, Bruce McCarl, James A. Roumasset, Jean-Marc Boussard and Inderjit Singh. Now, it is true that James A. Roumasset, Jean-Marc Boussard and Inderjit Singh are the editors. But Frank S. Conklin and Bruce McCarl have nothing to do with it (they were not even among the participants of the conference the book is the proceedings of). Please correct your files!
Relevant answer
Answer
Please note that you wrote to the ResearchGate community, not to the RG team. See "A researcher has wrongly claimed co-authorship of my publication" in https://help.researchgate.net/hc/en-us/articles/14292798510993 for instructions on how to correct this. But unlike what that help page suggests, it was most probably not the other editors themselves who wrongly claimed authorship; rather, ResearchGate's automatic algorithm wrongly identified them as co-editors and assigned this publication to their profiles as well. This is a frequent problem on ResearchGate because the algorithms do not work perfectly.
  • asked a question related to Uncertainty
Question
1 answer
Hello,
we are doing XRF analysis of PM10 aerosol samples with the PX-375. However, we calibrate the instrument with the UCDavis reference material with PM2.5 elemental loading. Does anybody know if this is fine, or how large the uncertainty is? Our PM2.5 to PM10 ratio varies between 0.5 and 0.9.
Thanks for your help.
Ivonne
Relevant answer
Answer
Hi Ivonne,
Yes, you can use these materials to analyze PM10 samples.
Best,
Sinan
  • asked a question related to Uncertainty
Question
6 answers
Hello,
We need some debate about the role of statistics in scientific confirmation.
Thanks
Dr. F CHELLAI
Relevant answer
Answer
"Statistics" is simply how we interpret data results that we received. True there are people who falsify data, or more likely (a question I often received) "Can't you make that chart look any better" (my answer was no).
We can infer social and cultural results. A company I worked for used "Behavior Based Safety" to collect observations of conditions and people doing work to see what risks we had. I've run many employee surveys that gave the leadership team insight into culture. For example, we found the question "Senior management (above my manager) vists my workplace" with options of Strongly Agree, Agree, Neutral, Disagree, and Strongly Disagree to be a bell-weather question. If that question was trending to the negative, we knew to start asking questions. True, some Vice Presidents said "But I always visit the workplace" - but if the workers' perception was that they were no - then that was important.
There is an interesting book out there "How to measure anything".
As a counter, I often in class would ask - "I love my spouse more than you love your spouse / significant other - Now prove or disprove that with data". Of course that is a silly question - but does bring up a good discussion.
  • asked a question related to Uncertainty
Question
1 answer
Hello, I'm seeking clarification on the selection of suitable boundary conditions for simulating shear deformation of a screw dislocation using LAMMPS. In my script, I currently employ the following commands:
```lammps
# zero the forces on the rigid boundary groups
fix 1 upper setforce 0.0 0.0 0.0
fix 2 lower setforce 0.0 0.0 0.0
# intended to displace the upper/lower groups along y at +/- the strain rate
fix 3 upper move NULL ${strainrate} NULL
fix 4 lower move NULL -${strainrate} NULL
# time integration applied to the mobile (middle) group only
fix 5 mobile nve
```
I have several uncertainties:
1. Should I fix all three degrees of freedom in fix 1 & 2 for shear deformation in the Y direction, or are specific degrees of freedom recommended?
2. In fix 3 & 4, should I use NULL or 0.0?
3. Should fix 5 be applied to only middle atoms or to all atoms?
Any insights would be greatly appreciated!
Relevant answer
Answer
1. The direction of the dislocation line should not have an f (fixed) boundary condition; I use p (periodic) in that direction.
2. It should be zero, not NULL, because NULL will do a time integration for that direction, which is unnecessary.
3. Fix 5 should be applied to the middle atoms only.
Plus: you are missing one thermal layer (typically fixed layer, thermal layer and mobile layer) if you use this model.
  • asked a question related to Uncertainty
Question
8 answers
I believe temperature T, pressure P and volume V are all measurable quantities. Entropy S is not measurable. Further, there is an uncertainty relation associated with temperature and energy. That is, temperature can only be measured by bringing a system into contact with another system of known temperature. However, energy can only be measured in an isolated system.
Relevant answer
Answer
Perhaps, one way to meaningfully formalize the question is to start by postulating that temperature T, pressure P and volume V of any system are measurable using available metrology. Then, one possible formalization of the question is if the internal energy can be measured by measuring T, P or V of the original system or by measuring T, P or V of the result of a certain controlled interface (adiabatic mixing? mixing in a fixed volume?) of the original system with another, specially prepared system.
  • asked a question related to Uncertainty
Question
6 answers
Fear, interest rates and uncertainty are rising and markets are under pressure - what next?
Relevant answer
Answer
Am in agreement with Renzo Bianchi
  • asked a question related to Uncertainty
Question
26 answers
There are many kinds of certainty in the world, but there is only one kind of uncertainty.
I: We can think of all mathematical arguments as "causal" arguments, where everything behaves deterministically*. Mathematical causality can be divided into two categories**. The first type, structural causality, is determined by static relations such as logical, geometric, algebraic ones, etc. For example, "∵ A>B, B>C; ∴ A>C"; "∵ the radius is R; ∴ the perimeter = 2πR"; "∵ x^2=1; ∴ x1=1, x2=−1"; ... The second category, behavioral causality, is the process of motion of a system described by differential equations, such as the wave equation ∂²u/∂t² − a²Δu = 0.
II: In the physical world, physics is mathematics, and defined mathematical relationships determine physical causality. Any "physical process" must be a parameter of time and space, which is the essential difference between physical and mathematical causality. Equations such as Coulomb's law F=q1*q2/r^2 cannot be a description of a microscopic interaction process because they do not contain differential terms. Abstracted "forces" are not fundamental quantities describing the interaction. Equations such as the blackbody radiation law and Ohm's law are statistical laws and do not describe microscopic processes.
The objects analyzed by physics, no matter how microscopic†, are definite systems of energy-momentum, are interactions between systems of energy-momentum, and can be analyzed in terms of energy-momentum. The process of maintaining conservation of energy-momentum is equal to the process of maintaining causality.
III: Mathematically a probabilistic event can be any distribution, depending on the mandatory definitions and derivations. However, there can only be one true probabilistic event in physics that exists theoretically, i.e., an equal probability distribution with complete randomness. If unequal probabilities exist, then we need to ask what causes them. This introduces the problem of causality and negates randomness. Bohr said "The probability function obeys an equation of motion as did the co-ordinates in Newtonian mechanics "[1]. So, Weinberg said of the Copenhagen rules, "The real difficulty is that it is also deterministic, or more precisely, that it combines a probabilistic interpretation with deterministic dynamics" [2].
IV: The wave function in quantum mechanics describes a deterministically evolving energy-momentum system [3]. The behavior of the wave function follows the Hamiltonian principle [4] and is strictly an energy-momentum evolution process***. However, the Copenhagen School interpreted the wave function as "probabilistic" in nature [23]. Bohr rejected Einstein's insistence on causality, replacing it with his own invention, "complementarity" [5].
Schrödinger ascribed to the waves that he regards as the carriers of atomic processes a reality of the same kind that light waves possess, by using the de Broglie procedure; he attempts "to construct wave packets (wave parcels) that have relatively small dimensions in all directions," and which can obviously represent the moving corpuscle directly [4][6].
Born and Heisenberg believe that an exact representation of processes in space and time is quite impossible and that one must then content oneself with presenting the relations between the observed quantities, which can only be interpreted as properties of the motions in the limiting classical cases [6]. Heisenberg, in contrast to Bohr, believed that the wave equation gave a causal, albeit probabilistic description of the free electron in configuration space [1].
The wave function itself is a function of time and space, and if the "wave-function collapse" at the time of measurement is probabilistic evolution, with instantaneous nature, [3], neither time (Δt=0) nor spatial transition is required. then it is in conflict not only with the Special Relativity, but also with the Uncertainty Principle. Because the wave function represents some definite energy and momentum, which appear to be infinite when required to follow the Uncertainty Principle [7], ΔE*Δt>h and ΔP*Δx>h.
V: We must also be mindful of the amount of information carried by a completely random event. From a quantum measurement point of view, it is infinite, since a true probabilistic event going from a completely unknown state A before the measurement to a completely determined state B after the measurement has no information whatsoever to base itself on‡.
VI: The Uncertainty Principle originated in Heisenberg's analysis of x-ray microscopy [8], and its mathematical derivation comes from the Fourier Transform [8][10]. E and t, P and x, are two pairs of conjugate (non-commuting) quantities [11]. The interpretation of the Uncertainty Principle has long been debated [7][9]: "Either the color of the light is measured precisely or the time of arrival of the light is measured precisely." This choice also puzzled Einstein [12], but because of its great convenience as an explanatory "tool", physics has extended it to the "generalized uncertainty principle" [13].
Is this tool not misused? Take for example a time-domain pulsed signal of width τ, which has a Stretch (Scaling Theorem) property with the frequency-domain Fourier transform [14], and a bandwidth in the frequency domain B ≈ 1/τ. This is the equivalent of the uncertainty relation¶, where the width in the time domain is inversely proportional to the width in the frequency domain. However, this relation is fixed for a definite pulse object, i.e., both τ and B are constant, and there is no problem of inaccuracy.
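(A small numerical illustration of this scaling, using an assumed rectangular pulse: for each fixed τ the spectral width is a fixed, definite number, inversely proportional to τ.)
```python
# Rectangular pulses of width tau: the half-maximum width of the spectrum's main
# lobe scales as ~1/tau, and for a fixed tau it is a fixed, perfectly definite number.
import numpy as np

fs = 1000.0                       # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)   # 1 s record

for tau in (0.05, 0.2):           # pulse widths in seconds
    pulse = ((t >= 0.4) & (t < 0.4 + tau)).astype(float)
    spectrum = np.abs(np.fft.rfft(pulse))
    freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
    bw = freqs[spectrum >= spectrum.max() / 2].max()   # half-maximum bandwidth
    print(f"tau = {tau:4.2f} s  ->  half-maximum bandwidth ~ {bw:4.1f} Hz")
```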
In physics, the uncertainty principle is usually explained in terms of single-slit diffraction [15]. Assuming that the width of the single slit is d, the distribution width (range) of the interference fringes can be analyzed when d is different. Describing the relationship between P and d in this way is equivalent to analyzing the forced interaction that occurs between the incident particle and d. The analysis of such experimental results is consistent with the Fourier transform. But for a fixed d, the distribution does not have any uncertainty. This situation is confirmed experimentally, "We are not free to trade off accuracy in the one at the expense of the other."[16].
The usual doubt lies in the diffraction distribution that appears when a single photon or a single electron is diffracted. This does look like a probabilistic event. But the probabilistic interpretation actually negates the Fourier transform process. If we consider a single particle as a wave packet with a phase parameter, and the phase is statistical when it encounters a single slit, then we can explain the "randomness" of the position of a single photon or a single electron on the screen without violating the Fourier transform at any time. This interpretation is similar to de Broglie's interpretation [17], which is in fact equivalent to Bohr's interpretation [18][19]. Considering the causal conflict of the probabilistic interpretation, the phase interpretation is more rational.
VII. The uncertainty principle is a "passive" principle, not an "active" principle. As long as the object is certain, it has a determinate expression. Everything is where it is expected to be, not this time in this place, but next time in another place.
Our problems are:
1) At observable level, energy-momentum conservation (that is, causality) is never broken. So, is it an active norm, or just a phenomenon?
2) Why is there a "probability" in the measurement process (wave packet collapse) [3]?
3) Does the probabilistic interpretation of the wave function conflict with the uncertainty principle? How can this be resolved?
4) Is the Uncertainty Principle indeed uncertain?
------------------------------------------------------------------------------
Notes:
* Determinism here is determinism in a narrow sense, only for localized events. My personal attitude towards determinism in the broad sense (without distinguishing predictability from Fatalism; see [20] for a specialized analysis) is negative. Because: 1) we must note that complete prediction of all states depends on complete boundary conditions and initial conditions. Since all things are correlated, as soon as any kind of infinity exists, such as the spacetime scale of the universe, the possibility of obtaining all boundary conditions is completely lost. 2) The physical equations of the upper levels can collapse by entering a singularity (undergoing a phase transition), which can lead to unpredictable results.
** Personal, non-professional opinion.
*** Energy conservation of independent wave functions is unquestionable, and it is debatable whether the interactions at the time of measurement obey local energy conservation [21].
† This is precisely the meaning of the Planck Constant h, the smallest unit of action. h itself is a constant of magnitude Js. For the photon, when h is coupled to time (frequency) and space (wavelength), there is energy E = hν,momentum P = h/λ.
‡ Thus, if a theory is to be based on "information", then it must completely reject the probabilistic interpretation of the wave function.
¶ In the field of signal analysis, this is also referred to by some as "The Uncertainty Principle", ΔxΔk=4π [22].
------------------------------------------------------------------------------
References
[1] Faye, J. (2019). "Copenhagen Interpretation of Quantum Mechanics." The Stanford Encyclopedia of Philosophy from <https://plato.stanford.edu/archives/win2019/entries/qm-copenhagen/>.
[2] Weinberg, S. (2020). Dreams of a Final Theory, Hunan Science and Technology Press.
[3] Bassi, A., K. Lochan, S. Satin, T. P. Singh and H. Ulbricht (2013). "Models of wave-function collapse, underlying theories, and experimental tests." Reviews of Modern Physics 85(2): 471.
[4] Schrödinger, E. (1926). "An Undulatory Theory of the Mechanics of Atoms and Molecules." Physical Review 28(6): 1049-1070.
[5] Bohr, N. (1937). "Causality and complementarity." Philosophy of Science 4(3): 289-298.
[6] Born, M. (1926). "Quantum mechanics of collision processes." Uspekhi Fizich.
[7] Busch, P., T. Heinonen and P. Lahti (2007). "Heisenberg's uncertainty principle." Physics Reports 452(6): 155-176.
[8] Heisenberg, W. (1927). "Principle of indeterminacy." Z. Physik 43: 172-198. The original paper of the "uncertainty principle".
[9] https://plato.stanford.edu/archives/sum2023/entries/qt-uncertainty/; a more detailed historical introduction to the uncertainty principle, including various representative viewpoints.
[10] Brown, L. M., A. Pais and B. Poppard (1995). Twentieth Centure Physics(I), Science Press.
[11] Dirac, P. A. M. (2017). The Principles of Quantum Mechanics, China Machine Press.
[12] Pais, A. (1982). The Science and Life of Albert Einstein I
[13] Tawfik, A. N. and A. M. Diab (2015). "A review of the generalized uncertainty principle." Reports on Progress in Physics 78(12): 126001.
[15] Zeng Jinyan (曾谨言) (2013). Quantum Mechanics (QM), Science Press.
[16] Williams, B. G. (1984). "Compton scattering and Heisenberg's microscope revisited." American Journal of Physics 52(5): 425-430.
Hofer, W. A. (2012). "Heisenberg, uncertainty, and the scanning tunneling microscope." Frontiers of Physics 7(2): 218-222.
Prasad, N. and C. Roychoudhuri (2011). "Microscope and spectroscope results are not limited by Heisenberg's Uncertainty Principle!" Proceedings of SPIE-The International Society for Optical Engineering 8121.
[17] De Broglie, L. and J. A. E. Silva (1968). "Interpretation of a Recent Experiment on Interference of Photon Beams." Physical Review 172(5): 1284-1285.
[18] Cushing, J. T. (1994). Quantum mechanics: historical contingency and the Copenhagen hegemony, University of Chicago Press.
[19] Saunders, S. (2005). "Complementarity and scientific rationality." Foundations of Physics 35: 417-447.
[21] Carroll, S. M. and J. Lodman (2021). "Energy non-conservation in quantum mechanics." Foundations of Physics 51(4): 83.
[23] Born, M. (1955). "Statistical Interpretation of Quantum Mechanics." Science 122(3172): 675-679.
=========================================================
Relevant answer
Answer
Dear Chian Fan
Thank you for your answer. However, when so many certainties as you state are already established in someone's worldview, I know of no way that they can be reconsidered. So I will not try to explain or argue.
The available alternate possibilities and possible solutions that electromagnetism put at our disposal all are available anyway for anybody interested in my articles, with direct references to all historical formal sources, all available on the internet, given that they all are now in the public domain.
You wrote: "The main purpose of this discussion is to have the Copenhagen Interpretation revisited. It gave quantum mechanics a grounding in the last century, but not a solid foundation, and may have hindered, restricted, or even misled physics today."
I completely agree that the Copenhagen interpretation must be gotten rid of.
This is what I have been working at for the past 25 years.
The Copenhagen interpretation has been the scourge of the past hundred years in physics, and it contributed absolutely nothing other than endless arguments and waste of time for mainstream in lieu of fundamental research, combined with general disregard and disappearing of any referencing to the historical foundational discoveries that underlie real physics.
I observed that the equations of QM always were fine as they were initially conceived and owe absolutely nothing to the Copenhagen interpretation. The only issue is that they have been disconnected from their historical classical formal grounding foundations by the Copenhagen interpretation, thus preventing further progress from their states established 100 years ago.
I expect that this will be remedied by the upcoming generation. My contribution is the set of historical formal references that I located over the past decades that they can lean on for the purpose.
Best Regards, André
  • asked a question related to Uncertainty
Question
3 answers
According to Shannon's definition, entropy measures information, choice, and uncertainty. Then, does negative differential entropy imply small uncertainty, less choice, and little information?
Relevant answer
Answer
### What is Differential Entropy?
First, let's understand what differential entropy is. Traditional entropy, as defined by Shannon for discrete random variables, is always non-negative. It measures the average unpredictability of a random variable. The more unpredictable it is, the higher its entropy.
However, when we move from discrete random variables (like flipping a coin) to continuous random variables (like measuring someone's height), we use differential entropy. Unlike the traditional entropy, differential entropy can be negative.
### Why Can Differential Entropy Be Negative?
Imagine you have a box, and inside this box, you have a certain amount of "information" or "uncertainty." In the discrete world, you can't have less than an empty box (0). But in the continuous world, it's like you can have a box that goes "underground" or "below the floor level." This "underground" space represents the negative values of differential entropy.
### What Does Negative Differential Entropy Mean?
Now, to the heart of the question: What does it mean when the box goes "underground"?
1. **Small Uncertainty**: A negative differential entropy doesn't necessarily mean there's "negative uncertainty" (because uncertainty can't be negative). Instead, it's a relative measure. Compared to a reference distribution (like the Gaussian distribution), a negative value indicates that the distribution in question is "more certain" or "less spread out."
2. **Less Choice**: In the context of information theory, "choice" refers to the number of possible outcomes or the spread of a distribution. A negative differential entropy suggests that the distribution is more "peaked" or "concentrated" around certain values, implying fewer choices or less variability.
3. **Little Information**: Information, in this context, refers to the unpredictability or randomness of outcomes. A more negative differential entropy means the outcomes are more predictable, and thus, there's less new information to be gained from observations.
### In Simple Terms:
Imagine you're trying to guess the weight of a random apple from a basket. If almost all apples weigh the same (say, around 150 grams), then your guess will likely be close, and there's little uncertainty or surprise. This situation can be represented by a negative differential entropy. On the other hand, if the weights of apples vary a lot, it's harder to guess, and there's more uncertainty, leading to a higher entropy value.
So, negative differential entropy doesn't mean "negative information" or "negative uncertainty." Instead, it's a way to say that, relative to some reference, the continuous random variable in question is more predictable, concentrated, and offers less new information.
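A tiny numeric illustration of the point above, assuming a Gaussian: the differential entropy is h = ½ ln(2πeσ²), which turns negative once σ drops below 1/√(2πe) ≈ 0.242, i.e. exactly when the density is tightly concentrated and observations carry little new information.
```python
# Differential entropy of a normal distribution, h = 0.5 * ln(2*pi*e*sigma^2):
# negative whenever sigma < 1/sqrt(2*pi*e) ~ 0.242.
import math

def gaussian_differential_entropy(sigma):
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

for sigma in (1.0, 0.5, 0.1):
    print(f"sigma = {sigma:4.2f}  ->  h = {gaussian_differential_entropy(sigma):+.3f} nats")
```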
  • asked a question related to Uncertainty
Question
2 answers
I am curious about the latest AI and ML models or methods that are being utilized to manage uncertainties and risks in Supply Chain Management. I am interested in how these models identify, evaluate, and mitigate risks. Any examples of industries where these models have been particularly successful would be beneficial.
Relevant answer
Answer
Ahmad Al Khraisat "Profound thanks for your insights; your synthesis of epistemological nuances and empirical methodologies has significantly enriched the discourse here."
  • asked a question related to Uncertainty
Question
2 answers
Hi all,
Can you tell us how COVID 19 has triggered most of the innovative processes in the public and private sectors during this period of uncertainty? and from the non-profit sector? in your country?
Relevant answer
Answer
Here are some ways in which the COVID-19 crisis has spurred innovation:
  1. Telemedicine and Digital Health Solutions: The need for remote healthcare services during the pandemic led to a rapid expansion of telemedicine and digital health platforms. Virtual consultations, remote monitoring devices, and telehealth solutions became essential tools for delivering healthcare services while minimizing physical contact.
  2. E-Learning and Remote Education: With schools and universities closed during lockdowns, there was a significant surge in e-learning platforms and remote education solutions. Educational institutions and edtech companies developed innovative ways to deliver online classes and interactive learning experiences.
  3. Work from Home and Virtual Collaboration: The shift to remote work prompted the adoption of virtual collaboration tools and platforms. Video conferencing, online project management, and cloud-based collaboration tools became essential for maintaining productivity and communication.
  4. Vaccine Development and Distribution: The urgency to combat the pandemic accelerated vaccine development and approval processes. Collaborative efforts between pharmaceutical companies, researchers, and governments led to the rapid development and distribution of COVID-19 vaccines.
  5. Supply Chain and Logistics Innovations: The pandemic exposed vulnerabilities in global supply chains. Companies and governments sought to enhance supply chain resilience through innovations in logistics, inventory management, and demand forecasting.
  6. Digital Payments and Contactless Services: The fear of virus transmission through physical currency led to an increased adoption of digital payment methods and contactless services in retail and banking sectors.
  7. Remote Entertainment and Streaming Services: The entertainment industry saw a surge in demand for streaming services, online gaming, and virtual events as people sought entertainment options at home.
  8. Personal Protective Equipment (PPE) Innovation: There was a rapid development of new PPE designs and materials to meet the increased demand for protective gear for healthcare workers and the general public.
  9. Data Analytics and Modeling: Data analytics and modeling played a crucial role in tracking the spread of the virus, predicting hotspots, and informing public health measures.
  10. AI and Automation in Healthcare: AI and automation technologies were utilized in healthcare settings for tasks like patient monitoring, diagnosis, and drug discovery.
  11. Community Initiatives and Social Innovation: Community-led initiatives and social innovation played a significant role in addressing various challenges posed by the pandemic, including food distribution, mental health support, and community solidarity.
  12. Hygiene and Sanitization Innovations: Various hygiene and sanitization innovations, such as touchless technology, UV sterilization devices, and antimicrobial materials, emerged to mitigate the spread of the virus.
  • asked a question related to Uncertainty
Question
1 answer
Can anyone suggest to me
1. "how to calculate uncertainty for Redlich-Kister coefficients?"
2. Uncertainties of the limiting partial molar properties can be calculated from the uncertainties of the Ai (Redlih-Kister) parameters
Relevant answer
Answer
You can use a free program, such as Scidavis
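Beyond a GUI tool, an ordinary least-squares fit also gives the coefficient uncertainties directly from its covariance matrix. A hedged sketch with made-up data, using the model Y^E/(x1·x2) = Σ A_i (x1 − x2)^i:
```python
# Made-up excess-property data; fit Redlich-Kister coefficients and report
# their standard uncertainties from the least-squares covariance matrix.
import numpy as np

x1 = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
x2 = 1.0 - x1
YE = np.array([-55.0, -98.0, -128.0, -146.0, -151.0, -144.0, -124.0, -92.0, -50.0])

z = x1 - x2
y = YE / (x1 * x2)

coef, cov = np.polyfit(z, y, deg=2, cov=True)   # degree 2 -> returns A2, A1, A0 (highest power first)
unc = np.sqrt(np.diag(cov))
for name, a, u in zip(("A0", "A1", "A2"), coef[::-1], unc[::-1]):
    print(f"{name} = {a:8.2f} +/- {u:5.2f}")
```
If useful, the limiting partial molar excess properties follow as Σ A_i(−1)^i (for component 1) and Σ A_i (for component 2), so their uncertainties can be propagated from the same covariance matrix.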
  • asked a question related to Uncertainty
Question
8 answers
Will academics EVER stop anthropomorphizing "probabilistic uncertainty"? It is something "seen" in "findings" AND (to say the least) not SOME THING. It may well be mainly connected to poor observations OR very preliminary "discoveries". Do people really believe that probabilistic uncertainty can be hard-wired?? Unless you have evidence in real and appropriate actual contexts, such as a naturalist could SEE, OR at least AS seen sometime(s) in ontogeny with DIRECT OVERT EVIDENCE, then [otherwise]: STOP IT, STOP! Understand?
Relevant answer
Answer
OK. We tried.
  • asked a question related to Uncertainty
Question
30 answers
Every organization that strives to survive, to develop and to be sustainable must be ready to face all the challenges that today's turbulent and uncertain times carry with them. Organizations of all types and sizes are faced with external and internal factors and influences that make it uncertain whether they will achieve their objectives. The almost unimaginable pace of technical and technological progress, the dramatic acceleration of changes in all spheres of life, and the general feeling of uncertainty raise the question: to what extent is prevention still really possible at all?
Relevant answer
Answer
Today, "prevention by design" is necessary. When creating a product or designing a service, it is necessary to assess possible risks and prevent them immediately. Example: packing products in environmentally friendly packaging or developing operating systems without security flaws or generously rewarding employees so they don't reveal professional secrets....
  • asked a question related to Uncertainty
Question
4 answers
The statement inquires about the potential mathematical relationship between entropy and standard deviation. Entropy and standard deviation are both concepts used in statistics and information theory.
Entropy is a measure of uncertainty or randomness in a probability distribution. It quantifies the average amount of information required to describe an event or a set of outcomes. It is commonly used in the field of information theory to assess the efficiency of data compression algorithms or to analyze the randomness of data.
On the other hand, standard deviation is a statistical measure that quantifies the dispersion or variability of a set of data points. It provides information about the average distance of data points from the mean or central value. It is widely used in data analysis to understand the spread of data and to compare the variability among different datasets.
While entropy and standard deviation are both statistical measures, they capture different aspects of data. Entropy focuses on the uncertainty or information content, while standard deviation focuses on the dispersion or variability. As such, there is no direct mathematical relationship between entropy and standard deviation.
However, depending on the specific context and the nature of the data, there might be some indirect connections or relationships between entropy and standard deviation. For instance, in certain probability distributions, higher entropy might be associated with higher variability or larger standard deviation, but this relationship is not universally applicable.
In summary, while entropy and standard deviation are both important statistical measures, they serve different purposes and do not have a direct mathematical relationship. The relationship between them, if any, would depend on the specific characteristics of the data being analyzed.
Relevant answer
Answer
I appreciate your reply, and it has provided me with a clear understanding of the connection between entropy and standard deviation.
  • asked a question related to Uncertainty
Question
4 answers
Artificial Intelligence in Petroleum Engineering
1. Whether AI alone - would be able to mimic a real oil/gas field production scenario - in the absence of reservoir simulation?
2. What would be the various sources of 'uncertainty' that would be associated with an AI technique?
3. How would AI consider ‘measurement uncertainties’ (errors in measurement of state variables associated with reservoir rock properties and rock-fluid interaction properties) as well as ‘structural uncertainties’ (errors associated with the mathematical representation of actual draining principles of hydrocarbons) "explicitly"; along with ‘parametric uncertainty’ (resulting from the coupled effect of both measurement as well as structural uncertainties)?
4. How would AI do justice – particularly with the very limited data set associated with ‘reservoir permeability’?
5. Whether AI would reasonably support a hydrocarbon reservoir with a sparse, heterogeneous, anisotropic, multi-phase, multi-dimensional data?
6. Whether AI would be supplied with the best ever algorithms for coping with all kinds of uncertainties – by explicitly segregating the various forms of uncertainties by an efficient data training?
7. Whether AI-powered robots will be able to detect the oil seeps (in deep sea) efficiently by mitigating the exploration risk while lessening the harms to marine life?
8. Whether AI has the ability to forecast ‘well collapses’ – well before its occurrence?
To what extent would 'downtime' be expected to be reduced upon introducing a 'traffic light system'?
9. To what extent, the concept of ‘digital twins’ remains efficient in addressing the challenges associated with the hydrocarbon industry?
10. To what extent would AI remain helpful - by efficiently recognizing patterns through deep learning - towards averting the catastrophes associated with HSE?
Relevant answer
Answer
International Journal of Artificial Intelligence in Education (2000), 11, 122-143
The roles of models in Artificial Intelligence and Education research: a prospective view
Michael Baker, GRIC-COAST, CNRS & Université Lumière Lyon 2, 5 avenue Pierre Mendès-France, 69676 Bron Cedex, France, email: mbaker@univ-lyon2.fr
Abstract. In this paper I speculate on the near future of research in Artificial Intelligence and Education (AIED), on the basis of three uses of models of educational processes: models as scientific tools, models as components of educational artefacts, and models as bases for design of educational artefacts. In terms of the first role, I claim that the recent shift towards studying collaborative learning situations needs to be accompanied by an evolution of the types of theories and models that are used, beyond computational models of individual cognition. In terms of the second role, I propose that in order to integrate computer-based learning systems into schools, we need to 'open up' the curriculum to educational technology, 'open up' educational technologies to actors in educational systems and 'open up' those actors to the technology (i.e. by training them). In terms of the third role, I propose that models can be bases for design of educational technologies by providing design methodologies and system components, or by constraining the range of tools that are available for learners. In conclusion I propose that a defining characteristic of AIED research is that it is, or should be, concerned with all three roles of models, to a greater or lesser extent in each case.
"When we think of the world's future, we always mean the destination it will reach if it keeps going in the direction we can see it going in now; it does not occur to us that its path is not a straight line but a curve, constantly changing direction." Wittgenstein (1980), pp. 3/3e
1. INTRODUCTION
If, as some anthropologically-minded archaeologists would claim, the present is the key to the past, then perhaps the future is the key to the present? In this paper I assume the converse - that the present and the past are keys to the future - for the case of research in the field of Artificial Intelligence and/in Education (henceforth abbreviated to "AIED").
Any view of what objectives a research field may achieve in the future must be based on a view of the nature of the field in question, up to the present day. I characterise the past, the present and the near future of AIED research in terms of a combination of different roles played by models of educational processes, namely: models as scientific tools, models as components of educational artefacts, and models as bases for design of educational artefacts.
It should be noted that the views expressed here are not those of an objective historian of science, but rather of a researcher engaged in the field that is being discussed. In that case, description, prediction and prescription coincide to a certain extent.
One could say that there are basically three sorts of argumentative texts: those that argue (mostly)
  • asked a question related to Uncertainty
Question
3 answers
I calculated the uncertainty for all the elements and ions according to the EPA guideline. However, I am stuck with the PM2.5 concentration uncertainty. The EPA dataset from Baltimore estimates the uncertainty of PM2.5 by dividing the concentration by 10, but in the St. Louis data the value appears to be the concentration multiplied by 12.
So, how can I calculate the uncertainty for the PM2.5 data?
Relevant answer
Answer
I am almost sure the St. Louis dataset has the uncertainty and concentration columns switched. A factor of 12 between the two columns then corresponds to a relative uncertainty of roughly 1/12 of the concentration, i.e. about 8%.
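As a rough check, one can compare the two interpretations directly. The snippet below is only a sketch with invented numbers (not real EPA data), assuming the Baltimore-style convention that the reported PM2.5 uncertainty is a fixed fraction of the concentration.

```python
# Sketch: sanity-checking whether concentration and uncertainty columns were swapped
# (all numbers are invented for illustration, not real EPA data).
labeled_conc = [1.03, 0.72, 1.26, 1.86]        # suspiciously small "concentrations"
labeled_unc = [12.4, 8.7, 15.1, 22.3]          # suspiciously large "uncertainties" (ug/m3)

ratios = [u / c for u, c in zip(labeled_unc, labeled_conc)]
print("labeled unc / labeled conc:", [round(r, 1) for r in ratios])   # ~12 -> likely swapped

# If swapped, the large column is the concentration and the small one the uncertainty,
# giving a relative uncertainty of roughly 1/12, i.e. about 8%.
conc, unc = labeled_unc, labeled_conc
rel_unc = [u / c for u, c in zip(unc, conc)]
print("relative uncertainty:", [f"{100 * r:.0f}%" for r in rel_unc])
```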
  • asked a question related to Uncertainty
Question
5 answers
In a world characterized by uncertainty and change, sustainable innovations are a key factor for the future of humanity. A new mindset and approach to innovation is needed to find sustainable solutions that meet both environmental and social needs.
What are your views, information, and suggestions?
Relevant answer
Answer
Some years back I expressed my views regarding sustainable innovation, which I submit herewith for your perusal.
Speaking of global sustainable development, I do not wish to prognosticate at our present stage, as we have all observed the workings of natural forces everywhere in the world, alongside man-made creations.
Along the same lines, looking ten years ahead, we do not know how planetary influences may act on the environment throughout the universe.
This is my personal opinion.
  • asked a question related to Uncertainty
Question
1 answer
We all know that digital drugs have negative effects on the performance of workers and students in universities and schools, some of which are psychological, social, and economic. The question is: what are the types of these drugs, and what are the criteria on the basis of which a decision maker can judge each of these types under uncertainty?
Relevant answer
Answer
Hi! Alaa Alden.
Digital drugs, also known as binaural beats or brainwave entrainment, are audio tracks designed to induce specific states of mind or alter consciousness. They are typically used for relaxation, meditation, sleep enhancement, focus improvement, or mood enhancement. When it comes to evaluating these types of digital drugs using Fuzzy Multiple Criteria Decision Making (MCDM), decision-makers can adopt the following criteria:
  1. Effectiveness: This criterion measures the effectiveness of the digital drug in achieving the desired outcomes. It can be assessed by considering factors such as user reviews, scientific studies, and empirical evidence of the drug's effectiveness.
  2. Safety: Safety is a crucial criterion that evaluates the potential risks and side effects associated with using the digital drug. Decision-makers should consider the possible adverse effects, contraindications, and long-term consequences.
  3. Ease of Use: This criterion assesses the usability and accessibility of the digital drug. Factors such as user-friendliness, compatibility with different devices or platforms, and ease of integration into daily routines should be considered.
  4. Quality of Audio: The quality of the audio tracks is an important criterion for evaluating digital drugs. Decision-makers can consider factors such as sound clarity, production quality, and overall listening experience.
  5. Customizability: Customizability refers to the flexibility and adaptability of the digital drug. Decision-makers can assess the availability of different frequency ranges, adjustable parameters, and the ability to tailor the experience to individual preferences.
  6. Scientific Validity: This criterion evaluates the scientific basis and credibility of the digital drug. Decision-makers can consider whether the claims made by the drug's developers are supported by scientific research and if the drug follows established principles of brainwave entrainment.
  7. Cost: Cost is an important practical criterion for decision-makers. It assesses the affordability and value for money of the digital drug, considering factors such as price, available subscription plans, and additional features or bonuses.
  8. User Feedback: Decision-makers can also consider user feedback and ratings from individuals who have already used the digital drug. This criterion provides insights into the real-world experiences and satisfaction levels of users.
By adopting fuzzy MCDM techniques, decision-makers can assign linguistic variables and fuzzy numbers to each criterion and use fuzzy logic to handle uncertainty and vagueness in the evaluation process. Fuzzy MCDM methods such as the Fuzzy Analytic Hierarchy Process (F-AHP), the Fuzzy Technique for Order of Preference by Similarity to Ideal Solution (F-TOPSIS), Bipolar Fuzzy TOPSIS (BF-TOPSIS), or Bipolar Fuzzy ELECTRE-I (BF-ELECTRE-I) can then be applied to calculate an overall ranking or score for the different types of digital drugs against the selected criteria, as sketched below. I would particularly recommend BF-TOPSIS and BF-ELECTRE-I, since bipolar fuzzy models capture both the positive and negative effects of each alternative, which suits medically oriented evaluations.
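To make the mechanics concrete, here is a minimal fuzzy TOPSIS sketch using triangular fuzzy numbers and the vertex distance (in the spirit of Chen's 2000 formulation). The linguistic scale, the three criteria, the weights, and the alternative ratings are all hypothetical placeholders, not a validated evaluation of any real product.

```python
# Minimal fuzzy TOPSIS sketch (triangular fuzzy numbers, vertex distance).
# All criteria, weights, and ratings below are hypothetical placeholders.
import math

# Linguistic magnitude scale -> triangular fuzzy number (l, m, u)
SCALE = {"very low": (1, 2, 3), "low": (2, 4, 6), "medium": (4, 6, 8),
         "high": (6, 8, 10), "very high": (8, 9, 10)}

criteria = ["effectiveness", "safety", "cost"]
benefit = {"effectiveness": True, "safety": True, "cost": False}   # cost: lower is better
weights = {"effectiveness": (0.5, 0.7, 0.9),
           "safety": (0.7, 0.9, 1.0),
           "cost": (0.3, 0.5, 0.7)}

ratings = {   # how much of each attribute an alternative exhibits
    "A1": {"effectiveness": "high", "safety": "medium", "cost": "low"},
    "A2": {"effectiveness": "very high", "safety": "high", "cost": "high"},
    "A3": {"effectiveness": "medium", "safety": "very high", "cost": "medium"},
}

def dist(x, y):
    """Vertex distance between two triangular fuzzy numbers."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / 3)

fuzzy = {a: {c: SCALE[r[c]] for c in criteria} for a, r in ratings.items()}
cc = {}
for a, row in fuzzy.items():
    d_plus = d_minus = 0.0
    for c in criteria:
        l, m, u = row[c]
        if benefit[c]:                                    # normalise benefit criteria
            u_star = max(f[c][2] for f in fuzzy.values())
            norm = (l / u_star, m / u_star, u / u_star)
        else:                                             # normalise cost criteria
            l_min = min(f[c][0] for f in fuzzy.values())
            norm = (l_min / u, l_min / m, l_min / l)
        v = tuple(n * w for n, w in zip(norm, weights[c]))  # weighted normalised TFN
        d_plus += dist(v, (1, 1, 1))                      # distance to fuzzy ideal
        d_minus += dist(v, (0, 0, 0))                     # distance to fuzzy anti-ideal
    cc[a] = d_minus / (d_plus + d_minus)                  # closeness coefficient

for a, score in sorted(cc.items(), key=lambda kv: -kv[1]):
    print(a, round(score, 3))
```

The same decision matrix could in principle be extended to BF-TOPSIS or BF-ELECTRE-I by additionally encoding negative (side-effect) memberships for each criterion; only the aggregation and ranking step changes.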
I hope this answer helps you.
  • asked a question related to Uncertainty
Question
1 answer
The Heisenberg Uncertainty Principle has been found applicable only to atomic systems, while the Quantum Theory of the Uncertainty Principle of Integral Space has been found suitable for all systems of the Universe and Nature. That is why it has been pronounced a "Quantum Theory of Everything", which was the dream of many pioneers including Albert Einstein, Niels Bohr, Schrödinger, Tesla, and others.
For details, the following three recent research papers have been uploaded.
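For reference (and independent of the claim above), the standard textbook form of the Heisenberg relation being contrasted here is

\[ \Delta x \,\Delta p \ \ge\ \frac{\hbar}{2}, \]

i.e. the product of the position and momentum spreads of a quantum state is bounded below by half the reduced Planck constant.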
Relevant answer
Answer
It has also been observed that the Heisenberg Uncertainty Principle is derivable from the Quantum Theory of the Uncertainty Principle of Integral Space. Not only this, but other pioneers' theories have also been derived from the present Integral Space Quantum Theory.
  • asked a question related to Uncertainty
Question
1 answer
It is known that the Heisenberg Uncertainty Principle is applicable only to atomic systems, but how may the dynamics of molecular, biomolecular, biochemical, biomedical, and social systems be discussed?
Recently, an "Uncertainty Principle of Integral Space" has given satisfactory results in all the above systems.
The above principle has been discussed in the form of a quantum model. Details may be seen in the attached recent research paper.
Relevant answer