Nature of Reality - Science topic
Group explores nature of reality
Questions related to Nature of Reality
In this research, I propose a visionary approach aimed at cultivating a united and forward-thinking human generation dedicated to ensuring the survival of our species. The strategy involves initiating a comprehensive, community-supported effort to instill a mindset in individuals who will serve as pioneers in the colonization of outer space. Recognizing the inadequacy of short-term planning for potential extinction events, the emphasis is on nurturing a global community mindset devoid of borders and racial distinctions.
To address the complexity of this undertaking, this research suggests a multi-study approach to fully comprehend and implement the vision. While certain steps are achievable in the coming decades, the realization of the ultimate goal relies on the commitment of future generations. The narrative acknowledges the necessity of steering the present society away from a downward trajectory, envisioning a planned human society shaped by a succession of generations. The expectation is that these future generations will either follow the guidance presented in this research or evolve it further based on emerging socio-economic and natural realities.
As an international student in Canada with a background in management and business, my perspective stems from a synthesis of diverse sources. Observing a societal decline toward a historically fortified social system, I am compelled to advocate a timely and unconventional shift in our societies. This research underscores the urgency of action to avert otherwise inevitable destruction and chaos, proposing a transformative vision for the collective future of humanity.
As we embark on this visionary exploration of cultivating a united and spacefaring generation, questions naturally arise. Will this research be useful in steering our societies towards a more sustainable future? Are there fellow researchers pursuing similar approaches, and can we join forces for a collective impact? I invite those who resonate with this vision to connect, share insights, and explore avenues of collaboration. Together, we can contribute to shaping a future that transcends boundaries and ensures the survival and prosperity of humanity.
I invite anyone to participate in an open discussion of the latest “findings” in black hole research. The motive for this thread is a set of articles that appeared in the September 2022 issue (pp. 26-51) of Scientific American magazine under the title “Black Hole Mysteries Solved”.
I have proposed a new way of thinking about Nature/Reality, the NCS (Natural Coordinate System) (https://www.researchgate.net/publication/324206515_Natural_Coordinate_System_A_new_way_of_seeing_Nature?channel=doi&linkId=5c0e3a7d299bf139c74dbe81&showFulltext=true), and I ask whether you recognize any basic distinction between the above preprint (and the Appendices that follow it) and the Scientific American articles. This thread is intended to be an open discussion forum (open with respect to both time and subject) for the latest results of black hole research, in order to advance new perspectives based on the NCS and to put the proposals of the NCS to public assessment.
In order to seed points of argument, I have picked out some phrases from the SciAm articles for comparison with phrases or references from the NCS preprint.
- “Paradox Resolved” by G. Musser. “Space looks three-dimensional but acts as if it were two-dimensional.” (p. 30) → NCS (pp. 11-13, 49-52).
  - “It says that one of the spatial dimensions we experience is not fundamental to nature but instead emerges from quantum dynamics.” (p. 31) → NCS (pp. 11-13).
  - “Meanwhile theorists think that what goes for black holes may go for the universe as a whole.” (p. 31) → NCS (pp. 31-38, 46-47).
- “Black Holes, Wormholes and Entanglement” by A. Almheiri. “The island itself becomes nonlocally mapped to the outside.” (p. 39) → NCS (pp. 44-47), https://www.researchgate.net/publication/345761430_APPENDIX_18_About_Black_Holes?channel=doi&linkId=5facf0fe299bf18c5b6a0d4d&showFulltext=true .
- “A Tale of Two Horizons” by E. Shaghoulian. The whole article is about the BH horizon, the holographic principle, the observer, and entropy → NCS (pp. 31-38, 44-47, 54-61, 6-7), https://www.researchgate.net/post/What_is_Entropy_about_Could_the_concept_of_Entropy_or_the_evaluation_of_its_magnitude_lead_us_to_the_equilibrium_state_of_a_system .
- “Portrait of a Black Hole” by S. Fletcher. The article covers the history of the observation of Sagittarius A* (the BH at the center of the Milky Way galaxy). There is no obvious connection with the NCS.
P.S. This discussion is NOT open to new “pet theories” apart from the NCS.
There are already AI machines that can sense their environment, solve problems, prove theorems, play games, make art, and even socialize with people. Should a hypothetical future machine that can do all of the above be considered to have its own consciousness? If the answer is no, how can we be sure?
The original meaning of the word "theory" comes close to "view", or even "world view". As such it was already used by the ancient Greek philosophers, e.g. Aristotle or Plato. Over the centuries, its meaning has become more and more precise, culminating in a well-defined logical notion of the correspondence between a part of the (outer) real world and the (inner) symbolic world we use to think about or describe it.
In more popular parlance, Wikipedia summarizes it in the statement: "A theory is a rational type of abstract thinking about a phenomenon, or the results of such thinking." *) Of course, what is meant by "phenomenon" (also an ancient Greek word) is typically left unspecified: it may be a very specific class of objects or events, or it may be something as big as our universe (as in "cosmological theory").
Over the years, I have observed a gradual inflation of the technical term "theory" as defined and used in scientific methodology. The (dualistic) notion of a correspondence between the real world on the one hand and the media we use to reflect about the latter (thought, language, ...) on the other hand seems to have been lost during the rise of empirical research with its strong emphasis on "phenomena" instead of "thoughts".
The result is that the technical term "theory" appears to have also lost its well-defined meaning of a bridge between our outer world "as we observe it" and our inner world "as we reason about it". For instance:
- In a recent paper (2021), the author (a well-known expert in a subfield of social science) promises to offer a theory (sic!) of a particular "phenomenon" in his subfield. As I am also much interested in the kind of phenomena he does research on, I naturally hoped to find - at the very least - a worked-out theoretical model of those phenomena.
- Far from it! Besides a simple flow chart of (some of) the processes involved, what he presented was a large collection of more or less confirmed "empirical facts" together with simple "interpretations" (mostly re-wordings) and pointers to possible or plausible relationships.
- I didn't find any sign of the hallmarks of a good theory: a worked-out theoretical model of those phenomena, on the basis of which I (or someone else) could reason about those phenomena, look for inconsistencies between assumptions and facts, derive crucial hypotheses to be tested, etc.
My questions to you:
- What are your experiences with this type of inflated use of the word "theory" in scientific research?
- Do you believe that there is a difference in this respect between social sciences and natural sciences?
- How can we bring the "empirical approach" and the "theoretical approach" together, again?
Given recent debate, I was wondering whether anyone else had read Einstein's (1936) "Physics and Reality" and - if they had - whether they found it pertinent to the debates here.
I've got a pdf of it if anyone wants it...
We have many technical issues in Mixed Reality. Which is the most important technical challenge for Mixed Reality?
Specifically I am interested in knowing if anyone has had students compare the concepts of parallel universes (multiverse) proposed by both theories?
What is consciousness? What do the latest neurology findings tell us about consciousness and what is it about a highly excitable piece of brain matter that gives rise to consciousness?
The origin of gravitation, the origin of electric charge, and the fundamental structure of physical reality have been resolved, but these facts have not yet been added to common knowledge. The structure of photons has also been resolved, and the beginning of the universe has been explained. A proper definition of what a field is and how a field behaves has been given. These facts are explained in .
This model still leaves some open questions. It does not explain the role of massive bosons or the existence of generations of fermions. The HBM also does not explain the fine details of the emission and absorption of photons, nor does it give a reason for the existence of the stochastic processes that generate the hopping paths of elementary particles. It does not explain in detail how color confinement works, or how neutral elementary particles can produce deformation. The referenced booklet treats many of these open questions in sections that carry corresponding titles.
The model suggests that we live in a purely mathematical model. This raises deep philosophical questions.
In other words, the Hilbert Book Model Project is far from complete. The target of the project was not to deliver a theory of everything. Its target was to dive deeper into the crypts of physical reality and to correct flaws that have been adopted into accepted physical theories. Examples of these flaws are the Big Bang theory, the explanation of black holes, the description of the structure of photons, and the description of the binding of elementary particles into higher-order modules.
The biggest discovery of the HBM project is the fact that it appears possible to generate a self-creating model of physical reality that after a series of steps shows astonishing resemblance to the structure and the behavior of observed physical reality.
A major result is also that all elementary particles and their conglomerates are recurrently regenerated at a very fast rate. This means that, apart from black holes, all massive objects are continuously regenerated. This conclusion attacks the roots of all currently accepted physical theories. Another result is that the generation and binding of all massive particles are controlled by stochastic processes that possess a characteristic function. Consequently, the Hilbert Book Model does not rely on the weak and strong forces that current field theories apply.
The HBM explains gravity at the level of quantum physics and thus bridges the gap between quantum theory and current gravitation theories.
The Hilbert Book Model shows that mathematicians can play a crucial role in the further development of theoretical physics. The HBM hardly affects applied physics. It does not change much in the way that observations of physical phenomena will be described.
Around 450 B.C., Zeno formulated a paradox claiming that Achilles can never reach a tortoise, because the tortoise gains additional distance during the time Achilles takes to reach its previous position.
The resolution of the paradox lies in summing an infinite sequence of decreasing time intervals. The result can be expressed as dt = dt0/(1 - v/c), where dt is the time interval for Achilles to reach the tortoise, dt0 is the time interval for Achilles to reach the tortoise's initial position, v is the velocity of the tortoise, and c is the velocity of Achilles. These considerations are very close to Einstein's reasoning about clock synchronization; the only difference is that Einstein considered the propagation of light (Achilles) in the forward and reverse directions. The main cause of the Lorentz transformation (dt = dt0/sqrt(1 - (v/c)^2)) is the invariance of Maxwell's equations under ds^2 = c^2*dt^2 - dr^2.
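The geometric-series resolution above is easy to check numerically. The sketch below uses arbitrary illustrative speeds (not values from any experiment): it sums the successive catch-up intervals, each a factor v/c shorter than the last, and compares the partial sum with the closed form dt0/(1 - v/c).

```python
# Partial sums of Zeno's catch-up intervals converge to dt0 / (1 - v/c).
# All numbers here are arbitrary choices for illustration only.
c = 10.0   # Achilles' speed
v = 4.0    # tortoise's speed
dt0 = 1.0  # time for Achilles to cover the tortoise's initial head start

total, step = 0.0, dt0
for _ in range(60):      # each catch-up leg shrinks by the ratio v/c
    total += step
    step *= v / c

closed_form = dt0 / (1.0 - v / c)
print(total, closed_form)   # the partial sum approaches the closed form
```

After 60 terms the remaining tail is of order (v/c)^60, so the sum agrees with the closed form to machine precision.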
So we have a contradiction. Experimental data assure us of the validity of Maxwell's equations and of the symmetry of space under reflection. But why not use dS = c*dt - |dr| as the invariant?
The only objection could be the loss of analyticity for such an invariant, owing to the discontinuity of the first derivative at r = 0. But this feature may in fact be supported by Coulomb's law, which can be explained by exactly such a breaking of analyticity.
If anyone can share ideas about this concrete problem, you are welcome.
From the (2002) review by Roger Ebert:
At a time when movies think they have to choose between action and ideas, Steven Spielberg's "Minority Report" is a triumph--a film that works on our minds and our emotions. It is a thriller and a human story, a movie of ideas that's also a whodunit. Here is a master filmmaker at the top of his form, working with a star, Tom Cruise, who generates complex human feelings even while playing an action hero.
See:
The opening scene, demonstrating the effectiveness of crime prevention based on the mysterious predictions of the “pre-cogs,” contrasts with the account of the predictions involving the search for a “minority report.” Though the precogs, it is said, “are never wrong,” sometimes they disagree among themselves. The hunt for the dissenting view leads on into political intrigue, which may explain our skepticism about the prediction of crimes on the part of “the usual suspects.”
This has been an issue that had concerned me for some time now. I would be grateful if you can share with me your thoughts on it.
So far as I know, psychology as a field receives equivocal respect from other scientific disciplines. It is criticized for everything from its subject matter (e.g. being a person-centered soft science) to its supposedly deficient methods (e.g. a lack of causal inference). I disagree with these claims and believe they are a symptom of the irreverence and conceitedness of our times, specifically of a culture whose feminine aspects have not yet fully come to the fore. However, I have myself become disillusioned with the field. I found it too anthropocentric (too self-focused at times), and perhaps unable to address fundamental issues the way physics can. My only solution for resuscitating my interest in the field is to pursue an understanding of consciousness, which, as Nagel would argue, may transform our view of physics and biology. I still feel confined by my discipline. What would your answer be to this question?
It is said of general relativity that it has been experimentally proven.
But what about experiments involving black holes and the recent LIGO experiments - do they really uphold GTR?
We consider three aspects of Bachelard's thought with the aim of an education in scientific thinking: some of the epistemological obstacles most frequently met in education (La formation de l’esprit scientifique, 1938); the way scientific reasoning pervades the whole experimental approach and the effort of theorization; and the complementarity between the scientific process and the literary, artistic, or philosophical process in the approach to nature. Have people forgotten the philosophy of nature in Bachelard's works? It is fitting to present the surrounding nature as a set of realities to be approached through diverse lenses: scientific, artistic, philosophical, poetic …
We (Durham University) are performing photoassociation spectroscopy on ultracold YbCs (one colour PAS). We have found a typical sequence of lines out to -dv=16 (going deeper into the well). The next line we can't find, then the next line is about 20 times broader and departs from the LeRoy-Bernstein progression. The -dv=19 also departs from the LRB fit and is about 5 times wider than the 'usual' width. Does this sound like the behaviour of an avoided crossing? I would be interested to know of similar examples.
If the cosmos was created out of nothingness in the Big Bang, what determined the size (or scale) of fundamental particles, like the proton and the electron? In the past people thought that GOD determined their scale, but this is not a scientific answer. So, what may have determined the scale of material existence?
The binary nature of the computing process was presumably inspired by the philosopher/scientist Leibniz, who was apparently familiar with the "I Ching", the Book of Changes. This ancient Chinese classic describes events using hexagrams, which have as their basis the interplay of two primary forces, the binary combination of Yin and Yang.
What, then, could be another architecture? The ancient Indian text the "Bhagavad Gita" refers to events as the interplay of not two but three forces: namely, active, passive, and neutral (or passion, ignorance, and purity). Could this ternary combination also be the basis for a computing architecture?
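As one concrete (and purely illustrative) sketch of what a three-state representation can look like in practice, here is balanced ternary, the digit system over {-1, 0, +1} used by the Soviet Setun computer. The function names below are made up for this example, not any standard API.

```python
# Balanced ternary: every integer is a sum of powers of 3 with digits
# drawn from {-1, 0, +1} -- three "forces" instead of binary's two.
def to_balanced_ternary(n: int) -> list[int]:
    """Return digits (least significant first), each in {-1, 0, 1}."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:            # write 2 as (3 - 1): emit -1 and carry one
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits

def from_balanced_ternary(digits: list[int]) -> int:
    return sum(d * 3**i for i, d in enumerate(digits))

print(to_balanced_ternary(2))   # 2 = -1 + 1*3, i.e. digits [-1, 1]
```

A pleasant property of this system is that negation is just flipping every digit's sign, so no separate sign bit is needed.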
Erik Verlinde said that his emergent gravity was constructed using insights from string theory, black hole physics, and quantum information theory (all theories that are themselves struggling to breathe). Credit to Verlinde for the daring step of constructing emergent gravity on the basis of moribund theories; we loudly take inspiration from him!
1. JUST AN EXAMPLE FROM QUANTUM MECHANICS
When curious and inquisitive people with different educational and professional backgrounds show an interest in my discovery concerning Lyapunov functions in the mathematical theory of stability, they often ask me the same question: how is the theory important to us and our social being? Perhaps, they say, it has a rather narrow range of applications, like the overwhelming majority of systems of ideas. Despite having good expertise in the theory of stability and extensive engineering and life experience, I have always found it difficult to set forth convincing arguments substantiating the fact that the concept of stability has a unique place among all other theories. Furthermore, in my personal opinion it is a core idea of our existence. You might say that this is too bold a statement. Today I will try to persuade all of you with one example.
The modern era, defined by computers and various other electronic technologies, originates in quantum mechanics, which was born in 1900 with Planck's postulate of the quantum nature of the atomic world, aimed at "preventing the ultraviolet catastrophe from happening". An alternative motivation for the creation of quantum mechanics is the need to explain the stability of the structure of the atom, namely why negatively charged electrons do not crash into positively charged nuclei. This is the most fundamental question of our existence, and it has not been answered yet, even by quantum theory. Why? Because the theory does not explain the stability of the structure of the atom; it merely postulates it. See what is, to my eye, a good explanation of this at the link below: https://www.amazon.com/gp/review/R1F8RI0MXBZQF3?ref_=glimp_1rv_cl
Everything around us, including ourselves, consists of atoms, among the basic elementary bricks of matter. Thus, if we cannot understand why atoms have many stable forms of existence and do not collapse immediately after their creation, then it is highly likely that we properly understand little about ourselves and the world we live in. In other words, we merely think we understand much more complicated formations of matter, while our understanding may be inadequate.
This simple example foregrounds two obvious and important points, namely it
• shows how easily the concept of stability can undermine our faith in the dogmas of modern knowledge;
• explains the ubiquity and universality of the property of stability in the cosmos, with the omnipresence of atoms as its building blocks spreading this intrinsic feature throughout it all.
2. THE NONPAREIL PHENOMENON OF THE PROPERTY
The stability of any physical entity can be defined as its intrinsic ability to return to its previous state by itself after being forced out of that state under the influence of external or internal causes. We deal with stability every moment of our lives, from birth to death, and there is no other way to live. Moreover, the Universe exists only because matter is mostly stable. Some examples of the remarkable workings of the property of stability, which we take for granted, are given below.
• If you fracture a bone, the stability of your organism's functioning ensures that its ends will soon knit together. In general, if you are sick, your recovery happens only because of the stability provided by your immune system. For you, this means continuing to live a healthy life. Even the process of aging can be mathematically formulated as a problem of stability.
• If you fly a plane in the condition of air turbulence, the stability of the aircraft ensures the return to the trimmed (balanced) condition if it is disturbed. This means a safe flight for you.
• If you drive a car on a slippery road, then the car’s dynamic stability control prevents you from a loss of steering control. This means a safe ride for you.
• If you have tripped over something, then your brain using vestibular sense detects it and sends the corresponding commands to your body to maintain the balance. This means to avoid the fall and possible injury for you.
• If the operating system of your computer has become completely unresponsive, what do you usually do? Correct: you restart the computer, and a good operating system tries to fix the glitch by returning to its previous healthy state. This is an instance of a good man-made imitation of Mother Nature's miraculous design pattern.
3. ANCIENT WRITTEN LANGUAGES
Sir Isaac Newton is famous, first of all, for his Philosophiæ Naturalis Principia Mathematica, written in Latin, the international scientific language of the Age of Exploration. Today modern English has successfully replaced Latin in this role. But not many people know that Newton is also the author of a number of good works in theology and linguistics. He even wanted to create a universal language on the basis of English, which he considered so imperfect that he feared for its future among the other European languages. This may cause a smile, but the truth is that this kind of cosmic irony can toy even with the mighty minds of the greatest mortals.

There are sciences of written language with a long history, rooted in the very distant eras of the Mesopotamian, Ancient Egyptian, Minoan, and other civilizations. What conclusion have these sciences led us to? Any language can become extinct with time. Moreover, a language can disappear almost without a trace in the next generations of languages: it will not modify, merge with, or give rise to other languages as Latin did; it will simply vanish off the face of the earth. This is the death of a language, and it is usually very difficult to predict which modern languages will disappear and which will continue their life in another form. The situation with languages can also be formalized as a problem, an extremely complicated one, of the stability of the active use of languages. But the pith of the example with languages is not stability. The pith, here and in general, is the difficulty of digging out more or less adequate knowledge about the real facts, properties, etc., including the factor of stability, and their future. All we know is just some approximation of reality, reflected in our mind in some still unknown way, and no more. We call this reflection understanding, or knowledge. It can be initial or advanced.
A given understanding differs from others in the limitations of its applications, its degree of adequacy, its forms of representation, etc., but it always remains only an approximation of reality, not reality per se. It is not rare that producing even an initial, well-adequate approximation of some new specific knowledge amounts to a feat. The main reason is often the lack of the necessary technological tools, research techniques, or the mastery and gift of the scholar, and sometimes of a lucky concurrence of circumstances and a full set of the elements or pieces of the required knowledge. The process of creating the initial approximation of new knowledge is similar to assembling a jigsaw puzzle with oddly shaped interlocking and tessellating pieces. If some pieces are missing or misshapen, the whole puzzle cannot be completed properly. Of course, it can be “forcibly” assembled in a wrong way but presented in a credible-looking and prepossessing manner to pass it off as the genuinely correct assembly. We will dismiss such cases out of hand for obvious reasons.
Now let us turn our attention to the front image of the post. It shows the Phaistos disk, which is “a fired clay disk, probably of Minoan origin, measuring some 16cm in diameter and impressed on both sides with 242 symbols set in a spiral arrangement. As yet, this unique archaeological find remains an undeciphered enigma … The disk is now generally accepted as Cretan in origin and therefore is probably a representation of the Minoan language in use during the period at which scholars date the disk - from 1850 to 1550 BCE … The fact that the symbols are arranged in a spiral is also given as evidence supporting the Minoan (or at least Aegean) origin.” Cartwright, M. (2012, June 28). Phaistos Disk. Ancient History Encyclopedia. Retrieved from http://www.ancient.eu/Phaistos_Disk/. Many facts about the disk have been hotly debated among scholars almost from the moment it was unearthed in 1908, including its authenticity, dating, origin, symbols, etc. A number of scholars have claimed to have deciphered the disk, but the scientific community has remained skeptical. The reason is simple: too little material from this lost language survives in artifacts for scholars to be able to conduct a thorough comparative analysis. Here we encounter the typical and quite frequent case of missing jigsaw-puzzle pieces.
4. EARTH'S CLIMATE STABILITY AND HYPOTHESIS OF GLOBAL WARMING
The alarming signs of climate change are evident: sea level rise, changes in climate extremes, Arctic sea ice decline, glacier retreat, etc. Most initial causes-factors of temperature change are believed to be known well: greenhouse gases, aerosol and soot produced by volcanoes and human-made pollutants, solar activity, etc.
However, the following questions then arise:
• How significant is the contribution of each of the above-mentioned factors?
• Which factors are major and which are minor?
• Is the list of the aforesaid factors reasonably complete? Or are some very important, still unknown factors missing?
• Is the period of roughly a hundred years of well-registered and documented observations of climate change sufficient for making long-term predictions?
• How adequate are the global mathematical models of Earth's climate that confirm these predictions?
• Is Earth's climate a stable physical phenomenon and what do the mathematical models tell us about its stability?
We will discuss only the questions concerning the mathematical models and the factor of stability, which lie within the author's field of expertise.
First, had Earth's climate not been highly stable and robust, life on our planet would have ceased to exist many thousands of years ago because of the planetary and cosmic cataclysms that have befallen Earth from time to time. In the mathematical theory of stability, the term "robust" refers to stability under changes in the parameters of the system under study itself; sheer stability involves only changes in the system's initial conditions and external disturbances.
Second, constructing a well-adequate mathematical model of Earth's climate is a tremendously complicated problem. In my opinion, it should be governed by a high-dimensional system consisting of essentially nonlinear partial differential equations, ordinary differential equations, and functional equations with uncertain or stochastic parameters and unpredictable external and internal disturbances. What do we mean by a well-adequate mathematical model or theory? We mean that the difference between the results of experimental measurements and the results predicted by the model or theory is small enough to be neglected. For an example of a well-adequate theory, see the prediction of the electron's magnetic moment by quantum electrodynamics: the experimental value of 1.00115965221 versus the theoretical value of 1.00115965246 (QED: The Strange Theory of Light and Matter by Richard P. Feynman, 1988). As said before, constructing such a complicated model is a terribly difficult problem, but what is much worse is investigating the stability of that model; this problem is a hundred times harder than the previous one. Cherishing hopes that "omnipotent" computer technologies (hardware and software) will get us out of the difficult situation will end in disappointment. The output of a computer simulation depends strongly on its input: if you have a badly designed or wrong algorithm processing your equations, or bad input data, you will definitely receive bad, practically useless, or nonsensical output data, if any. Furthermore, computer simulation is not as reliable and credible as it seems. It transpires that computer software based on discrete mathematics does not always preserve the very important, even fundamental, characteristics of real (continuous-time) dynamical systems.
For instance, a discretized scheme might turn conservative systems into dissipative ones, break limit cycles, or fail to carry the integration of differential equations to an acceptable accuracy due to weak convergence, making the results of the simulation uninterpretable. Thus, computer simulation cannot be done successfully without good preliminary theoretical analysis. Researchers should foresee its possible results to some degree before starting to simulate; they should understand well what they are doing. "There is nothing more practical than a good theory," as Kurt Lewin famously put it. However, here lies the stumbling block: that good theory has yet to be developed for such complicated mathematical models!
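The first failure mode mentioned above can be shown in a few lines. The sketch below (all parameters are arbitrary choices for the demonstration) integrates a frictionless harmonic oscillator, which conserves energy exactly in continuous time: explicit Euler steadily pumps energy into the system, while the semi-implicit (symplectic) Euler variant keeps the energy bounded.

```python
# A frictionless unit-mass, unit-stiffness oscillator: x'' = -x.
# Its continuous-time energy 0.5*v**2 + 0.5*x**2 is exactly conserved;
# a naive discrete integrator does not respect that.
def energy(x, v):
    return 0.5 * v * v + 0.5 * x * x

def explicit_euler(x, v, dt, steps):
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x   # both updates use the old state
    return energy(x, v)

def symplectic_euler(x, v, dt, steps):
    for _ in range(steps):
        v = v - dt * x                  # update velocity first...
        x = x + dt * v                  # ...then position with the NEW velocity
    return energy(x, v)

e0 = energy(1.0, 0.0)                           # initial energy 0.5
e_explicit = explicit_euler(1.0, 0.0, 0.01, 10_000)
e_symplectic = symplectic_euler(1.0, 0.0, 0.01, 10_000)
print(e0, e_explicit, e_symplectic)   # explicit Euler's energy grows
```

For this system, each explicit Euler step multiplies the energy by exactly (1 + dt^2), so after 10,000 steps with dt = 0.01 the energy has grown by roughly a factor of e, whereas the symplectic scheme conserves a nearby modified energy and stays within about one percent of the true value.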
But there is good news. Based on multiple empirical studies and our history, we can say with full certainty that
• The physical phenomenon of Earth's climate is definitely highly stable and robust. There are explicit manifestations of this fact revealing that the stability of the structure of Earth's climate is organized through a complex network of multiple local and global negative feedback loops covering the globe. Our planet is deftly designed to harbor and protect various forms of life, including humankind. Can people inflict fatal damage on the planet and its climate, resulting in the total disappearance of all living organisms and making Earth no longer fit for human habitation? It is very unlikely, at least today. What our civilization is able to do now is destroy itself; but our place on Earth would be taken by another civilization of the human race. Nature abhors a vacuum. Our foolproof planet is seemingly built to rid itself of bad and unintelligent civilizations in order to ultimately preserve the most valuable thing in the Universe: life.
• We have to deal with stability of the natural phenomenon that has an intrinsically oscillating complexion with occasional considerable fluctuations of the quantities describing it in space-time.
• Humanity should make all the possible efforts to minimize the deleterious effect of its activity on Earth's climate.
5. CONCLUSION
To sum up, we can say that God created the world infused with stability but not all formations and forms of the matter are so blessed to be endowed with it. However, the following observation definitely reflects the reality: His divine intervention often manifests itself through the astonishing activation of this supernatural property.
P.S. All of the above is the personal opinion of the author. The objective of this post is not to support or refute any theories, statements, ideas, or hypotheses. The only goal the author has pursued here is to have the concept of stability taken seriously, not only by the international scientific community but by every reader as well.
In general, the principle of relativity may be stated as the independence of a law from its observers. By an observer we mean a system that is competent to verify the law. The law may belong to any subject.
As an example, the special relativistic formulation of the law governing the portfolio risk of the two-security case has been discussed in 'Role of the Principle of Relativity'.
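For context, the classical (pre-relativistic) two-security portfolio risk that the cited note reformulates is the square root of σp² = w1²σ1² + w2²σ2² + 2·w1·w2·ρ·σ1·σ2. A minimal sketch of that baseline calculation (the function name and figures are illustrative, not taken from the cited note):

```python
import math

def two_asset_portfolio_risk(w1, sigma1, sigma2, rho):
    """Standard deviation of a two-security portfolio.

    w1     -- weight of the first security (the second gets w2 = 1 - w1)
    sigma1 -- standard deviation of the first security's return
    sigma2 -- standard deviation of the second security's return
    rho    -- correlation between the two returns
    """
    w2 = 1.0 - w1
    variance = (w1 * sigma1) ** 2 + (w2 * sigma2) ** 2 \
        + 2.0 * w1 * w2 * rho * sigma1 * sigma2
    return math.sqrt(variance)

# With perfectly correlated securities (rho = 1), the portfolio risk
# collapses to the weighted average of the individual risks.
print(round(two_asset_portfolio_risk(0.5, 0.2, 0.4, 1.0), 6))  # 0.3
```

Diversification shows up when rho < 1: the cross term shrinks, so the portfolio risk falls below the weighted average of the two individual risks.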
I think the aim of human life is to unveil the purpose of life. We all came to this planet with a divine possibility, and the purpose of our lives, the quest for the superior notion, the thirst for higher truth, the hunger for inner freedom, shall never be forgotten; that alone is the only real reason for existence. To achieve self-realisation, to discover yourself, is the ultimate purpose for human beings as well as for humanity. I need your valuable insights in this regard.
Flusser acknowledges many times the influence of Husserl and Heidegger on his thinking, but then he goes on to explain that, for him, phenomenology is about the disappearance of the subject-object categories and their replacement with a dual-pole relation. See for example his essay on Edmund Husserl published in the special issue of Intellect (2011), where Flusser says: "It can be shown that it [knowledge] is a dynamic relation, a sort of arrow. It points from somewhere (a supposed subject) to somewhere (a supposed object). It is 'intentional'. I can call the point from which it intends a 'subject', and the point to which it intends an 'object'." {Flusser 2012 #338D: 235}
This is an explicit account of intentionality but, in his later philosophy, Flusser does not mention intentionality, yet he subtitles at least two of his books as 'a phenomenology of... gestures/ media'.
In short, what makes Flusser's media work phenomenological, if he does not speak of intentionality there?
Psychology defines habituation as the tendency toward decreased responsiveness to a stimulus. For that matter, "Something that is new and incredibly exciting can become annoying." We all agree that consciousness forms memories and, vice versa, memories are proof of being conscious.
Therefore, on the one hand, we strive to be conscious and to gather as many active memories from our lives as we can; on the other, we unconsciously but ontologically need to shed our conscious acts by forming habituations from everything, all the time, letting all our deeds sink into the sub/unconscious level of our psyche.
Re: Schmeikal, B. 2016a. Basic Intelligence Processing Space. Journal of Space Philosophy 5, no. 1 (Spring 2016). https://www.researchgate.net/publication/303282613_Basic_Intelligence_Processing_Space
In my philosophical conceptual work I employ my version of the André Weil–Claude Lévi-Strauss canonical group transformation formula (rCF), applied to conceptual fields in the same way it is applied to mythological fields in comparative mythology. In my view, and you have given me words to articulate this more clearly, the rCF is a generative structure for an intelligent processing of energy. If so, concepts have energy: a surprising result. Concepts are processed. The philosopher Deleuze once said that the 'conceptual operator operates the conceptual machinery of any philosophy'. I am not a mathematician. It is perhaps the case that the rCF is a "commutative algebra within non-commutative space" (Schmeikal 2016a: 16). A mathematician would have to look at my version of the rCF to determine whether it is.
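For readers who may not know it, the standard textbook form of the canonical formula (my rCF is a variant of it) is commonly written as:

```latex
F_x(a) : F_y(b) \;\simeq\; F_x(b) : F_{a^{-1}}(y)
```

read: function x of term a is to function y of term b as function x of term b is to the "a-inverse" function of term y. The characteristic twist is in the last member, where term and function exchange roles.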
Re: Schmeikal, B. 2016b. On Consciousness & Consciousness Logging Off Consciousness.
Thinking of the energetics aspect of intelligence processing, I would say that the Weil–Lévi-Strauss rCF's fourfold permutations (two pairs of binary opposites permuted four ways) undergo transformation into eight inverses. These eight might be termed eight transcendences. Generating the inverses requires imagination, though an imagination constrained by the overall formula. After reading your paper, I am now happy to refer to these inverses as 'unbinding, a release of free energy' (Schmeikal 2016b: 21, 28). I am not a mathematician, only a fool or a poet, as Nietzsche once said, but I wonder whether this rCF is an example of the Clifford algebras you discuss in your papers.
Digital transformation is a mere dream, infinitely far from reality, in India! What do you think?
Three scientists sit at a bar, overlooking an apartment building. Before the first beer, they notice two persons entering the building. A few hours and beers later, they notice that three persons leave the building. Now they all get agitated, because they all need to explain what has happened.
The biologist claims that the two have mated, and the third specimen is the result.
The physicist claims that what they all witnessed had a major measurement error, and that in fact there must have been three entering the building earlier.
The mathematician smiles. She knows the right answer, and spells it out: "Don't worry. When the next person enters the building, it will become empty."
Dear people,
I'm confused about the relationship between determinism and quantum mechanics.
Over 200 years ago, Laplace claimed that if an entity (he called it the demon) knew the position and momentum of every particle created during Big Bang nucleosynthesis, this entity could, with enough computing power, predict the whole future of the cosmos. Of course, Heisenberg correctly added that it is not possible to determine both properties (position and momentum) accurately at the same time (besides Heisenberg's approach, there are several other arguments against Laplace's determinism).
But why do we then argue, from Heisenberg's point of view, that the universe is not absolutely deterministic at all? Just because we (and our technology) are not able to measure both properties at the same time, must we conclude that a particle cannot have a definite position AND a definite momentum at one and the same time? Why do we link measurement (not necessarily in the form of a human observer) with the fundamental reality of particles, especially when arguing that the cosmos itself could be considered the calculating demon?
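For reference, the relation I mean is the Heisenberg uncertainty principle, which bounds the product of the standard deviations of position and momentum:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

As usually stated, it is a property of the quantum state itself, not merely a limit of our measuring devices, and that is exactly the part I find confusing.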
What hint am I missing? Can anyone help?
Thank you!
If the dominant tendencies of analytical theories today (serial theories, set theory, and transformational theories) revolve around concepts and principles from the hard sciences, and their researchers are not freed from this mathematical model, what genuinely innovative theoretical-analytical contributions are these tangent areas offering to musicology?
People believe what they see, but reality is something else.
We all have our own space-time plane and feel good or bad only due to its motion (since the multiverse is moving). But then why do we feel differently from others who share the same space-time plane?
That there is something outside of our minds is undeniable, and so is what we see. But is what we see exactly what is outside? I see a computer in front of me, but is this computer a picture of a computer outside? Or is there something else outside, of which I see this picture that we call a computer?
What does it take to make phenomenal inventions? What must researchers have in their minds?
This theme is meant to provoke us again into an interdisciplinary debate in the Dialogo interdisciplinary project, where fields such as (but not limited to) neuroscience, robotics, cognitive science, sociology, psychology, ethics, and theology should deliberate on the following:
Is the link between intelligence and the conscious Self situated in the brain, or in the Soul? How can neuroscience define what the Soul is? Is it even possible for it to accept such a term, or can it not encompass this term/existence in AI theories? If the brain is not the cradle of the Soul, can we then say that an AI can also bear a Soul? Can we have a Soul without intelligence or a conscious Self?
A plant is made up of innumerable systems separated by cellulose cell walls, which essentially give the plant its shape. My question aims to find out whether these systems use gravity in following a growth plan that helps the plant, among all its other functions, stand with exact positioning of its centre of gravity.
An icon is a sign which represents its object through a qualitative similarity, likeness, or resemblance. As such an icon seems to imply a spectrum of continuous qualitative variation between the icon and the objects it represents through this likeness or resemblance. In this sense, an icon is similar to a general term which, in Aristotle's definition, is "that which by its nature is predicated of a number of things" De Interpretatione VII (see EP 2.208 for Peirce's approval of this definition). Just as there is no limit to the number of similar objects an icon might represent, so there is no limit to the number of possible objects which a general term like "sun" might be predicated of: "Take any two possible objects that might be called suns and however much alike they may be, any multitude whatsoever of intermediate suns are alternatively possible and therefore . . . these intermediate possible suns transcend all multitude. In short, the idea of a general involves the idea of possible variations which no multitude of existent things could exhaust but would leave between any two not merely many possibilities, but possibilities absolutely beyond all multitude" (EP 2.183). Though there are many passages that suggest a connection between iconic representation and logical generality, I cannot find any explicit discussion of this connection in Peirce's writings. How do others interpret this connection? Is there any passage where Peirce addresses this question directly?
The preceding question is intimately related to at least two other key doctrines of Peirce. The first is Peirce's scholastic realism which insists that the generality of the general terms in a true proposition must correspond to some objective generality--some lawfulness, regularity, or thirdness in nature. The second doctrine is Peirce's synechism--his emphasis upon real continuity in nature or "the tendency to regard everything as continuous" (EP 2.1). I'm curious to see how other Peirce scholars draw the connections between these ideas, and to see whether they have any suggestions concerning helpful secondary literature.
Every theory of everything must be so complicated that it cannot be captured by the human mind. Reality appears to possess a fundamental structure that keeps it relatively coherent and prevents it from turning into complete chaos. This foundation acts as a kind of DNA that predetermines the evolution of the foundation into a more complicated structure. The structure of the foundation will be rather simple, and it is quite probable that current mathematics already contains similar structures. These simple structures are easily comprehensible by skilled scientists. What is not so straightforward is how these structures restrict their extension into higher-level structures that preserve coherence. It is quite possible that these structures only partly achieve this target and that extra measures must be added in order to achieve sufficient coherence. If so, what mechanism installs these extra measures, and why do these mechanisms exist?
The Schrödinger experiment (intended to illustrate what he thought was the implausibility of a half-alive, half-dead cat state function, but now taken seriously by many) is modified to examine whether physical processes collapse the wave function, or whether consciousness is required, as I understand von Neumann suspected.
The AI (artificial intelligence) is not assumed to be conscious, just a sophisticated but deterministic program, or expert system, with motors attached robot-like. We assume from quantum mechanics calculations that the room contains a state function which is a 50/50 superposition of live cat and dead cat. When we open the room we expect to find one of the following:
- Live cat, with AI having recorded an observation of opening the smaller box and finding a live cat.
- Dead cat, with AI having recorded an observation of opening the smaller box and finding a dead cat.
There is nothing to collapse the wavefunction until you and I open the box, according to von Neumann. As I understand him. The AI is a physical process, just like the cat's internal biological processes are physical, and if the cat itself doesn't collapse the wave function, neither can the AI.
However, notice that the AI has the same subjective experiences that we do. There is no cross-state mixing between the AI and the cat. The AI which found the live cat never mixes with the dead cat state, and vice versa.
Then, in an interview, the AI will insist that it never found any contradiction to the notion that it collapsed the wave function, even though our mathematics informs us otherwise.
Lord Rutherford said to his students at the end of the 19th century: "All science is either physics or stamp collecting." "Qualitative is nothing but poor quantitative."
Biology, up to our times, still depends on physics for its experimental and theoretical foundations. This vision and application of science comes from English empiricism. The primary/secondary quality distinction is a conceptual distinction in epistemology and metaphysics concerning the nature of reality. It is most explicitly articulated by John Locke in his Essay Concerning Human Understanding.
In Chapter V of The Nature of the Physical World, Arthur Eddington wrote as follows:
Linkage of Entropy with Becoming. When you say to yourself, “Every day I grow better and better,” science churlishly replies—
“I see no signs of it. I see you extended as a four-dimensional worm in space-time; and, although goodness is not strictly within my province, I will grant that one end of you is better than the other. But whether you grow better or worse depends on which way up I hold you. There is in your consciousness an idea of growth or ‘becoming’ which, if it is not illusory, implies that you have a label ‘This side up.’ I have searched for such a label all through the physical world and can find no trace of it, so I strongly suspect that the label is non-existent in the world of reality.”
That is the reply of science comprised in primary law. Taking account of secondary law, the reply is modified a little, though it is still none too gracious—
“I have looked again and, in the course of studying a property called entropy, I find that the physical world is marked with an arrow which may possibly be intended to indicate which way up it should be regarded. With that orientation I find that you really do grow better. Or, to speak precisely, your good end is in the part of the world with most entropy and your bad end in the part with least. Why this arrangement should be considered more creditable than that of your neighbor who has his good and bad ends the other way round, I cannot imagine.”
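The "secondary law" Eddington invokes here is the second law of thermodynamics, which for an isolated system states that entropy never decreases:

```latex
\frac{dS}{dt} \;\ge\; 0
```

Unlike the time-symmetric "primary" laws of mechanics, this inequality singles out a direction of time, which is why entropy supplies the "arrow" in Eddington's reply.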
See:
The Cambridge philosopher Huw Price provides a very engaging contemporary discussion of this topic in the following short video of his 2011 lecture (27 min.):
This is well worth viewing. Price has claimed that the ordinary or common-sense conception of time is "subjective", partly because it includes an emphatic distinction between past and future, the idea of "becoming" in time, or a notion of time "flowing." The argument arises from the temporal symmetry of the laws of fundamental physics, in some contrast and tension with the second law of thermodynamics. So we want to know whether "becoming" in particular is merely "subjective," and whether this follows on the basis of fundamental physics.
Chapter Eddington, Chapter V "Becoming"
What is the Creatures and Creators Matrix?
Behind all activities there are mindsets, and apparently a mindset has a mind at the back end. Seemingly, artificial intelligence is not yet developed enough to produce mindsets without an original, biologically alive brain present inside a body. I wonder what will happen when artificial intelligence produces mindsets of its own!
Creatures and creators will definitely learn from each other. About that scenario, here are a few questions:
Which directions do you think a future AI-based mindset could adopt, and why?
What might be the basic and further developmental requirements of such mindsets?
To what extent could such a mindset go to fulfill its desires?
What factors could affect it?
How will good and bad exist for AI-based mindsets, and what will be their expected frame(s) of reference?
It can be foreseen that if space exploration does not give them (AI-based mindsets) a way out, then a tough match will be played on Earth. If there is a way out into the cosmos, then Nature will see waves of very strange events, perhaps beyond space-time relativity. If so, will natural laws evolve? If yes, in which direction, and what will be the future matrix of Creatures and Creators?
Thanks
Please Join the MULTIDISCIPLINARY group:
Ecology & Economics and Non-Monetary Values. The Role of States and Governments
We need contributions from ALL experts and professions.
Please go to the link, join in, and post your comments.
See you in the group, and perhaps see your contributions as well!
Thanks!
I have a pretty good understanding of the first 10 dimensions, but get very confused once it gets up to 12. Does anyone have some good info on the subject that they could explain or at least send some links?
The study of the philosophy of spirituality in the formal field of nursing has brought this to my attention. I am aware that transcendence may be a key part of the philosophy of spirituality in nursing. Emerson and Kant comment on transcendence, but each view is different; the effect of time, I speculate. So if time indeed has an effect on the evolution of the meaning of transcendence, what does it mean today?
The term nihilism is often used in combination with 'anomie' to explain a general feeling of despair at the perception that existence has no purpose, together with the realization that there is no need for rules, regulations, and laws ('anomie' is a state of cognitive dissonance between normative expectations and reality as experienced).
Movements such as Futurism and deconstruction, along with many others, have often been identified as "nihilistic".
Nihilism also assumes different characteristics depending on the historical context into which it fits. For example, postmodernism has sometimes been defined as a nihilist age, and figures of religious authority have often argued that postmodernism and various aspects of modernity involve the rejection of theism, and that the non-acceptance of theistic doctrines is one of the cornerstones of nihilism.
Nihilism itself can be divided according to different definitions, which are useful for describing philosophical positions that are independent and disjoint, although a correlation or a consequentiality between one and another is sometimes possible.
Metaphysical nihilism is the philosophical theory according to which it is possible that no objective realities exist at all, or, more cautiously, that no "concrete" objective realities exist; so if every possible world contains objects, there is at least one that contains only abstract entities.
An extreme form of metaphysical nihilism is commonly defined as the belief that no part of the world is self-sufficient. One way to interpret such a statement is: "It is impossible to distinguish existence from non-existence, since these two concepts have no defined objective characteristics." If there is nothing that can discern the meaning of "existence" from its negation, then the concept of existence has no meaning; in other words, it has no intrinsic value. In this sense, since existence does not possess a higher level of "reality", existence in itself means nothing. One could say that this belief, combined with epistemological nihilism, results in the idea that nothing can be defined as real or true, since these parameters do not exist.
The epistemological form of nihilism can be seen as an extreme skepticism in which every form of knowledge is denied.
Mereological nihilism (also called compositional nihilism) is the position according to which there are no entities with identity (not only in space but also in time), only entities without identity, also known as "building blocks"; the world as we perceive and experience it, in which we believe such entities with identity exist, is only a product of the fallibility of human perception.
Moral nihilism, also known as ethical nihilism, is a meta-ethical position holding that morality does not exist as an objective reality; there is therefore no action that is necessarily preferable to another. For example, a moral nihilist would say that killing a person, for whatever reason, is inherently neither right nor wrong. Other nihilists might say that there is no morality at all, or that if it exists, it is a human invention, an artificial construction in which each meaning is relative to its possible consequences. For example, if someone kills a person, a nihilist might argue that killing is not necessarily wrong, regardless of our moral principles: it is only because morality is constructed as a rudimentary dichotomy, in which a bad thing is given far more weight than anything defined as a positive result, that killing someone counts as wrong, because it denies this person the opportunity to live, and living is arbitrarily given a positive meaning. In this way, a moral nihilist believes that all ethical statements are false.
Political nihilism is a branch that follows the characteristic points of nihilistic philosophy, such as the rejection of non-rationalized or unproven institutions: in this case, the most important social and political structures, such as government, family, and law. The Nihilist movement exhibited a similar doctrine in nineteenth-century Russia. Political nihilism is a school of thought quite different from the other forms of nihilism and is often regarded more as a form of utilitarianism.
With Friedrich Nietzsche, the phenomenon of nihilism takes on the ambiguity and ambivalence of a real figure of interpretation, both theoretical and practical, of Western civilization. In its more explicitly negative sense, it is described as a sign of the times, a sign of the decline faced by civilization. At the same time, positively, the twilight of the values and idols "with feet of clay" that dominated the history of the West is the announcement of a new "dawn", the prophecy of a new era that will rise from the ashes of man as he has historically existed, and of the God that he has built in his own image and likeness. The prophet and interpreter of this new era will therefore no longer be man, but a kind of mythical figure, designated as the Superman, able to take upon himself the profound sense of nihilism and overcome it, knowing himself to be the author and creator of new values.
In Nietzsche, therefore, the word nihilism designates the essence of the crisis affecting modern European civilization: for Nietzsche, nihilism is an event that brings decadence and disorientation, so as to constitute a kind of disease by which the modern world is afflicted; the disease would lead to the disintegration of the moral subject, to the debilitation of the will, and to the loss of the ultimate goal of life (passive nihilism).
This condition would be followed, according to Nietzsche, by a resurgence of the human legislative will and an overcoming of the diseased condition through a multifaceted appreciation of existence (active nihilism), free of any claim to absolute truth. The ontological foundation of nihilism is the "death of God", a symbol of the loss of every landmark and the greatest revelation of the universal "nothing".
The philosopher Emanuele Severino writes that the modern vision of nihilism is wrongly based on the concept of 'being' that is born from nothing, exists, and then returns to nothingness.
As observed by the philosopher Diego Fusaro: "For Severino, everything is eternal. Not only that: it is only on the surface that we believe things come out of 'nothing' and in the end precipitate into 'nothing', because deep down we believe that the short segment of light that is life is itself nothing. This is nihilism. It is the primary murder, the killing of 'being'. But it is a contradiction: what is cannot 'not be', and can never have been or ever be 'nothing'. This contradiction is the folly of the West, and now of all the earth. A wound that needs many comforts, from religion to art, all frescoes painted over the dark, attempts to hide and medicate the 'nothing' that horrifies us. Hence the search for stronger ethics, founded on truth and human dignity.
Luckily the Non-Folly awaits us: the appearing of the eternity of all things. We are eternal and mortal because the eternal enters and exits from appearing. Death is the absence of eternity. We all have nihilism in our blood. (...) That everything is eternal means that every moment of reality 'is', that it does not pass into and does not return to 'nothing'; it means that even the most humble and impalpable things and events share the triumph that is usually reserved for God."
see Lawrence Cahoone http://www.philosophy.uncc.edu/mleldrid/SAAP/CLT/P07R.htm
"Towards A New Metaphysics of Natural Complexes"
I plan to outline a research program aimed at developing a new natural science of macro-economics (called Catallactics) based on the idea that the modern economy is a natural but non-physical far-from-equilibrium system of exchange.
Naturalism is a philosophical doctrine according to which nature is, directly or indirectly, the primary object of philosophical inquiry.
According to naturalism, reality can be understood solely or primarily through natural laws, without resorting to principles of a transcendental or spiritual order. Naturalism can therefore be understood as a synonym for materialism, in opposition to spiritualism and idealism.
According to the encyclopedia Sapere.it, 'naturalism' is a term common to those streams of thought that consider nature, in all its aspects, not only the fundamental object of philosophical reflection but also, and above all, the determinant and absolute benchmark for the lives and interests of man. In particular, there are metaphysical, sociological, aesthetic, ethical, and pedagogical naturalisms.
The most radical philosophical form is metaphysical naturalism, which tends to see in nature the first principle of all things, as at the dawn of Greek speculation and then again with the Stoics and in a great part of Renaissance thought.
Naturalism went into crisis as a result not only of sophistry but above all of Socratic reflection, focused mainly on man and on issues of ethics and existence.
Relaunched and revived over the centuries, naturalism found in the Renaissance a humanistic vision that exalts freedom and human dignity while promoting its recovery as an autonomous reflection on nature.
Even neo-Platonism was dedicated to the study of nature, giving rise to a natural philosophy that used formulae or intelligible processes as keys to deciphering the various natural mysteries, thus granting man an unlimited power over nature.
As in the early Greek philosophers, the world is interpreted with a monistic view, with no further opposition between spirit and matter: nature is again treated as a single living organism, in which the life-giving breath or Anima mundi does not work by assembling small parts until reaching the higher, intelligent organisms (atomism), but just the opposite: the evolution of nature is made possible by an intelligent principle that exists prior to matter. This reaffirms the need to study nature according to its own principles, that is, according to the typically Aristotelian vision of a reason immanent in the organism.
The contemporary naturalism includes extreme forms according to which science should replace philosophy.
A significant exponent is Willard Van Orman Quine, considered one of the greatest physicalist philosophers of the twentieth century, according to whom "reality is identified and described in science, not in the domain of any philosophy."
Giancarlo Zanet, a researcher in philosophy, in his publication "The Roots of Naturalism: W. V. Quine between Empiricist Legacy and Pragmatism", explains that Quine's philosophy "...is located at the center of the philosophical scene of the second half of the twentieth century, constituting at the same time the landing point and the turning point of the empiricist philosophical tradition, both in its pragmatist declination and in its neo-positivist one. Quine, in fact, submitted the empiricist tradition to a thorough review that, starting from the critique of the two dogmas of empiricism, arrived at the formulation of a theoretical proposition, naturalism, which can rightly be considered, with Habermas, one of the predominant theoretical strategies in the philosophical landscape today."
It is interesting to read Professor Achille Varzi's review of the book edited by Evandro Agazzi and Nicola Vassallo, Introduction to Contemporary Philosophical Naturalism (1998).
Achille Varzi writes: "When we talk about naturalism today, we generally refer to a program of naturalization of philosophy that was launched (or relaunched) by Quine.
For Quine, epistemology was not an isolated domain. The program aimed at overcoming any clear separation between philosophical and scientific inquiry, in favor of a complete continuity of method and content. "Knowledge, mind and meaning," Quine stated in another text of those years, "are part of the same world with which they are dealing and must be studied in the same empirical spirit that animates natural science." Since then, under the more or less direct influence of Quine and of other epistemologists such as Alvin Goldman (whose 'A Causal Theory of Knowing' dates to 1967) or Fred Dretske (Seeing and Knowing, 1969), programs of naturalization have spread quickly and massively, and today we can say that there is no area of philosophical research in which the debate on naturalism does not occupy a position of great importance.
Unfortunately, this rapid expansion has been paralleled by a marked multiplication of perspectives, so that today it is difficult to speak of "naturalism" in a univocal way. There are various forms of naturalism: a radical one, for which philosophy must literally merge (until it disappears) into the natural sciences, and a moderate one, for which philosophy must rely on the contribution of science while maintaining its own specificity.
The conceptual expansion was rapid but also very uneven. Bringing order to this varied and complex landscape is one of the stated aims of the book edited by Agazzi and Vassallo. Overall, a fairly skeptical picture of naturalism emerges, characterized more by obstacles than by prospects for development.
Naturalism is thus a doctrine quite different from empiricism.
Empiricism, in fact, is a philosophical position according to which experience is the only legitimate source of knowledge. The definition highlights that empiricism is an epistemological doctrine and should not be confused with other philosophical theses, such as naturalism, which makes assumptions about what reality is, not about how it is known.
Richard Rorty, in 'One who separated naturalism from empiricism', writes of Wittgenstein: "He is a thinker who, in his later works, has helped us to achieve one of the main philosophical advances of recent times: the separation between naturalism and empiricism."
Naturalism is a good idea, on this reading of Wittgenstein. It means considering human beings as products of biological evolution, without a mysterious intangible component such as an immortal soul or the Cartesian ego. Being naturalists means taking Darwin seriously, and interpreting human interest in truth and goodness as part of the attempts of a biological species (one that has been blessed with an unusually complex neural network) to respond to its needs.
Contemporary naturalists insist that what makes us human is the ability to use language, to exchange signs and noises (first in order to collaborate on practical projects, later to create a higher culture), rather than the possession of some extra ingredient that animals lack.
In Wittgenstein's view, empiricism is instead a bad idea. It is the thought that all our knowledge is just a matter of "processing sensory information."
Founded by Locke and Hume, empiricism has little to do with naturalism and everything to do with the hopelessly outdated Cartesian image of the mind as a mysterious inner theater where "ideas" are projected on a screen in front of an equally mysterious immaterial spectator.
If we could get rid of this image, we would not pose the most terrible kind of questions, which are impossible to answer, such as: "Is the image of the world that we build by conceptually elaborating sensory impressions really like the world that gives us these impressions?"; or: "Does the fact that you and I use the same language mean that we have the same ideas in our minds, or perhaps when you say 'purple' you mean what I call 'red', and vice versa, because our color spectra are reversed?"; or again: "Does a person blind from birth mean by the term 'red' the same thing we understand by it?"
Scientific experience rests on a conception broader than the traditional one, because it includes both direct understanding, immediately observable in its evidence to the senses, and indirect understanding, apparent from data that cannot fall within common sensibility, such as those concerning cosmological or subatomic phenomena, but which originate from other established and verified observations linked to this type of phenomena.
Experience as used in science adds to common observation the "artificial" intervention of the scientist, who organizes sense data by inserting them into schemes of a statistical nature, as in the 'experientia litterata' of Francis Bacon, made orderly by writing data into 'tabulae', or who, through the experiment, as in Galileo, drives natural phenomena toward the demonstration of a theory.
In this way the concept of experience has greatly expanded: in addition to conventional sensory and emotional factors, it today includes logical, mathematical, and technological factors that make its epistemological interpretation more complex.
In the history of thought the main problem, once confidence in empirical data had been established, was to determine whether acquired knowledge should be attributed to experience or to reason.
According to the empiricists, the activity of the intellect would be empty and inconclusive without the empirical data supplied by sensory reception. It was necessary, however, to distinguish the primary and immediate elements of experience, sensations and impressions, from those relations between sense data that serve to organize and order them, and without which empirical data would be a chaotic jumble of sensations.
This aspect of the relations that determine the ordered structure of experience was analyzed in detail by John Locke and David Hume and became central to modern epistemology, which asks whether those relations simply result from an accumulation of pure sense data that, in the end, produce the order of experience, as sensism and positivist materialism argued, or whether it is rationality which, intervening predominantly, establishes that order, as in the doctrines of Leibniz and in the idealism and spiritualism of the late nineteenth century.
With the establishment and spread of Darwin's theory of evolution, the problem of the relationship between experience and reason was complicated by the new question of the origin and development of the human spirit. Two theories opposed each other: the naturalistic one, headed by Spencer, according to which even what are considered innate properties of the intellect are in fact the result of natural evolution, and the historicist one, deriving from Hegel, according to which the human spirit is born and grows according to the historical conditions in which it lives and works.
Galileo and, before him, Copernicus managed to convince the world that the experience of reality requires a critical attitude, since experience in itself is not something identical to the world of objects. It is true that experience is the touchstone of theory; however, everyday experience, to be true, must now be transformed into scientific experience. And this transformation must follow basic guidelines: before deciding on the 'why' you have to answer the question of 'how'. To do this you must set up experimental situations in which the observation of phenomena in a 'pure' state is possible. The data of experience are used to formulate hypotheses about the fundamental configuration of reality, usually expressed in mathematical language.
According to the view of scientists of Galilean formation, experience is not the basis from which the fundamental truth of a theory can be derived, because it can always deceive. Experience, and hence the experiment, can at best 'suggest' new ideas, while their main function is to serve as tools for verifying a theory by comparing its ultimate consequences with the empirical data.
The need to find a unifying principle for all knowledge, an original synthesis meant as an a priori representation of all a man knows, and as such preceding the very consciousness of multiplicity, leads Kant to elaborate the doctrine of the 'I think', which is one of the most debated and significant points of his whole philosophy.
The different representations of my intellect are unified in the horizon of what I think, because they are accompanied by the awareness that I am thinking them. The 'I think' is therefore the supreme principle of all synthesis, i.e. the horizon within which the syntheses made by the categories connect in a unified manner, and likewise the principle of all knowledge whereby the mind is conscious of the unification it creates. This principle makes a truly unitary knowledge of reality possible, and at the same time it is rooted in awareness of constitutive human finitude: it is worth noting that, in this sense, the 'I think' is an organizing principle, a transcendental structure that "must accompany" the representations of the subject, and not the principle on which the whole of reality depends, as it would later be understood by the idealist thinkers.
Fichte, for example, in a letter of 1793, said of Kant: "this unique thinker becomes to me increasingly marvelous: I think he has a genius that shows him the truth, but without revealing its foundations." For his part, however, Kant is very careful to point out that the 'I think' is the structure of thinking of each empirical subject, and thus that it coincides neither, in the wake of Descartes, with an individual 'I' that is the object of immediate self-consciousness, nor, as suggested by Spinoza and taken up by the idealists, with an absolute 'I' that is the foundation of all finite consciousness.
Specifically, the problem that Kant sought to resolve, addressed in the transcendental deduction of the Critique of Pure Reason, was the following: why does nature seem to follow necessary laws, conforming to those of our intellect? By what right can the latter claim to know nature scientifically, "establishing" its laws in one way rather than another?
According to Kant, such a right is justified because the foundation of our knowledge lies not in nature but in the activity of the subject itself.
Is there really non-dynamic mass?
Bernhard of Chartres says in his Glosses on Plato that such an ideal state cannot exist in this world. Is this his own opinion, or does he refer to Republic IX 592ab? As far as I know, there was no copy of the Republic available in his time, only Calcidius' Timaeus. So how could he refer to the Republic in such a detailed way?
When I was in high school, Bohr's atom of shells and s and p orbitals was introduced in chemistry. The realization was automatic that the world was explained according to theory verified by experiment. Through college and graduate school, looking for a more complete explanation, theory is challenged, but it is never brought to the question: "what is an electron or proton, if they have mass but are visible only in the sense that they emit light energy as photons, 'spots of light in orbit around nuclei, the atom a solar system in miniature'?" Physicists will say this is not the picture they have evolved, but all that remains is the image of equations on a chalkboard, at best 'the image of things of a particle nature in alternation with things of a light nature'. Can a pieced-together, stepwise reality of this nature be accepted? In the Feynman quote below, pieces are added that can break any of the established laws, so long as "they are not directly observable" or do not affect "causality". In this same sense, though, neither electrons, protons, photons, nor atoms are directly observable, and their causal effects are but a matter of humanly constructed theory and similarly based experimental apparatus. The possibility exists that theory and theory-based apparatus entail one another, and all that might be gotten is that the real universe is identical in this respect, i.e. existence entails the experienced universe and vice versa.
"You found out in the last lecture that light doesn't go only in straight lines; now, you find out that it doesn't go only at the speed of light! It may surprise you that there is an amplitude for a photon to go at speeds faster or slower than the conventional speed, c." These virtual photons, however, do not violate causality or special relativity, as they are not directly observable and information cannot be transmitted causally in the theory. (From "Varying c in quantum theory", http://en.wikipedia.org/wiki/Variable_speed_of_light)
Currently mathematics uses the Real Numbers to define a continuum - as in the Real Number line.
If so much of physics makes use of Complex Numbers, why isn't there a Complex Continuum defining space?
Quaternions would seem to define a space where each spatial axis is complex (only the 'temporal' axis remains 'Real': Q = w +ix+jy +kz).
This would suggest that we are using two different models for space - a Real continuum and a Complex continuum model for spatial axes.
If this is true, then we should expect difficulties when crossing between these models.
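Since the question cites Q = w + ix + jy + kz, a small sketch of that algebra may make the contrast concrete. The class below is purely illustrative (not from any post in this thread); it implements Hamilton's multiplication rules and shows the non-commutativity that distinguishes quaternion multiplication from arithmetic on the real line:

```python
# Minimal quaternion sketch: q = w + ix + jy + kz,
# with Hamilton's rules i^2 = j^2 = k^2 = ijk = -1.
from dataclasses import dataclass

@dataclass(frozen=True)
class Quaternion:
    w: float
    x: float
    y: float
    z: float

    def __mul__(self, o):
        # Hamilton product, obtained by expanding
        # (w1 + i x1 + j y1 + k z1)(w2 + i x2 + j y2 + k z2).
        return Quaternion(
            self.w*o.w - self.x*o.x - self.y*o.y - self.z*o.z,
            self.w*o.x + self.x*o.w + self.y*o.z - self.z*o.y,
            self.w*o.y - self.x*o.z + self.y*o.w + self.z*o.x,
            self.w*o.z + self.x*o.y - self.y*o.x + self.z*o.w,
        )

i = Quaternion(0, 1, 0, 0)
j = Quaternion(0, 0, 1, 0)
k = Quaternion(0, 0, 0, 1)

print(i * j == k)                            # ij = k
print(j * i == Quaternion(0, 0, 0, -1))      # ji = -k: not commutative
```

Note that ij = k while ji = -k: this loss of commutativity is one concrete way the "difficulties when crossing between these models" might show up, since nothing analogous occurs on the real continuum.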
Surrealism, as an aesthetic movement in the creative arts and as a means to 'extend reality' by challenging the normative mode of appreciating it, contributed tremendously to different forms of expression during the twentieth century. Salvador Dali, the great Spanish painter, Luis Bunuel, the famous filmmaker, and many other prominent figures subscribed to surrealism. The basic elements responsible for the growth of surrealism, such as questioning reality and the existing belief systems, are also essential to fostering research minds. However, we seldom come across scientists influenced by surrealism. Why is it so? Is there any fundamental contradiction?
We often come across the term supernatural. The term is confusing as well as complicated. A lot of literature exists supporting supernatural existence, but there is no solid scientific proof or proof of concept. Much research is still underway on the existence and experience of the supernatural. Could this term be included in the list of things that science cannot prove? Is it possible that science could ever enter the dimensions of the supernatural? What are your views, experiences, and comments?
Some contemporary theories appear to create “sinkholes” in the extrapolation process toward the more fundamental. Special Relativity expresses an equivalence between matter and energy. The question “Is matter really energy condensed?” posed by Marcus Borges illustrates this conundrum. Condensation is often applied to situations where energy among matter components is expelled. The enigma is intensified when experiments are interpreted to indicate the creation of charged particles, i.e. electrons and positrons, from photons. Where do charges lurk within energy? Quantum Mechanics presents dual personalities for bodies of matter, i.e. wavelike versus particulate. The question “What are valid interpretations of the quantum double slit experiment?” asked by Vang Lee illustrates this conundrum. A pathway that connects Relativity with Quantum Mechanics has not been established.
In various niches of the scientific realm components and properties are tailored to accommodate conceptual visions (theories). Matter distorts space-time in one niche while it exchanges gravitons in another niche to mediate gravitational effects. Some particles, including gravitons, are proposed to be massless. The gravitational effects of black holes supposedly do not allow the escape of photons. Do black holes exchange gravitons?
Contemporary theories, as a result of their abstruse nature, defy attempts at consistent visualization. If one had a grasp of the ultimate components of a system, it should be possible, in theory, to envision a structure for the system that accounts for the phenomena as detected at the observational level and to explain the utility of theories. Where does one start? Initially it is proposed that individuals attempt to provide candidates for the ultimate components based on their perspectives. Since the musings of Democritus, storehouses of scientific observations have accumulated that provide a background of information available for interpretation and reinterpretation. The objective is to reduce the “sinkholes” in the landscape of our scientific endeavors.
A proposal for the ultimate components is presented under William Blackmon at Researchgate.net. It has been a solo venture and criticism would be appreciated.
I have just been reading about Leibniz, and I wonder if anyone can help me by telling me where Leibniz said "wholes have only a borrowed reality - borrowed from the reality of their parts". I am not sure whether this is an exact quotation or an approximation.
I find myself going in a paradoxical loop when I think about these distinctions, insofar as it seems that the two need each other rather than one being valid over the other.
For example, let us begin by accepting Kant's refutation of transcendental realism. Transcendental idealism allows us to demarcate between noumena and phenomena, where phenomena have an empirically ideal existence. Yet my question is: does not the intersubjectivity constituted out of empirical idealism create a type of transcendental realism? As soon as the thinker puts the thought to paper and writes a symbol to be interpreted by another, does he not instantiate an existence that he previously refuted?
Our universe is entirely made of energy, and all other things are only changes in that energy according to different dimensions.
Critical realism proceeds from the premise that, in order to be a coherent form of enquiry, the natural sciences must presuppose a material reality which is the object of enquiry. Scientific propositions, it is argued, are true if they correspond to the reality that they purport to describe or explain. But the critical realist argues that explanations are ontologically different from the material states that they explain and cannot therefore be understood as corresponding to them. In taking this view, does critical realism let go of the hand of truth?
I have noticed some scientists and scholars assertively equate reality with facts and facts with reality. In my opinion, using them synonymously is a fallacy which must be consciously avoided, because: facts are statements about events or circumstances that exist or have occurred. Facts are observable (measurable), verifiable, and indisputable, whatever measure of reason and logic is applied to accept or reject them.
Reality (Constructed, Objective, Subjective, Empirical, Instrumental and other Realities) is nothing but a collective opinion - an idea in which some confidence is placed or, a reasonable collective representation of “the way things are.” Reality is not simply acknowledged, but must be discovered or reasoned and is liable to falsification.
For example, we know it is a fact that day comes after night. It is a fact that the Earth rotates on its axis, resulting in day and night; this can be verified or observed from space. It can also be verified that the Earth revolves around the Sun. On the basis of these two facts we reckon time. But what is the reality of time? To some it is linear, in some opinions it is cyclic, and to some it is fractal. To convince someone of one of these three realities of time, it must be reasoned out on the basis of facts.
There is an objective reality out there, but we view it through the spectacles of our beliefs, attitudes, and values. ~David G. Myers
I ask because I am interested in the view of scientists as to the current standing of these theories in the scientific community. I can see the study as valid from a philosophical viewpoint; however, I don't quite see the scientific method (in its classical sense) used in this form of research. As a non-scientist, I am interested in your perspectives.
Does mechanical engineering have philosophical causes?
Scale is an admitted aspect of space and is perceived as a continuum of space.
If we attempt to measure the distance between objects at very different scales (say the corner of a book on a table and a molecule of a pen on the table near the book), we find we must include scale as part of the means of locating the objects in space.
From a geometric perspective, this would mean that scale is a required measure of space, beyond length, width, and height; simply by being required to locate an object in space, it would constitute a fourth dimension of space.
Scale as continuum:
Scale as spatial dimension:
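To make the thought experiment concrete, one could treat scale as a literal fourth coordinate when locating objects. The sketch below is purely illustrative: the choice of log10 of an object's characteristic size as the scale coordinate, and the equal weighting of all four axes, are assumptions introduced here, not anything established in the question:

```python
import math

# Hypothetical illustration: locate an object by (x, y, z, scale),
# where 'scale' is its characteristic size in metres. Taking
# log10(scale) as the fourth coordinate is an assumption made for
# this sketch; it puts a book (~0.1 m) and a molecule (~1e-9 m)
# on comparable footing.
def locate(x, y, z, scale):
    return (x, y, z, math.log10(scale))

def distance(p, q):
    # Ordinary Euclidean distance, extended to the fourth (scale) axis.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

book_corner = locate(0.30, 0.10, 0.02, 0.1)    # ~10 cm object
pen_molecule = locate(0.35, 0.12, 0.02, 1e-9)  # ~1 nm object

# In x, y, z alone the points are ~5 cm apart; the scale axis
# dominates because the objects differ by 8 orders of magnitude.
print(distance(book_corner, pen_molecule))
```

Under these assumptions, the four-dimensional separation is dominated by the scale axis, which is one way of cashing out the question's claim that scale must be included in the means of locating objects.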
a) How would dualism explain causation between concepts generated in the mind and the interactions with the brain and the body?
b) There must be some energy expenditure involved that must be quantifiable in terms of physical laws (I guess giving rise to some sort of "ether" concept); is this correct?
c) Would concepts like "Pegasus" (from Quine's examples) differ from, say, the memory of a person in terms of tangibility in this "ether"?