Book

The Anthropic Cosmological Principle

Authors:

Abstract

... The dimensionality of spacetime is a question with a long history [1][2][3][4][5][6][7][8], starting with Ehrenfest [1,2], who argued on the basis of the "stability postulate" of the two-body problem that the fundamental laws of physics favor (3 + 1) dimensions. ...
... The above discussion echoes the anthropic principle [5,6,90–92] in one way or another, though it relies only on the stated assumptions rather than on anthropic reasoning. It tells us more than that, however, because gravitationally bound states of a monatomic fluid are genuinely unstable in space dimensions higher than three and collapse immediately into black holes. ...
... Although a negative cosmological constant could stabilize the fluid spheres, it decelerates the expansion of the universe on large scales. This leads to another issue, the anthropic bound on the cosmological constant [6,92,93], even though its falsifiability has been criticized [94,95]. In (3 + 1) dimensions, the anthropic bound on a positive cosmological constant [96] argues that it cannot be very large, or the universe would expand too fast for galaxies or stars (or us) to form. ...
Article
Full-text available
In this note, I derive the Chandrasekhar instability of a fluid sphere in (N + 1)-dimensional Schwarzschild–Tangherlini spacetime and take the homogeneous (uniform energy density) solution for illustration. Qualitatively, a positive (negative) cosmological constant tends to destabilize (stabilize) the sphere. In the absence of a cosmological constant, the privileged position of (3 + 1)-dimensional spacetime is manifest in its own right: it is the marginal dimensionality in which a monatomic ideal fluid sphere is stable, yet not so stable that the onset of gravitational collapse cannot be triggered. Furthermore, it is the unique dimensionality that can accommodate stable hydrostatic equilibrium with a positive cosmological constant. However, given the currently observed cosmological constant, no stable configuration can be larger than 10^21 M⊙. On the other hand, in (2 + 1) dimensions a fluid disk is too stable to collapse, whether in Newtonian Gravity (NG) or in Einstein's General Relativity (GR). In GR, a negative cosmological constant is crucial not only to guarantee fluid equilibrium (monotonically decreasing pressure) but also to admit the Bañados–Teitelboim–Zanelli (BTZ) black hole solution. Owing to the negativeness of the cosmological constant, there is no unstable configuration for a homogeneous fluid disk with mass 0 < M ≤ 0.5 that could collapse into a naked singularity, which supports the Cosmic Censorship Conjecture. However, the relativistic instability can be triggered for a homogeneous disk with mass 0.5 < M ≲ 0.518 under the causal limit, which implies that BTZ black holes of mass M_BTZ > 0 could emerge from collapsing fluid disks under proper conditions. The implicit assumptions and implications are also discussed.
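For orientation, here is a back-of-the-envelope Newtonian version of the dimensional argument (my own sketch, with prefactors suppressed; the paper's actual derivation is fully relativistic). E(R) is the total energy of a self-gravitating sphere of radius R with adiabatic index γ in D spatial dimensions:

```latex
% Illustrative Newtonian scaling only; not the paper's relativistic treatment.
E(R) \sim a\,R^{-D(\gamma-1)} - b\,R^{-(D-2)}, \qquad a,b > 0,
\quad\Longrightarrow\quad
\gamma_{\mathrm{crit}} = \frac{2(D-1)}{D},
\qquad
\gamma_{\mathrm{monatomic}} = \frac{D+2}{D}.
% D = 3:  5/3 > 4/3  (stable, but relativistic corrections can trigger collapse);
% D = 4:  3/2 = 3/2  (marginal);   D > 4:  gamma_monatomic < gamma_crit (unstable).
```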
... This led to substantial work, including the development of Dirac's later [1969/74] cosmology and other alternative cosmologies, modifications of GTR, and empirical tests. See [Ray, 2007/2019], [Solà, 2015], and [Barrow, 1986] for good reviews that capture the main historical development up to their point in time. 1 In fact several physicists had speculated about the concept of law-like relations before Dirac, notably [Weyl, 1917, 1919], [Eddington, 1931], and [Milne, 1935]. We may refer to the generic hypothesis of "Large Number Relations" (LNR's) as the proposal that there are lawlike relations of some kind between the dimensionless numbers, not necessarily those in Dirac's early LNH, but that was what drew the most attention to them. ...
... See [Ray, 2007/2019], [Solà, 2015], and [Barrow, 1986] for good reviews that capture the main historical development up to their point in time. 1 In fact several physicists had speculated about the concept of law-like relations before Dirac, notably [Weyl, 1917, 1919], [Eddington, 1931], and [Milne, 1935]. We may refer to the generic hypothesis of "Large Number Relations" (LNR's) as the proposal that there are lawlike relations of some kind between the dimensionless numbers, not necessarily those in Dirac's early LNH, but that was what drew the most attention to them. ...
... Dirac's own theories were abandoned in mainstream cosmology because of conflicts with experimental data, and the majority opinion among physicists is probably dismissive of LNR's, if they are aware of the concept at all. 1 [Ray, 2019] gives an extensive review and bibliography of the subject, with about 138 references; the earlier (unpublished) 2007 version has 121 references. This appears to be the broadest literature review as of 2007, and probably up to 2019. ...
Article
Full-text available
In his [1937, 1938], Paul Dirac proposed his “Large Number Hypothesis” (LNH), as a speculative law, based upon what we will call the “Large Number Coincidences” (LNC’s), which are essentially “coincidences” in the ratios of about six large dimensionless numbers in physics. Dirac’s LNH postulates that these numerical coincidences reflect a deeper set of law-like relations, pointing to a revolutionary theory of cosmology. This led to substantial work, including the development of Dirac’s later [1969/74] cosmology, and other alternative cosmologies, such as Brans-Dicke modifications of GTR, and to extensive empirical tests. We may refer to the generic hypothesis of “Large Number Relations” (LNR’s), as the proposal that there are lawlike relations of some kind between the dimensionless numbers, not necessarily those proposed in Dirac’s early LNH. Such relations would have a profound effect on our concepts of physics, but they remain shrouded in mystery. Although Dirac’s specific proposals for LNR theories have been largely rejected, the subject retains interest, especially among cosmologists seeking to test possible variations in fundamental constants, and to explain dark energy or the cosmological constant. In the first two sections here we briefly summarise the basic concepts of LNR’s. We then introduce an alternative LNR theory, using a systematic formalism to express variable transformations between conventional measurement variables and the true variables of the theory. We demonstrate some consistency results and review the evidence for changes in the gravitational constant G. The theory adopted in the strongest tests of Ġ/G, by the Lunar Laser Ranging (LLR) experiments, assumes: Ġ/G = 3(dr/dt)/r – 2(dP/dt)/P – (dm/dt)/m, as a fundamental relationship. Experimental measurements show the RHS to be close to zero, so it is inferred that significant changes in G are ruled out. However when the relation is derived in our alternative theory it gives: Ġ/G = 3(dr/dt)/r – 2(dP/dt)/P – (dm/dt)/m – (dR/dt)/R. The extra final term (which is the Hubble constant) is not taken into account in conventional derivations. This means the LLR experiments are consistent with our LNR theory (and others), and they do not really test for a changing value of G at all. This failure to transform predictions of LNR theories correctly is a serious conceptual flaw in current experiment and theory.
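To see why the extra (dR/dt)/R term matters numerically, here is a rough order-of-magnitude check using my own assumed figures (H0 ≈ 70 km/s/Mpc, and ~1e-13 per year as the typical level of reported LLR bounds on Ġ/G); neither number is taken from the abstract above:

```python
# Rough order-of-magnitude check (assumed numbers, not the paper's):
# how large is the omitted Hubble term (dR/dt)/R = H0 compared with the
# ~1e-13 per year level at which LLR analyses typically bound Gdot/G?
KM_PER_MPC = 3.0857e19       # kilometres per megaparsec
SECONDS_PER_YEAR = 3.156e7

H0_km_s_Mpc = 70.0           # assumed Hubble constant
H0_per_yr = H0_km_s_Mpc / KM_PER_MPC * SECONDS_PER_YEAR   # ~7e-11 per year

llr_bound_per_yr = 1e-13     # assumed order of magnitude of LLR |Gdot/G| bounds

print(f"H0 ≈ {H0_per_yr:.1e} per year")
print(f"H0 / (LLR bound) ≈ {H0_per_yr / llr_bound_per_yr:.0f}")
```

If these rough figures are right, the omitted term is a few hundred times larger than the bound itself, which is the point the abstract is making.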
... Barrow & Tipler [15] point out that, for Euclidean space R^N, it is only for N = 4 that the number of differentiable structures is uncountable. More precisely, producing a different proof of Donaldson's results, Freedman & Uhlenbeck [44] note for dimension 4 a result of Quinn's that "all compact 4-manifolds are almost smooth, that is, smoothable in the complement of any point", while "in lower dimensions any topological manifold admits a unique smooth structure (up to diffeomorphism) and in higher dimensions smooth structures correspond to reductions of the tangent bundle". ...
... The consistency of the universe is a 'key issue' ([7], 2.5), and [15] even conclude that the universe is logically closed and complete. Would such completeness, however, be compatible with Gödel's incompleteness theorem? ...
... Barrow & Tipler [15] interpret their model as saying that "the universe, which is defined as everything that actually exists, is equal to all logically consistent possibilities 1". This emphasizes the major issue of the logically consistent completeness of such models, or more generally of modeling the evolution of the universe toward some predefined end or boundary apart from... a mere consistency requirement. In that sense, a recursively enumerated process (type 0 language), such as proposed herein, should do better than a less powerful, always terminating type 1 process (algorithm), questioning theories such as Misner's [15], whereby a homogeneous and isotropic structure derives from chaos, or a Weyl-curvature-filled ending universe 'à la Penrose', or even the now observed 'all scale structure' joyfully mixing extremes ranging from cosmic black holes to cosmic voids, as discussed in sections 4 (observational references) and 5 (discussion) of the present paper. ...
Article
Full-text available
... If this is correct, we will never be able to explain our physical laws in the same way we explain other features of the universe, such as the size of an atom, through relating them to more fundamental quantities [3]. In this scenario, many of the universe's peculiar features and behaviors are instead understood by an alternative type of explanation known as the anthropic principle [4,5]. By invoking this, we start with the tautology that we can only exist in universes capable of supporting complex life, and use ensuing selection effects to explain otherwise puzzling aspects of our universe. ...
... This equates to just 156 keV for our values 5 . The flip side of such marginal stability is the fact that in our universe, carbon-14 decays with extremely tiny reaction energy. ...
... This consideration only practically affects regions of parameter space where β is 300 times smaller, and so can be safely disregarded. 5 Application of the SEMF in this case actually predicts this reaction to be marginally allowed in our universe, a consequence of the actual value being below the SEMF's typical accuracy threshold. To correct for this, we set the additive multiple of the proton mass to correspond to the value we infer from the reverse reaction. ...
Article
Full-text available
We investigate the dependence of elemental abundances on physical constants, and the implications this has for the distribution of complex life for various proposed habitability criteria. We consider three main sources of abundance variation: differing supernova rates, alpha burning in massive stars, and isotopic stability, and how each affects the metal-to-rock ratio and the abundances of carbon, oxygen, nitrogen, phosphorus, sulfur, silicon, magnesium, and iron. Our analysis leads to several predictions for which habitability criteria are correct by determining which ones make our observations of the physical constants, as well as a few other observed features of our universe, most likely. Our results indicate that carbon-rich or carbon-poor planets are uninhabitable, slightly magnesium-rich planets are habitable, and life does not depend on nitrogen abundance too sensitively. We also find suggestive but inconclusive evidence that metal-rich planets and phosphorus-poor planets are habitable. These predictions can then be checked by probing regions of our universe that closely resemble normal environments in other universes. If any of these predictions are found to be wrong, the multiverse scenario would predict that the majority of observers are born in universes differing substantially from ours, and so can be ruled out, to varying degrees of statistical significance.
... According to FT, the constants in the laws of nature and/or the boundary conditions in the standard models of physics must belong to intervals of low probability in order for life to exist. Since its inception, FT has generated a great deal of fascination, as seen in multiple popular-science books (e.g., [18–21]) and scientific articles (e.g., [22–25]). For a given constant of nature X, the connection between FT and active information can be described in three steps: ...
... Hence, if P = P_{θt}, then X = X_t corresponds to observing the Markov chain at time t, under the alternative hypothesis H_1 in (3). Some basic properties of the corresponding actinfo are summarized in the following proposition, which is proved in Section 7: Proposition 2. Suppose that X = X_t is obtained by iterating t times a Markov chain with initial distribution (17) and transition kernel (18). The actinfo then equals ...
... The theory of phase-type distributions can then be used to compute the target probability P_{θts}(A) in (23) [39,40]. To this end, clump all states x ∈ A into one absorbing state, and decompose the transition kernel in (18) according to ...
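The clump-and-decompose step described in the snippet above can be illustrated with a toy finite chain (my own example; the kernels (17)–(18) and the target probability in (23) refer to the paper's equations, which are not reproduced here):

```python
import numpy as np

# Toy illustration of the absorbing-state trick: probability that a finite
# Markov chain has entered a target set A within t steps.
P = np.array([[0.90, 0.08, 0.02],   # toy 3-state transition kernel
              [0.10, 0.85, 0.05],
              [0.00, 0.00, 1.00]])  # state 2 plays the role of the clumped set A
pi0 = np.array([1.0, 0.0, 0.0])     # start in state 0

B = [0, 1]                          # states outside A
Q = P[np.ix_(B, B)]                 # sub-stochastic kernel on the complement of A

t = 25
survive = pi0[B] @ np.linalg.matrix_power(Q, t) @ np.ones(len(B))
print("P(reach A within", t, "steps) ≈", 1.0 - survive)
```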
Article
Full-text available
A general framework is introduced to estimate how much external information has been infused into a search algorithm, the so-called active information. This is rephrased as a test of fine-tuning, where tuning corresponds to the amount of pre-specified knowledge that the algorithm makes use of in order to reach a certain target. A function f quantifies specificity for each possible outcome x of a search, so that the target of the algorithm is a set of highly specified states, whereas fine-tuning occurs if it is much more likely for the algorithm to reach the target as intended than by chance. The distribution of a random outcome X of the algorithm involves a parameter θ that quantifies how much background information has been infused. A simple choice of this parameter is to use θf in order to exponentially tilt the distribution of the outcome of the search algorithm under the null distribution of no tuning, so that an exponential family of distributions is obtained. Such algorithms are obtained by iterating a Metropolis–Hastings type of Markov chain, which makes it possible to compute their active information under the equilibrium and non-equilibrium of the Markov chain, with or without stopping when the targeted set of fine-tuned states has been reached. Other choices of tuning parameters θ are discussed as well. Nonparametric and parametric estimators of active information and tests of fine-tuning are developed when repeated and independent outcomes of the algorithm are available. The theory is illustrated with examples from cosmology, student learning, reinforcement learning, a Moran type model of population genetics, and evolutionary programming.
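A minimal sketch of the tilting idea, assuming a finite outcome space with a uniform null distribution (my own toy numbers, not the paper's algorithm): the null distribution p0 is exponentially tilted by θf, and the active information is the log ratio of target probabilities under the tilted and null distributions.

```python
import numpy as np

# Exponential tilting of a null distribution by theta*f, and the resulting
# active information for a target set of highly specified outcomes.
n = 100
f = np.linspace(0.0, 1.0, n)          # specificity of each outcome x
p0 = np.full(n, 1.0 / n)              # null distribution: no tuning

theta = 8.0                           # amount of infused background information
p_theta = p0 * np.exp(theta * f)      # exponential tilting
p_theta /= p_theta.sum()

A = f >= 0.9                          # target: highly specified outcomes
actinfo = np.log(p_theta[A].sum() / p0[A].sum())
print(f"active information ≈ {actinfo:.2f} nats")
```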
... According to FT, the constants in the laws of nature and/or the boundary conditions in the standard models of physics must belong to intervals of low probability in order for life to exist. Since its inception, FT has generated a great deal of fascination, as seen in multiple popular-science books (e.g., [18–21]) and scientific articles (e.g., [22–25]). For a given constant of nature X, the connection between FT and active information can be described in three steps: ...
... Proposition 2. Suppose X = X_t is obtained by iterating t times a Markov chain with initial distribution (17) and transition kernel (18). The actinfo then equals ...
... The theory of phase-type distributions can then be used to compute the target probability P_{θts}(A) in (23) [39,40]. To this end, clump all states x ∈ A into one absorbing state, and decompose the transition kernel in (18) according to ...
Preprint
Full-text available
A general framework is introduced to estimate how much external information has been infused into a search algorithm, the so-called active information. This is rephrased as a test of fine-tuning, where tuning corresponds to the amount of pre-specified knowledge that the algorithm makes use of in order to reach a certain target. A function f quantifies specificity for each possible outcome x of a search, so that the target of the algorithm is a set of highly specified states, whereas fine-tuning occurs if it is much more likely for the algorithm to reach the target than by chance. The distribution of a random outcome X of the algorithm involves a parameter θ that quantifies how much background information has been infused. A simple choice of this parameter is to use θf in order to exponentially tilt the distribution of the outcome of the search algorithm under the null distribution of no tuning, so that an exponential family of distributions is obtained. Such algorithms are obtained by iterating a Metropolis–Hastings type of Markov chain, and this makes it possible to compute their active information under equilibrium and non-equilibrium of the Markov chain, with or without stopping when the targeted set of fine-tuned states has been reached. Other choices of tuning parameters θ are discussed as well. Nonparametric and parametric estimators of active information and tests of fine-tuning are developed when repeated and independent outcomes of the algorithm are available. The theory is illustrated with examples from cosmology, student learning, reinforcement learning, a Moran type model of population genetics, and evolutionary programming.
... Testing the predictions of HTUM against the cosmological principle may provide crucial insights into the model's validity and the fundamental assumptions underlying modern cosmology. • Anthropic Principle: The cyclical nature of HTUM and the potential for multiple universes within the toroidal structure may have implications for the anthropic principle, which attempts to explain the apparent fine-tuning of the universe for the emergence of life and consciousness [47][48][49]. The model's framework may provide a natural explanation for the existence of a universe with the necessary conditions for the development of complex structures and intelligent life without relying on the controversial notion of a multiverse or the fine-tuning of initial conditions. ...
... The Ultimate Fate of the Universe: HTUM also offers a novel perspective on the universe's ultimate fate. Instead of a linear progression towards heat death or a cyclical pattern of expansion and contraction, HTUM suggests that the universe's evolution is a continuous process of transformation and self-actualization [47]. This implies that the universe's fate is not predetermined but influenced by the dynamic interplay between conscious agents and the underlying singularity [368]. ...
Preprint
Full-text available
The Hyper-Torus Universe Model (HTUM) is a novel framework that unifies quantum mechanics, cosmology, and consciousness, proposing that the universe is a higher-dimensional hyper-torus containing all possible states of existence. This paper explores the fundamental concepts and implications of HTUM, which suggests that the universe is a quantum system in which all possible outcomes are inherently connected, with consciousness playing a crucial role in actualizing reality. HTUM addresses critical challenges in modern physics, such as the nature of quantum entanglement, the origin of the universe, and the relationship between mind and matter. By introducing concepts like singularity, quantum entanglement at a cosmic scale, and the self-actualization of the universe, HTUM provides a comprehensive framework for understanding the fundamental nature of reality. This paper discusses the mathematical formulation of HTUM, its implications for quantum mechanics and cosmology, and its potential to bridge the gap between science and philosophy. HTUM represents a significant shift in our understanding of the universe and our place within it, inviting further research and exploration into the nature of reality and consciousness.
... In the contemporary status of the search program (e.g., the DARWIN space infrared interferometer project), the following categories of extra-solar planets are described: definite planets (20), possible planets (8), microlensed planets (5), borderline planets (2), dust clump planets (7), and pulsar planets (4); the number in parentheses denotes the number of planets. 1 It is well known that a habitable zone, the Ecosphere, exists around the Sun. Within the Sun's Ecosphere lie Venus, Earth, and Mars. ...
... For it is well known that grand unified theories allow very sharp limits to be placed on the possible values of the fine structure constant in a cognizable universe. The possibility of doing physics on the background space-time at the unification energy, together with the existence of stars made of protons and neutrons, constrains α to the niche [4]: ...
Research Proposal
Full-text available
Following the coincidence A × atomic year ∼ Earth year (in seconds), where A is the Avogadro number, atomic year = a_B/(αc), a_B is the Bohr radius, α is the fine structure constant, and c is the speed of light, and considering the "niche" for α, i.e., 180^{-1} ≤ α ≤ 85^{-1}, the Ecosphere radius is calculated.
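A quick numerical check of the quoted coincidence, using rounded textbook values for the constants (my own numbers; the "niche" for α is the proposal's). The product agrees with an Earth year to within a factor of about two, i.e., at the order-of-magnitude level:

```python
# Order-of-magnitude check of A * (atomic year) vs. one Earth year.
A      = 6.022e23          # Avogadro number
a_B    = 5.292e-11         # Bohr radius, m
alpha  = 1.0 / 137.036     # fine structure constant
c      = 2.998e8           # speed of light, m/s
year_s = 3.156e7           # Earth year, s

atomic_year = a_B / (alpha * c)                          # ~2.4e-17 s
print("A * atomic year ≈ %.2e s" % (A * atomic_year))    # ~1.5e7 s
print("Earth year      ≈ %.2e s" % year_s)               # same order of magnitude
```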
... with ρ_Λ ∈ ℝ being the vacuum energy density, and ρ_φ a variable density of a dynamical dark energy component, which in the simplest case acts as a certain fundamental scalar field φ. Using (1), Vilenkin and Garriga have proposed an elegant solution to the problem of the small observable cosmological constant which not only fits the observations (see [8–10]) but also naturally solves the adjacent problem of cosmic coincidences. For details, we refer the interested reader to their article 3, but we will point out one important fact: the approach of Vilenkin and Garriga implies the existence of at least one global scalar field, which depends only on time (or, to be more precise, has negligibly small spatial gradients). ...
... Finally, we have left open an interesting question about the relationship between the cosmological and thermodynamic arrows of time. The brightest example of this is the stars 9. From the common point of view, stars are the places where the thermodynamic gradient is particularly steep, so the temporal asymmetry of radiation can be considered a consequence of the thermodynamic asymmetry. ...
Article
Full-text available
We demonstrate that the cosmological arrow of time is the cause for the arrow of time associated with the retarded radiation. This implies that the proposed mathematical model serves to confirm the hypothesis of Gold and Wheeler that the stars radiate light instead of consuming it only because the universe is expanding—just like the darkness of the night sky is a side-effect of the global cosmological expansion.
... However, there are no observational or experimental tests to give us any assurance that an inflation epoch really left measurable effects. The explanation for flatness may be the anthropic principle [1], that intelligent life would only arise in those patches of the universe with Ω very close to 1; another explanation could be that space is precisely flat, so that K = 0 now and always. Guth's monopoles may be explained by inflation, or the physics may be such that they never existed in appreciable abundances. ...
Preprint
The paradigm of ΛCDM cosmology works impressively well, and with the concept of inflation it explains the universe after the time of decoupling. However, there are still a few concerns: after much effort there is no detection of dark matter, and there are significant problems in the theoretical description of dark energy. We will consider a variant of the cosmological spherical shell model within the FRW formalism and will compare it with the standard ΛCDM model. We will show that our new topological model satisfies the cosmological principles and is consistent with all observable data, but that it may require a new interpretation of some data. We will also consider the constraints imposed on the model by the supernova luminosity distances and the CMB data, for instance on the allowed size and thickness of the shell. In this model the propagation of light is confined along the shell, which has the consequence that the observed CMB originated from one point or a limited region of space. This allows the uniformity of the CMB to be interpreted without an inflation scenario. In addition, it removes any constraints on the uniformity of the universe at the early stage and opens the possibility that the universe was not uniform and that the creation of galaxies and large structures is due to inhomogeneities that originated in the Big Bang.
... This function serves as a hypothetical mapping metric for relative time dilation values across the universe, dynamically influenced by changing mass and velocities [6,7]. ...
Preprint
Full-text available
This paper proposes a theoretical framework for understanding time dilation across the universe as a dynamic, evolving time map. We construct a time map that changes with cosmic evolution by associating each point in space with a relative time dilation value, determined by gravitational and velocity influences. We introduce the concept of time entropy as a measure of the diversity of time experiences across space, which increases as the universe expands and mass redistributes. This framework also suggests a multilayered reality of time-dependent observations, where specific phenomena may only be observable within certain time zones, creating a zone-limited cosmic landscape. The implications of this theory span observational astrophysics, cosmology, and theoretical physics.
... If we feign ignorance and assume a uniform Bayesian prior on the state of the Universe, then we effectively treat the Universe as if it were at heat death. By the second law of thermodynamics, we can never return to a state of nonignorance, as every observation would be suspected of being a random "Boltzmann brain" fluctuation [33,80,81]. As explained by Wolpert and Kipper [12], this issue is closely related to formal impossibility results in the theory of inductive inference [82,83]. ...
Article
Full-text available
Why do we remember the past, and plan the future? We introduce a toy model in which to investigate emergent time asymmetries: the causal multibaker maps. These are reversible discrete-time dynamical systems with configurable causal interactions. Imposing a suitable initial condition or “Past Hypothesis”, and then coarse-graining, yields a Pearlean locally causal structure. While it is more common to speculate that the other arrows of time arise from the thermodynamic arrow, our model instead takes the causal arrow as fundamental. From it, we obtain the thermodynamic and epistemic arrows of time. The epistemic arrow concerns records, which we define to be systems that encode the state of another system at another time, regardless of the latter system’s dynamics. Such records exist of the past, but not of the future. We close with informal discussions of the evolutionary and agential arrows of time, and their relevance to decision theory.
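As a reminder of the general mechanism at work here (a reversible map, a coarse-graining, and a low-entropy "Past Hypothesis" together yielding a growing entropy), the sketch below uses the classic baker's map rather than the authors' causal multibaker construction; it is an illustration of the genre, not of the paper's model.

```python
import numpy as np

# Reversible baker's map + coarse-graining + low-entropy initial condition:
# the coarse-grained Shannon entropy grows even though the dynamics is a bijection.
rng = np.random.default_rng(1)
pts = rng.random((20000, 2)) * np.array([0.25, 0.25])   # start in one small corner

def baker(p):
    x, y = p[:, 0], p[:, 1]
    return np.column_stack([(2 * x) % 1.0, (y + np.floor(2 * x)) / 2.0])

def coarse_entropy(p, bins=8):
    h, _, _ = np.histogram2d(p[:, 0], p[:, 1], bins=bins, range=[[0, 1], [0, 1]])
    q = h.flatten() / h.sum()
    q = q[q > 0]
    return float(-(q * np.log(q)).sum())

for t in range(8):
    print(t, round(coarse_entropy(pts), 3))   # grows toward log(64) ≈ 4.16
    pts = baker(pts)
```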
... The anthropic principle [102,103] can explain the cosmological constant problems and answer the questions: "Why is the energy density of the cosmological constant so small?" and "Why has the accelerated expansion of the universe started recently?" According to the anthropic principle, the energy density of the cosmological constant observed today should be suitable for the evolution of intelligent beings in the universe [92,[104][105][106]. ...
Article
Full-text available
Scalar field ϕCDM models provide an alternative to the standard ΛCDM paradigm, while being physically better motivated. Dynamical scalar field ϕCDM models are divided into two classes: the quintessence (minimally and non-minimally interacting with gravity) and phantom models. These models explain the phenomenology of late-time dark energy. In these models, energy density and pressure are time-dependent functions under the assumption that the scalar field is described by the ideal barotropic fluid model. As a consequence of this, the equation of state parameter of the ϕCDM models is also a time-dependent function. The interaction between dark energy and dark matter, namely their transformation into each other, is considered in the interacting dark energy models. The evolution of the universe from the inflationary epoch to the present dark energy epoch is investigated in quintessential inflation models, in which a single scalar field plays a role of both the inflaton field at the inflationary epoch and of the quintessence scalar field at the present epoch. We start with an overview of the motivation behind these classes of models, the basic mathematical formalism, and the different classes of models. We then present a compilation of recent results of applying different observational probes to constraining ϕCDM model parameters. Over the last two decades, the precision of observational data has increased immensely, leading to ever tighter constraints. A combination of the recent measurements favors the spatially flat ΛCDM model but a large class of ϕCDM models is still not ruled out.
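For reference, these are the standard minimally coupled scalar-field expressions behind the time-dependent energy density, pressure, and equation-of-state parameter mentioned above (textbook form, quoted here only as a reminder, not specific to any one φCDM model):

```latex
\rho_\phi = \tfrac{1}{2}\dot\phi^2 + V(\phi), \qquad
p_\phi   = \tfrac{1}{2}\dot\phi^2 - V(\phi), \qquad
w_\phi(t) = \frac{p_\phi}{\rho_\phi}
          = \frac{\tfrac{1}{2}\dot\phi^2 - V(\phi)}{\tfrac{1}{2}\dot\phi^2 + V(\phi)} .
% Quintessence: canonical kinetic term, so w >= -1;
% phantom models flip the sign of the kinetic term, allowing w < -1.
```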
... Historically, biology has included teleology in one of two ways: externalist teleology (where teleology outside the organism plays a defining role) or internalist teleology (where the focus is on the purposes of the organism itself). Interestingly, while modern biology generally excludes external teleology from its causal toolkit, it can now be found as an undercurrent within physics in the question of cosmological fine-tuning (Barrow and Tipler 1998). While less favored in biology, discussions about biological fine-tuning have started to appear (Carr and Rees 2003; Bialek and Setayeshgar 2005; Thorvaldsen and Hössjer 2020), sometimes even connecting an externalist teleological framework of physics to the emergence of life as we know it (Morris 2004; Barrow et al. 2007). ...
Article
Full-text available
Teleological causes have been generally disfavored in biological explanations because they have been thought to lack rigor or to act as stand-ins for non-teleological processes which are simply not yet understood sufficiently. Teleological explanations in biology have been limited to only teleonomic causes, which are teleological causes that are due to codes or similarly reified mechanisms. However, advances in the conceptualization of teleological and teleonomic causation have allowed for more quantitative analyses of both. Additionally, although teleonomy has been historically excluded from potential causes of evolution, new research has shown that teleonomy actually plays a significant role in evolution. Combining these advances with advances in computability theory and information theory has allowed for a more rigorous and quantitative analysis of the capabilities and limitations of teleonomy in evolution.
... The third "line of attack/defence" by physicists is books for the general public that explain big philosophical questions. Apart from the already mentioned writers, we add such authors as Piergiorgio Odifreddi (mathematician) and Antonino Zichichi (experimental nuclear physicist at CERN) [16] in Italy, John Barrow (professor of mathematical physics) in the UK and Frank Tipler (mathematical physicist) in the USA [17], Michał Heller (priest and professor of cosmology) in Poland [18], and many others. Their books have gained popularity thanks to their interdisciplinary (and philosophical) approach. ...
Article
Full-text available
The global availability of information makes its selection difficult, but at the same time it allows for the construction of teaching without the particular prior knowledge of students. However, it requires teachers to learn new abilities, such as developing a much broader coverage of the subject, explanations of illy solutions, and knowledge of different ways of thinking and the mental needs of pupils (pedagogical knowledge contents). We show examples of such teaching in physics in several quite different environments: from school classes to workshops for 3–4-year-old children, interactive lectures for children’s universities, ad hoc explanations in science museums for secondary school students, to public lectures in didactics at international congresses. Every specific environment requires different approaches, but the contents may remain similar: innovative, constructivist, and interactive approaches assure a successful outcome in any didactical situation.
... These issues are studied by cosmology together with cosmogony, stellar astrophysics, and planetology. A detailed description and explanation of the reasons for and nature of the structuring of the Universe are extremely important for the analysis of past and future terrestrial life, including the fate of mankind, and in connection with the anthropic principle [10]. ...
Article
Full-text available
We analyze the development of the natural sciences according to the scheme: “… reality – attributes – reality – attributes”. Any material reality is interpreted as a carrier of attributes manifested in a certain range of its experimental study and at a certain theoretical level of data understanding. More sophisticated experiments and refined theories lead to a more detailed and correct vision of realities and their attributes. The latter initiates new experiments and creation of new theories. These cognitive processes lead to the discovery of new material realities with their additional or refined attributes compared to those previously known. The objectivity and relative truth of scientific statements about the realities under study and their attributes are based on a qualitative or quantitative correspondence between theoretically calculated values of attributes and their experimentally measured values. We emphasize that the isolation and adequate historical and philosophical analysis of those cycles requires professional knowledge of the analyzed complex scientific material and cannot be implemented within the framework of oversimplified ideas about science and the role, which theories play in it. Examples of such ideas are the understanding of theories solely in terms of their refutation and confirmation or even the replacement of theories with fuzzy sociological concepts such as a paradigm and sociocultural determination of the scientific results. Those views often become the source of idealistic, irrational, and postmodernist interpretations of science and its history. Key words: Material realities and their attributes, attributes and their values, experiments, theories, qualitative and quantitative correspondence between calculated and measured values of attributes
... Thus, the anthropic principle is extended to the economic field on the assumption that human survival is linked to the planet's resources and therefore to its ability to dispose of the waste produced by our presence on the planet (Barrow & Tipler, 1986). ...
Article
Full-text available
Abstract: The Sustainability in Economics as a Key for Future Prospects. There must be a balance position, a point where a potential and possible harmony can be established between humankind and Nature in terms of development. Human development along the economic path is an absolute and primary need, because growth also means well-being and solidarity, but above all a good growth gives a boost to the progress of science and research. Over time, science will lead us to an absolute compatibility with the environment, and in the meantime we are offered the possibility of minimizing the damage inflicted on Mother Nature. To correct those damages, the business cycle inserts recession into the phases of development. The recession, essentially a periodic pause, is necessary to allow nature to digest the excess waste produced by human activities. Moreover, there is the possibility of a serious alteration in the development process, during which we can witness monetary alterations (inflation and deflation), which are nothing more than messages sent by nature to the community. The collectivity must in fact realize that the economic train has derailed from the right and linear path. With these premises, we can maintain that the magical balance that brings the values of optimal development and those of the cost of living close to 2% is a weak approximation of the lasting and constant balance that the economy would achieve in the event, still far off, of a realized neutrality of our presence on the planet. Even while waiting to achieve the stability of our relationship with nature, we can therefore maintain a condition of relative stability, which consists of a linear growth limited to 2%–2.50% on an annual basis, long-lasting if at the same time the principle of the least possible damage to the environment is respected. A magic combination, therefore, that may see the convergence of economic development to a constant value close to 2%, which ensures the potential stability of the relationship with the natural world. This constant value close to 2%, indicated by the central banks as the equilibrium condition of the economy, may be considered a universal constant like the great universal constants of physics and matter. Finally, how to explain the accelerated pace that some territories have shown over the centuries, most recently China? Some indications tell us that an accelerated pace of development may not be a symptom of good government and a good relationship with Nature.
... The fluctuation theorem (Evans & Searles 1994), however, shows that small low-entropy universes are exponentially more likely to exist than large ones and, coupled to the anthropic hypothesis, one concludes that the Universe does not need to be this large for sentient beings to exist. One may therefore safely conclude that the anthropic principle is not a factor in this argument (Carter 1973;Barrow 1988). The Boltzmann equilibrium hypothesis turns out to be no more 'natural' than the highly fine-tuned initial conditions required for inflation to work. ...
Article
Full-text available
Modern cosmology is broadly based on the Cosmological principle, which assumes homogeneity and isotropy as its foundational pillars. Thus, there isn't much debate about the metric (i.e., Friedmann-Lemaître-Robertson-Walker; FLRW) one should use to describe the cosmic spacetime. But Einstein's equations do not unilaterally constrain the constituents in the cosmic fluid, which directly determine the expansion factor appearing in the metric coefficients. As its name suggests, ΛCDM posits that the energy density is dominated by a blend of dark energy (typically a cosmological constant, Λ), cold dark matter (and a 'contamination' of baryonic matter) and radiation. Many would assert that we have now reached the age of 'precision' cosmology, in which measurements are made merely to refine the excessively large number of free parameters characterizing its empirical underpinnings. But this mantra glosses over a growing body of embarrassingly significant failings, not just 'tension' as is sometimes described, as if to somehow imply that a resolution will eventually be found. In this paper, we take a candid look at some of the most glaring conflicts between the standard model, the observations, and several foundational principles in quantum mechanics, general relativity and particle physics. One cannot avoid the conclusion that the standard model needs a complete overhaul in order to survive.
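For orientation, the standard FLRW relation through which the assumed constituents fix the expansion history (textbook form, in units with c = 1; quoted here only as a reminder of the point made above, that the metric itself does not dictate the mix of constituents):

```latex
H^2(t) \equiv \left(\frac{\dot a}{a}\right)^{\!2}
 = \frac{8\pi G}{3}\left(\rho_{\rm r} + \rho_{\rm m} + \rho_\Lambda\right)
   - \frac{k}{a^2}.
% Lambda-CDM places the dark-energy contribution in rho_Lambda = Lambda / (8 pi G);
% other choices of constituents give other expansion histories a(t).
```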
... In 2018, a group of four physicists advanced a controversial conjecture implying that no such Universe exists [20]. The very small window of Λ values that allows life to emerge led some cosmologists to propose the anthropic argument for the existence of a small cosmological constant [21][22][23][24]. ...
Preprint
Full-text available
Following a previous article [1], in this paper we investigate a version of the Standard Model (SM) algebroid with the anchor map depending on the Electric Charge Swap (ECS) angle θs. We find that many SM algebras depend on θs. We call these ECSM algebras. Furthermore, the SM algebroid is integrable to the SM groupoid; our results, therefore, potentially extend well beyond this case. We then investigate how the massive ECS particle can be derived from the breaking of the symmetry of the SM groupoid. We find that the ECS particle mass is related to the SM particle mass through the ECS angle θs. We investigate the finite subgroups of the ECS Möbius transformations. The ECS angle θs could originate from the ECS dihedral group that refers to the symmetry of the Particle polygon (P-gon). This angle can then be determined through the multi-triangulation of a convex particle P-gon. Finally, we find that, at loop level, the ECS physics differs from the SM physics, and the ECSM mass is suppressed by the particle Catalan numbers C_P. For 24 fermions [1] and 6 vector gauge bosons, the calculated one-loop radiative correction to the bare cosmological constant Λ0 is 10^{-47} GeV^4, very close to the experimental value.
... Other constants, such as the Planck constant, the speed of light c, the electron charge e, etc., are the same in all i copies of GR theory. The resulting overall picture could be the theoretical basis of a restricted type of anthropic principle [94–105] in a universe where gravity is governed by the GR laws. ...
Preprint
In this paper we discuss the phenomenological footprints of theories where the gravitational effects are due not only to spacetime curvature but also to nonmetricity. These theories are characterized by gauge invariance. Due to their simplicity, here we focus on theories with vectorial nonmetricity. We place special emphasis on gradient nonmetricity theories, which are based on Weyl integrable geometry (WIG) spaces. While arbitrary and vectorial nonmetricities may have played a role in the quantum epoch, gradient nonmetricity can instead be important for the description of gravitational phenomena in our classical world. This would entail that gauge symmetry may be an actual symmetry of our past, present and future universe, without conflict with the standard model of particles (SMP). We show that, in a gauge invariant world modeled by WIG spacetime, the vacuum energy density is a dynamical quantity, so that the cosmological constant problem (CCP) may be avoided. Besides, due to gauge invariance, and to the fact that photons and radiation do not interact with nonmetricity, the accelerated pace of cosmic expansion can be explained without the need for dark energy. We also discuss the "many-worlds" interpretation of the resulting gauge invariant framework, where general relativity (GR) is just a specific gauge of the theory. The unavoidable discrepancy between the present value of the Hubble parameter computed on the GR basis and its value according to the gauge invariant theory may explain the Hubble tension issue. It will also be shown that, due to gauge freedom, inflation is not required in order to explain the flatness, horizon and relic particle abundance problems within the present framework.
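For readers unfamiliar with the terminology, these are the standard definitions being invoked (sign conventions differ between papers; this is my paraphrase of common usage, not a quotation from the preprint):

```latex
Q_{\alpha\mu\nu} \equiv \nabla_\alpha g_{\mu\nu}
 \quad \text{(nonmetricity tensor)}, \qquad
Q_{\alpha\mu\nu} = Q_\alpha\, g_{\mu\nu}
 \quad \text{(vectorial, Weyl-type nonmetricity)}, \qquad
Q_\alpha = \partial_\alpha \phi
 \quad \text{(gradient case: Weyl integrable geometry)}.
```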
... [Thanks to an anonymous referee for suggesting this clarification.] 5 This effect often, but not always, goes by the name "the (weak) anthropic principle" [Barrow and Tipler (1986); Leslie (1989); Bostrom (2002)]. 6 Manson [(2020), pp. ...
Article
Full-text available
The multiverse hypothesis is one of the leading proposed explanations of cosmic fine‐tuning for life. One common objection to the multiverse hypothesis is that, even if it were true, it would not explain why this universe, our universe, is fine‐tuned for life. To think it would so explain is allegedly to commit “the inverse gambler's fallacy.” This paper presents what the inverse gambler's fallacy is supposed to be, then surveys the discussion of it in the philosophical literature of the last 35 years.
... However, this does not invalidate SETI searches of any kind; indeed, it requires them, in a Popperian sense, to attempt to disprove the hypothesis (Ellery, 2003). The self-replicating machine concept has profound implications: (i) for the long-term growth of humanity as a cosmological phenomenon (Barrow and Tipler, 1998), in transitioning from a Kardashev type 1 civilization (which consumes energy on a planetary scale) to a type 3 civilization (which consumes energy on a galactic scale) (Kardashev, 1997); (ii) for our relationship with the artificial intelligence required for a self-replicating machine (Bostrom, 2014); (iii) for the implications of self-replication technology for the non-existence of extraterrestrial intelligence and in providing humanity with first-mover advantages in interstellar exploration (Ellery, 2019b). These aspects are more speculative but nevertheless suggest that the self-replicating machine is the ultimate machine, affording unchallenged cosmological power to the human species over the longest term. ...
Article
Full-text available
In the early 1980s, the Sagan–Tipler debate raged regarding the interpretation of the Fermi paradox, but no clear winner emerged. Sagan favoured the existence of ETI on the basis of the Copernican principle and Tipler favoured the non-existence of ETI on the basis of the Occam's razor principle. Tipler's stance was an expansion of the similar but earlier Hart declaration. However, crucial to the Tipler argument was the role played by self-replicating interstellar robot probes. Any technologically capable species will develop self-replication technology as the most economical means of exploring space and the Galaxy as a whole with minimal investment. There is no evidence of such probes in our solar system, including the asteroid belt; ergo, ETI do not exist. This is a powerful and cogent argument. Counter-arguments have been weak, including Sagan's sociological explanations. We present a Copernican argument that ETI do not exist – humans are developing self-replication technology today. We are developing the ability to 3D print entire robotic machines from extraterrestrial resources, including electric motors and electronics, as part of a general in-situ resource utilization (ISRU) capability. We have 3D-printed electric motors which can potentially be leveraged from extraterrestrial material that should be available in every star system. From a similar range of materials, we have identified a means to 3D print neural network circuitry. From our industrial ecology, self-replicating machines and indeed universal constructors are feasible. We describe in some detail how a self-replicating interstellar spacecraft may be constructed from asteroidal resources. We describe technological signatures of the processing of asteroidal material (which is expected to be common to most star systems), and the excess production of certain types of clay and other detritus materials. Self-replication technology is under development and imminent – if humans are pursuing self-replication technology, then by the Copernican principle, so would any technologically savvy species elsewhere. There is no evidence that they have.
Article
Full-text available
The end of the second millennium brought a worldwide explosion of novels depicting the end of human civilization and the possible realities that might follow, a phenomenon from which Spain was not excluded. Unsurprisingly, these concerns were particularly present in science fiction (SF), a genre devoted in large part to imagining futures. Unlike the end-of-millennium (post-)apocalyptic SF of the Anglophone world or of other Spanish-speaking countries, Spanish SF shows a greater presence of religion, which is usually represented in an institutionalized, hierarchical, and hegemonic form. This presence is almost paradoxical in a genre based on science (albeit speculative and prospective science), and it inevitably gives rise to a conflict (implicit or explicit) between a spiritual worldview and a materialist one. Three novels that follow this pattern are analyzed, Punto Omega (2001) by Enrique del Barco, Mentes de noche y hielo (2001) by Eduardo Vaquerizo, and Tiempo prestado (2005) by José Miguel Pallarés and Amadeo Garrigós, through the lens of three historical narratives traditionally accepted as the official and definitive explanation of Spanish historical reality: the "Two Spains", "Spain is different" (exceptionalism), and the founding myth of Spanish democracy, the Transition. Interweaving the two contradictory worldviews serves cognitively to interrogate these metanarratives in order to better understand the history and present of end-of-millennium Spain, and to sublimate the still unresolved trauma of the Civil War and Francoism, a trauma silenced and repressed in the name of the Transition to democracy.
Preprint
Full-text available
Cosmologists have long sought to uncover the deepest truths of the universe, from the origins of the cosmos to the nature of dark matter and dark energy. However, what if the universe itself is designed to prevent such understanding? This paper presents the metaphor of the "falling elevator" as a conceptual trap for cosmologists, where the pursuit of knowledge is systematically thwarted by the very structure of reality. By exploring mechanisms like relativistic illusions, changing physical constants, fractal space-time, dimensional entanglement, cosmic censorship, observer-dependent realities, and recursive simulations, we illustrate how the universe might be fundamentally unknowable. In this scenario, cosmologists are trapped in a perpetual loop of incomplete discoveries and paradoxical observations, where every breakthrough only reveals deeper layers of complexity. The paper reflects on the philosophical implications of this thought experiment, questioning whether certain truths about the universe are inherently beyond human comprehension. Keywords: cosmology, simulation hypothesis, relativistic illusions, fractal space-time, dimensional entanglement, cosmic censorship, observer effect, quantum mechanics, recursive simulations, limits of knowledge, simulation, multiverse, quantum uncertainty, dark matter, dark energy, philosophical cosmology.
Article
In my quest to unravel the origins of the universe, I have delved deeply into the concept of singularity. This research paper presents my hypothesis that singularity is not merely an infinitely small point of infinite density but rather the source from which all dimensions emanate. By re-evaluating our understanding of dimensions and proposing frequency as a distinct dimension, I aim to provide a new perspective on the interconnectedness of energy, matter, and consciousness.
Chapter
What I call the attunement thesis says that all the formal and natural sciences are grounded in the metaphysics of weak or counterfactual transcendental idealism, and that the primary source of epistemic evidence for this grounding is our essentially non-conceptual, disinterested, and pure aesthetic experience of natural beauty. In Sects. 15.2 and 15.3, I argue (i) that the attunement thesis is true, and (ii) that unless the formal sciences and the natural sciences—principally, logic, mathematics, physics, and biology—(ii.a) were to acknowledge the truth of the attunement thesis and also (ii.b) were to incorporate the attunement thesis into their respective repertoires of foundational a priori principles, then those sciences would be metaphysically alienated from manifest natural reality, with nothing for those sciences to be either meaningful about or true of. Then in Sect. 15.4, building on the attunement thesis, I argue for a second thesis I call cosmic dignitarianism.
Chapter
In this chapter, I argue that contemporary physics, understood as a natural science that’s committed to the Standard Models of cosmology and particle physics, can explain itself only if it supplements the Standard Models by affirming a philosophically defensible and scientifically respectable version of The Anthropic Principle that’s also equivalent with a suitably weak version of Kant’s metaphysics of transcendental idealism.
Article
Full-text available
The field of Patristics, or early Christian and Mediaeval Studies, traditionally works along the lines of historical and literary criticism. But this method is not always useful, especially when it comes to complex objects and circumstances. No wonder the current trend of replacing it, more often than not, by interdisciplinary frameworks. The article begins accordingly by reviewing three interdisciplinary frameworks, namely, the "socio-historical method", "Deep Time", and archaeological theorist Roland Fletcher's "transitions", highlighting their suitability for a comprehensive approach to Patristic cosmology. Here, cosmology should not be taken in the narrow sense of contemporary science. It means both a way of representing reality (a worldview) and a way of inhabiting the world. The present article analyses the evolution of the early Christian and mediaeval perception of the environment and the cosmos in Greek sources, pointing to successive transitions from apprehension (cosmophobia) to a keen interest in understanding nature to the thought that holiness represents a universe-(re)making agency. It addresses relevant historical and social circumstances, but proposes that the above transitions were triggered by internal or existential factors as well, and not only external, thus complementing Fletcher's outline, which focuses upon external catalysts, such as economy and technology.
Chapter
The impact of the relationship between the theory of systems and the theory of complexity on some of the general philosophical questions is discussed. The special accent in this analysis is on the problem of the possible limits in complexity, and how this statement could change many of the aspects of the philosophy of science and the philosophy of art.
Conference Paper
Full-text available
Biosignature anomalies are crucial in the search for extraterrestrial life on exoplanets and planets similar to Earth for several reasons. 1. Detection of Life: Biosignature anomalies can indicate the presence of life or life-supporting conditions. These anomalies could be chemical signatures like unusual concentrations of gases or substances that are typically associated with biological processes. 2. Understanding Habitability: Anomalies in biosignatures provide insights into the habitability of a planet or exoplanet. They can reveal conditions that might support life or indicate environments where life could potentially thrive. 3. Differentiating Abiotic and Biotic Processes: Biosignature anomalies help scientists distinguish between abiotic (non-biological) and biotic (biological) processes. This differentiation is crucial in confirming whether a planet has life or not. 4. Evolutionary Insights: Anomalies in biosignatures can also provide clues about the evolutionary history of a planet or exoplanet. They can indicate past or present conditions that are conducive to life's emergence and development. 5. Targeting Future Exploration: Identifying biosignature anomalies helps in prioritizing targets for future exploration missions. It allows scientists to focus on areas with a higher likelihood of finding signs of life. 6. Planetary Protection: Understanding biosignature anomalies is important for planetary protection protocols. It helps in assessing the potential for contamination and the need to safeguard both Earth and other celestial bodies from harmful biological agents. Overall, biosignature anomalies play a crucial role in expanding our understanding of life beyond Earth and guiding our search for extraterrestrial life in diverse planetary environments.
Article
The article takes up for closer scrutiny the concept of universal human history, i.e., the history of Homo sapiens since the very beginning of the species around 70 thousand years ago. Already here the key characteristic of Y. N. Harari's attempt to write a concise history surfaces: it is a history of an animal called Sapiens, ordered in the sequence of three major historical revolutions: cognitive, agricultural and scientific. The first, 70 thousand years ago, sparked off our history and consisted in creating a language; the next, some 10 thousand years ago, enabled our rapid civilizational development; and the last has been in operation since about the year 1500. The author critically assesses human achievements, especially in terms of long-term interference in natural and global processes, and rejects any claims to a privileged position for humans in nature.
Article
Full-text available
In his works on ecological philosophy, Bruno Latour develops an interesting ontology. He proposes a new worldview in which religion is reinterpreted in view of a Gaian philosophy. He extends ‘pluralism’ beyond the anthropocentrism that dominates modern humanism. In his book Facing Gaia, Latour includes nonhuman beings in a larger community and works towards a larger concept of eco-humanism. In this paper, I try to reconstruct his position by showing that the philosophical foundation of his interpretation of ontology is to be classified as a form of new materialism. This new interpretation of materialism has postmodernist origins (inspired by Gilles Deleuze), but it is not identical to postmodernism, because Latour explicitly distances himself from it. He wants to contribute to a ‘positive’ ontology. My point is that Latour’s materialist grounding of ontology, which he elaborates in order to make a religious pluralism possible, obstructs any foundation of transcendence and ultimately blocks a pluralistic ecumene, because it renounces the idea of the ‘whole’ and a unitary principle of being. His ideas on eco-humanism and a pluralistic ecumene could gain momentum if we opted for a more holistic and idealistic way of thinking. In my last section I show how this is possible: objective idealism and panentheism are conceived as models that belong together and can offer a viable alternative to modern versions of materialism.
Chapter
In this short chapter, I will discuss applications of typicality beyond statistical mechanics and probability theory. On the one hand, this will emphasize the wide scope and philosophical potential of typicality. On the other hand, the appeal to probabilistic concepts is very dubious in the following examples, so they should help to further clarify the distinction between typicality and probability.
Chapter
This chapter explores the intersection of Anthropocene narratives with science-based cosmological creation stories. Grand narratives like the Universe Story offer a universal, global story that distills contemporary science into a true myth of cosmic evolution that unites humanity and bonds us with the natural world. In their diagnosis of our current global crisis, and in prophetic claims about Earth’s future, cosmic narratives parallel Anthropocene visions of a future Earth wisely managed by humans. This alignment is no coincidence, given that the Anthropocene concept shares elements of its history and development with ideas and intellectual figures that are foundational to the Universe Story. On both accounts, human evolution culminates in a cosmic turning point, when humans begin to take conscious control of planetary processes. This chapter assesses whether such narratives provide useful guidance in the current crisis, with emphasis on the problem of human-centeredness and the ethical potential of wonder.
Preprint
Full-text available
The universe, in its vastness and complexity, harbors numerous mysteries that have perplexed humanity for millennia. Among the most enigmatic of these phenomena are black holes, regions of spacetime where gravity pulls so much that not even light can escape. As we delve deeper into the study of these cosmic entities, we are compelled to confront not just their physical nature but also their philosophical implications. This paper presents an audacious proposition: could some black holes serve as tragic signposts of advanced civilizations encountering the Great Filter, a hypothetical phase in their evolution where they face a high risk of self-destruction? Exploring the interplay between black hole physics, the trajectory of technological advancement, and the Fermi Paradox, we embark on an interdisciplinary journey that oscillates between the empirical and the speculative, the tangible and the profound. Keywords: black holes, Great Filter, Fermi Paradox, cosmic engineering, extraterrestrial civilizations, high-energy physics, technological trajectory, existential risks, cosmic phenomena, gravitational singularities, interdisciplinary exploration, philosophy of science, cosmological implications, Kardashev scale, cosmic signposts.
Chapter
In the preceding chapters, several aspects of genomics were deconstructed using the formalism of involuted manifolds. It was demonstrated that several hitherto unknown features of genomic architecture could be explicated and formalized using involutive algebras. As a culmination of these insights into the nature of genomic architecture, we formalize a topological modular framework of genomes consisting of formal higher-dimensional units, christened the “GENOTOPE.” In this chapter, we give an evolutionary and functional outline of this model. The model necessarily involves a higher-dimensional manifold, which runs contrary to the four-dimensional spacetime in which biological evolution has taken place. We therefore discuss why these extra dimensions are required and why they are connected with one another through an operator of involution. A topological framework of natural selection using this operator of involution is presented. We also discuss why genomes should exist both as ecosystems and as units of selection.
Chapter
Critical cosmopolitan orientation has usually been embedded in a non-geocentric physical (NGP) cosmology, providing a contrast to the underpinnings of centric cosmologies, which see the world as revolving around a particular observer, theorist, or identity. NGP cosmology makes it plausible to envisage all humans as part of the same species. The connection works also through homology and analogy. An astronomic theory can be isomorphic with a political theory. However, the normative implications of NGP cosmology are ambiguous. Various reactions have encouraged territorial nationalism and geopolitics. I suggest that critical cosmopolitical orientation should now be grounded in the notion of cosmic evolution, which is not only contextual, historical, pluralist, and open-ended but also suggests that humanity is not a mere accident of the cosmos.
Article
The question in the title of this essay seems far-fetched, but it is a way to look at the preconditions for social security at a very general level. Primitive single-cell organisms are probably relatively common in the galaxy, but it is highly unlikely that such life has developed to the stage of intelligent life on any other planet besides Earth. If such intelligent aliens do exist, they are likely to need some sort of social security. It is uncertain, however, whether they are able to realise the social preconditions for the development of social security. In particular, it is difficult to answer the question of whether they will create nation-states with sufficient control over their populations.
Article
Full-text available
Links: https://pfk.qom.ac.ir/?lang=en https://pfk.qom.ac.ir/article_2417.html?lang=en https://pfk.qom.ac.ir/article_2417_a12a5059dfe54e9fb3410447d9c0a3c2.pdf
Article
The categories of conflict, independence, dialogue, and integration have been used to organize the wealth of opinion on the relationship of science and religion (Barbour, 1997). This approach is especially useful in college or high school science courses, or special seminars, because it allows the instructor to locate his or her own opinion, and challenge students to determine their own beliefs. This effectively lays most opinions “on the table” for display in a free market of ideas. Using this approach, students can examine the strengths and shortcomings of major opinions, and it has the advantage of preserving the dignity of all religious views that reside in the science classroom, without compromising any science. The wealth and depth of new ideas that have been forthcoming from both science and theology offer the exciting promise of future wisdom that can serve both religion and science.
Article
Full-text available
The theoretical perspective on the universe is a most intriguing subject of discussion. Human curiosity has led people to wonder about the origin and form of the universe, and the attempt to find those answers brought the idea of the parallel universe to light. This study has been undertaken to investigate and gain insight into some models of parallel universes. We describe the models that are most fascinating. The inquiry is not whether the multiverse really exists, but, if it does, what such universes are like and where they might be.
Article
Full-text available
In this article, I describe the cosmic context of biological evolution and its concretization in the form of various formulations of the anthropic principles (weak, strong, probabilistic, final, and participatory), in order to move on to a discussion of selected philosophical and theological consequences of the anthropic character of the Universe. The aim of the article is to answer the question of whether the anthropic character of the Universe makes it possible to justify theses of an ontological, epistemological, and theological nature.
Chapter
The dynamic regularities of living things still demand a language of ends and a theory of power, form, and function to go with it. Ancient and Medieval formulations of souls with final causes failed, but something new had to take their place. Modern biologists embraced methodological naturalism, local adaptation, and blind chance. Though they frequently overlap, they are conceptually and epistemically distinct, requiring distinct defenses. Each places a necessary, though not sufficient, constraint on teleology within modern biology. Methodological naturalism excludes a priori agents that do not act with lawful regularity. Local adaptation reflects an a posteriori discovery that evolution is neither progressive nor linear, though consistent changes may occur under consistent conditions. Blind chance describes the sufficiency of genes and natural selection to explain evolution without invoking prospect, will, or other mental teleologies. Modern discussions of bioteleology can be improved with a recognition that the three constraints represent a Venn diagram of overlapping debates, each with its own set of explanatory and historical issues. Keywords: Chance, Evolution, Modern synthesis, Naturalism, Progress, Teleology
Article
Full-text available
In this paper the authors provide an epistemological view on the old controversy between randomness and necessity. It has been held that either the one or the other forms part of the structure of reality. Chance and indeterminism are nothing but a disorderly efficiency of contingency in the production of events, phenomena, and processes, that is, in their causality in the broadest sense of the word. Such production may be observed in natural and artificial processes or in human social processes (in history, economics, society, politics, etc.). Here we touch the object par excellence of all scientific research, whether natural or human. In this work, a hypothesis is presented whose practical result satisfies the Hegelian dialectic, with the consequent implication of their mutual and reciprocal integration: producing abstractions, without which there is no thought or knowledge of any kind, from the concrete, that is, from the real problem, which in this case is a given Ontological System or Reality.
Preprint
Full-text available
In 1972, at a symposium celebrating the 70th birthday of Paul Dirac, John Wheeler proclaimed that "the framework falls down for everything that one has ever called a law of physics". Responsible for this "breakage [...] among the laws of physics" was the general theory of relativity, more specifically its prediction of massive stars gravitationally collapsing to "black holes", a term Wheeler himself had made popular some years earlier. In our paper, we investigate how Wheeler reached the conclusion that gravitational collapse calls into question the lawfulness of physics and how, subsequently, he tried to develop a new worldview, rethinking in his own way the lessons of quantum mechanics as well as drawing inspiration from other disciplines, not least biology.
Preprint
Full-text available
This Working Paper argues that nowadays the material scientific basis for knowledge and science in general is biology. This, however, does not entail any kind of reductionism. Simply stated, we must know about nature. This WP is an invitation to see nature as the rationale for any discipline or theory.