
# Quantum Cosmology - Science topic

Explore the latest questions and answers in Quantum Cosmology, and find Quantum Cosmology experts.

Questions related to Quantum Cosmology

Cosmological explanations for our apparently fine-tuned universe are basically divided between a) a vastly huge multiverse of universes with varying fundamental force and mass constants, including the cosmological constant (where our apparently fine-tuned universe is just one universe in this multiverse), or b) a cosmic intelligence that fine-tuned our universe at its beginning to evolve stable galaxies, life and developed minds. In scientific terms, which explanation is preferable? Are there other options? Is a cosmic mind a viable scientific hypothesis for explaining our universe's origin?

At the Planck scale, the physical gluon acquires a third degree of freedom in the form of a scalar potential when the unphysical ghost-particle effect disappears at the Gribov horizon. Relativity then demands Lorentz-FitzGerald contraction of the Planck scale at the gluon's light speed. But that would mean the sudden demise of quantum theory. In order to unite relativity with quantum theory, the physical gluon's speed instead reduces to zero in every inertial frame, and it accordingly exhibits the mass-gap property. For more details, please refer to my preprint: http://dx.doi.org/10.13140/RG.2.2.25092.65926

If the same quantum particle can apparently be observed simultaneously in different places, could those places in fact be one and the same: a clue pointing toward a holographic theory of the Universe?

Dear Colleagues,

I am a liaison (informal) at my university between science and the arts. I have family in planetary astronomy but this is far afield.

LINK to VIDEO: https://news.harvard.edu/gazette/story/2020/01/largest-gaseous-structure-ever-seen-in-our-galaxy-is-discovered/

A question or two:

What does this newly-reported Radcliffe Wave of gaseous proto-stars tell us about how our galaxy originated?

Is there any chance that this wave will make some difference in our own sun's behavior?

Following the fringe concepts of the Global Consciousness Project directed by Roger D. Nelson, we propose that the REG (Random Event Generator) device be replaced by the NRCL (Non Repeatable Code Lifetime) generator to enhance the articulation of statistical anomalies possibly caused by nonlinear temporal processes. Such phenomena are succinctly described as processes of "cause preceded by effect".

**Time Particle**

Time particles would be particles produced by an acquired technology, equipped with the force behind structure, symmetry and patterns, and responsible for producing an event under a given design. They could perform specific work as event-specific particles and would be capable of managing the conduct of an event.

They might be found through:

1. A natural particle responsible for time, if any exists close to the stated purpose

2. Manipulation of some inborn capability or setting in natural particles

3. Some particle designed for the stated purpose.

**Event-Control Tools & Techniques (ECTT) and the ability to change time**

With Event-Control Tools & Techniques (ECTT), we could change the trend line of time, which would also show that time is not a physical, permanently fixed entity. Some close examples of event control via electric charges and magnetic flux might help, with reference to the cosmos under a universal management system.

**What are possible approaches to Event-Control Tools & Techniques (ECTT) through particles, waves, and forces?**

What do you suggest?

Thanks

In an old paper in IJTP, Wim B. Drees argues against Hartle and Hawking's interpretation that the wave function of the universe implies a probability for the universe to appear from nothing. What do you think?

To get into the topic, let us consider two alternatives. If we think that the photon dies out when absorbed, then there is not much to talk about. If we consider the second alternative, however, let us point out that in losing its energy the photon becomes unobservable, since our senses, as well as all our apparatus, need an energy transfer to achieve any detection.

So, if photons do survive after being absorbed, they thus become ghost photons, i.e. invisible. Evidently this is problematic. But let us not dismiss it so fast.

Let us make an imperfect analogy between a photon and a spring. If the spring vibrates, it has oscillatory energy. If it transfers its oscillatory energy to an external material, it loses that energy, but the spring is still there; it has not disappeared. If you see the photon as an oscillator, the analogy makes some sense.

Let us now address a still more controversial issue. Suppose that if the spring is not stressed it has no strain mass. But if it is vibrating, it then has energy without having mass, and by analogy this applies to the photon.

Now consider the case of a stressed spring that is vibrating. It then has both mass and energy. Again, by analogy, this applies to massive elementary particles.

Why should we appeal to very complicated models and theories? Is it really worth it?

Those interested in this viewpoint and willing to go deeper into this issue may read the paper: “Space, this great unknown”, available at: https://www.researchgate.net/publication/301585930_Space_this_great_unknown


What additional quantum number does the antiparticle that annihilates it carry? The sign of its Ricci scalar curvature in non-Euclidean space-time?

Quantum gravity

Assume we have two identical clocks made on Earth. One of them is sent to another corner of the universe (I mean very far away). The clocks start to tick simultaneously (perhaps impossible in practice, but let us assume it is possible for now). Do both of them then measure the same value of time, or is there a possibility that one of them runs slower or faster than the other? Assume that the two clocks are stationary relative to each other and that space-time is flat, not curved, for both of them.

Laboratory observation of a freely spinning steel ball apparently reveals rotational drag as predicted by James C. Keith:

*"It is as if the entire reaction force on the universe, a universe which cannot itself react to forces and torques inertially, acted back on the freely spinning mass system, causing a real slowing down."* (cf. page 11 of the appended reference)

If the cosmos was created out of nothingness in the Big Bang, what determined the size (or scale) of fundamental particles, like the proton and the electron? In the past people thought that God determined their scale, but this is not a scientific answer. So, what may have determined the scale of material existence?

Erik Verlinde said that his emergent gravity is constructed using insights from string theory, black hole physics and quantum information theory (all theories that are themselves struggling for breath).

**Our appreciation to Verlinde for his daring step of constructing emergent gravity on these moribund theories.** We loudly take inspiration from him!

GTR does not predict a bifurcation in the gravitational wave at the chirp. Is there something wrong with GTR?

Where is the border in space that strictly separates the expansion of space itself from expansion within space?

Professor Michael Longo (University of Michigan, Ann Arbor) and Professor Lior Shamir (Lawrence Technological University) have shown from experimental data that there is an asymmetry between right- and left-handed spiral galaxies. Its value is about 7%. In the article:

ROTATING SPACE OF THE UNIVERSE, AS A SOURCE OF DARK ENERGY AND DARK MATTER

it is shown that the source of dark matter can be the kinetic energy of rotation of the space of the observed Universe. At the same time, the contribution of the Coriolis force is 6.8%, or about 7%. The closeness of the measured asymmetry between right- and left-handed spiral galaxies to the calculated contribution of the Coriolis force to the kinetic energy of rotation of the space of the observable Universe is strong indirect evidence (from experimental data!) that the space of the observed Universe rotates.

This question opens a new gate to thinking about the geometric results of Fermat's Last Theorem and their relation to quantum mechanics, the curvature of space-time, special and general relativity, and other issues in physics, and in general about our whole universe. All ideas, opinions, questions, links, papers and work reports will be acknowledged.

We say a finite elastic surface is expanding when points on the surface move apart over time. A moving object of constant speed will therefore take longer to traverse the distance between two known points on the surface than it did at an earlier time. We read, and are fascinated by, theories about cosmology and the universe. One such theory holds that the universe has no boundaries and is expanding, sometimes at a constant rate, other times faster than we thought. Its expansion is inferred from the observation that the separation between known cosmological objects increases over time.

Earth and the other planets of our solar system are cosmological objects which should obey the same law and display similar behavior: the time these objects take to traverse a cosmological curve on their natural path of rotation or revolution should be longer than it was some cosmological time ago, unless their speeds of revolution or rotation always change accordingly so that the duration remains the same. Therefore the length of time Earth takes to complete one revolution around the Sun, which we call one year or 365 days, would have to change, while the rotation period may remain the same, since rotation appears invariant under the expansion of the universe unless Earth itself grows in size. My question is:

Has it been observed that Earth's period of revolution has increased beyond 365 days, so that we must redefine the year? What is really changing and what is not, and which behaviors are affected by these changes? Is such a theory justified by the empirical and unchanging evidence we encounter?

I appreciate your ideas.

Best regards,

Dejenie Alemayehu Lakew

The tests of GPS systems and the Hafele-Keating experiments suggest that an Earth-centred position is stationary.

After all, we do not alter Earth's clock; we alter the satellite clock. There is no reciprocity.

The new postulate is that all stable gravitational bodies are stable with regard to the background of dark energy (which is how they came to be in stable orbits), and are stationary observers.

Can we treat this as a new postulate: instead of all observers being equal as in STR, all stable gravitational objects are stationary? If not, what are the potential objections?

The uncertainty principle is said to forbid the existence of electrons in the nucleus as constituents; they are formed just prior to or during beta decay.

The energy the uncertainty principle predicts for a nuclear electron is in the range of 10 MeV or more, while the observed maximum beta energies are about 1.1 to 1.3 MeV for free-neutron decay, and perhaps ~5-6 MeV for some nuclei. The closeness of these values renders the argument rather weak, since the electron would spend some energy overcoming the attractive Coulomb potential of the nucleus and might come out with less energy even if it originally had the energy required by the uncertainty principle.
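A quick way to see where the "10 MeV or more" figure comes from is to saturate the uncertainty relation Δp Δx ~ ħ for an electron confined to nuclear dimensions. A minimal sketch (the 5 fm confinement radius is an assumed illustrative value, not taken from the question):

```python
# Order-of-magnitude check of the confinement argument,
# using hbar*c = 197.327 MeV*fm and m_e*c^2 = 0.511 MeV.
import math

HBAR_C = 197.327   # MeV * fm
ME_C2 = 0.511      # electron rest energy, MeV

def min_electron_energy(radius_fm):
    """Relativistic energy of an electron whose momentum saturates
    Delta p ~ hbar / Delta x for confinement radius Delta x (in fm)."""
    pc = HBAR_C / radius_fm             # MeV, from Delta p * Delta x ~ hbar
    return math.sqrt(pc**2 + ME_C2**2)  # MeV

# A medium-size nucleus, radius ~ 5 fm (assumed illustrative value):
E = min_electron_energy(5.0)
print(f"minimum energy ~ {E:.1f} MeV")  # ~ 39.5 MeV, far above observed beta energies
```

Even for a generous 10 fm confinement length the estimate stays near 20 MeV, which is the gap the question's Coulomb-potential caveat is trying to close.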

An article in Nature, "Undecidability of the spectral gap" (arXiv:1502.04573 [quant-ph]), shows that determining the spectral gap from a complete quantum-level description of a material is undecidable (in the Turing sense). No matter how completely we can describe a material analytically at the microscopic level, we cannot predict its macroscopic behavior: the problem is uncomputable, as no algorithm can determine the spectral gap. Even if there were a way to make a prediction, we could not determine what the prediction is, since for a given program there is no general method to determine whether it halts.

Does this result eliminate, once and for all, the possibility of a theory of everything based on fundamental physics? Is quantum physics undecidable? Is this an epistemic result proving that undecidability places a limit on our knowledge of the world?

There are different views on this issue. Newton believed that the distinguished reference frame was associated with absolute space and time. When a body is accelerated relative to such a reference frame, the phenomenon of inertia arises as the body's resistance to the accelerating force. According to Mach, the distinguished reference frame is associated with the distribution of matter in the Universe. According to Einstein, the distinguished reference frame does not exist. In my opinion, the distinguished reference frames can be identified as the reference frames in which the graviton fluxes at the current moment are spatially isotropic. We can arrive at this conclusion by studying Le Sage's theory of gravitation, the extended special theory of relativity, and the Lorentz-invariant theory of gravitation.

Fröhlich proposed that when proteins absorb a terahertz photon, the added energy forces the oscillating molecules into a single, lowest-frequency mode, which supports the Orch-OR theory of consciousness of Hameroff and Penrose. In contrast, other models predict that the protein will quickly dissipate the photon's energy as heat. Fröhlich's proposal was challenged by Reimers and coworkers in 2009, who argued that no amount of mechanical energy can produce a coherent Fröhlich condensate. But in a 2015 study, Katona and his colleagues concluded that the long-lasting structural changes observed in the helical structure of a lysozyme crystal could only be explained by Fröhlich condensation, a quantum-like collective state in which the molecules of a protein behave as one.

The Yukawa term involves a product of fermion and scalar fields, and in particular it contains the Higgs field. This means the Yukawa term should also appear beside the usual quadratic and quartic terms in the full Higgs potential, and thus should contribute to shifting the minimum of the full potential, affecting the vacuum that produces symmetry breaking. Why, then, is it never considered?

*According to Albert Einstein:

- The speed of light is constant in all inertial reference systems.
- The laws of physics are the same in all inertial reference systems.

Therefore we have a most profound and elegant formulation of space and time!

*According to quantum physics: there are five vague postulates. One of them, the Pauli exclusion principle, leads to the conclusion that the spin states of electrons must be simultaneously synchronized (instantaneously communicating with each other, with no time delay!).

My question is about the communication between these electrons:

- Does it break the speed of light?
- Do the mathematical axioms of quantum mechanics need a kind of artificial abstraction, like spooky action at a distance, to overcome their weakness?

Timothy Boyer asked a similar question and has published substantial papers explaining quantum results without the quantum hypothesis. I agree with Boyer, but he did not actually derive the quantum of action, i.e. h, or h divided by 2π. Even quantum cosmology has an expression for the action, though this seems doubtful, because that action can also be interpreted as time-dependent. So is Planck's constant invariable?

Dark energy is believed to be an invisible agent that behaves contrary to gravity: it drives matter apart at a certain rate rather than letting it attract. We can see this simply by observing that the separation of galactic objects increases as time passes.

Is k the particular parameter that enables the Sobolev space, or generalized Hilbert space, W^{k,2}(Ω) to expand? That is, is k the dark energy, where k ∈ ℕ ∪ {0}?

On the Wikipedia page about the principle of equivalence, it is written:

<< Einstein combined (postulated) the equivalence principle with special relativity to predict that clocks run at different rates in a gravitational potential, and light rays bend in a gravitational field, even before he developed the concept of curved spacetime. >>

In a Minkowski space, we know how to express that a trajectory is or is not a straight line. To do this it is enough to use the elements of the formalism.

But when an elevator (however small) is accelerated relative to a Minkowski space, what mathematics is used to state that, according to an experimenter inside the elevator, a trajectory is or is not a straight line?

Since it is taught that in the rigorous mathematics of general relativity (which studies accelerated motion) the expression "being in motion relative to a particular experimenter" is not part of the authorized vocabulary, for what exact size of the elevator can the experimenter inside consider himself to be in a portion of a Minkowski space?

When we imagine that any transformation defined from a small portion of a Minkowski space, which leaves invariant the speed of a light beam on its path, is inevitably a limitation of the group of Poincaré to the portion in question, the mind is forced to improvise particular objects in order to interpret reality and go beyond special relativity.

But when we establish that the transformations which are defined from a small portion of a Minkowski space, and which leave invariant the speed of a beam of light on its trajectory, are not limited to the restrictions of the Poincaré group and reveal solutions which are quantified in a certain sense, these solutions being susceptible to describe an accelerated motion, a question arises insistently :

In Einstein's accelerated elevator with transparent walls, would it be possible for the experimenter inside to notice that the speed of a beam of light is always constant and equal to the invariant of special relativity?

If the photon had mass, then Faraday's and Ampère's equations would pick up an additional term related to the photon mass. This term would give rise to the Hall effect exhibited by semiconductors when a magnetic field is applied. The photon mass would accordingly equal m = Iħ/(Qc²), where I is the current passing through the sample, Q is the total charge enclosed by the sample, and ħ is Planck's constant divided by 2π. Any challenges to measuring this?

This can be transformed into the relation m = ħv/(Wc²), where v is the electron drift velocity and W is the width of the sample.
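To get a feel for the magnitude involved, one can evaluate the proposed relation m = ħv/(Wc²) numerically. The drift velocity and sample width below are assumed illustrative values for a semiconductor Hall bar, not measurements:

```python
# Numerical estimate of the proposed photon mass m = hbar * v / (W * c^2),
# with illustrative (assumed) Hall-bar numbers.
HBAR = 1.054571817e-34   # J*s
C = 2.99792458e8         # m/s

def photon_mass(drift_velocity, width):
    """Photon mass implied by the proposed relation m = hbar*v/(W*c^2),
    with drift_velocity in m/s and sample width in m."""
    return HBAR * drift_velocity / (width * C**2)

# Assumed values: v ~ 1e4 m/s, W ~ 1 mm
m = photon_mass(1e4, 1e-3)
print(f"m ~ {m:.2e} kg")  # ~ 1.2e-44 kg
```

A value of this order would be many orders of magnitude above current experimental upper bounds on the photon mass, which is one immediate challenge for the proposal.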

In QFT we calculate quantities that ultimately depend on the way we normalize spinors; because in perturbation theory spinor propagators are free, their normalization can be chosen arbitrarily, so perturbative QFT computations give results that depend on an arbitrary normalization. It is customary to choose the spinor square scalar equal to 2m, and this gives the correct results; but in condensed-matter physics, for instance, the normalization would be chosen equal to the number of particles, which in QFT would mean choosing it equal to 1, and this would be grossly out of scale. Given that 2m seems correct but is not the only possibility, is there a way in which such a normalization could be justified a priori?

As the Sun ages, its total output rises. To maintain a constant energy input to the Earth, it will eventually become necessary to slowly increase Earth's mean orbital radius if it is to stay habitable. While engineering this shift is difficult but possible, a complication is that the Earth is in an 8:13 resonance with Venus. Is there any (free) tool available that could determine the stability of the planetary orbital configuration as a function of a varying Earth orbital radius?
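The premise can be made quantitative: keeping the solar flux L/(4πa²) constant requires a ∝ √L. A minimal sketch (the ~10% brightening per Gyr is a commonly quoted rule of thumb, used here as an assumption):

```python
# How far out must Earth drift to keep the solar flux constant?
# flux = L / (4 pi a^2) = const  =>  a ~ sqrt(L)
import math

def radius_for_constant_flux(a0, lum_ratio):
    """Orbital radius (same units as a0) keeping flux constant when the
    Sun's luminosity has grown by the factor lum_ratio."""
    return a0 * math.sqrt(lum_ratio)

# Assumed brightening of roughly 10% per Gyr (rule of thumb):
for gyr in (1, 2, 3):
    L = 1.0 + 0.1 * gyr
    print(f"after {gyr} Gyr: a ~ {radius_for_constant_flux(1.0, L):.3f} AU")
    # 1.049, 1.095, 1.140 AU
```

For the stability question itself, a free N-body package such as REBOUND could integrate the Earth-Venus system while the semi-major axis is slowly varied, though setting up the 8:13 resonance check is beyond this sketch.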

Hi everybody. I would like to plot this function in phase space:

W(x, p_x, y, p_y) = exp[-(x^2 + p_x^2) - (y^2 + p_y^2)] · LaguerreL[(n-1)/2, 2(x^2 + p_x^2)] · LaguerreL[(n-1)/2, 2(y^2 + p_y^2)]

Hello, I have one simple question about the double-slit experiment.

Is the interpretation of single-photon interference the same as that of single-electron interference, namely quantum mechanics?

There is also another explanation, that the photon can travel through natural wormholes.

I'm quite confused about this.

Thanks!

Is there a quantum origin of time asymmetry as manifested in thermodynamics?

How did the experimentalists at CERN provide us with the decay width-to-mass ratio of the 750 GeV resonance? As far as the observation is concerned, they only saw an excess in the number of events in two particular bins of diphoton invariant mass. Have they fitted some curve corresponding to a standard resonance shape to it? The ATLAS paper reported that the best fit was obtained for width/mass = 0.06, but arguments in favor of a narrow width are also seen in the literature. So what is the matter with the width?

The Schrödinger experiment (intended to illustrate what he thought was the implausibility of a half-alive, half-dead cat state function, but now taken seriously by many) is modified to examine whether physical processes collapse the wave function, or whether consciousness is required, as I understand von Neumann suspected.

The AI (artificial intelligence) is not assumed to be conscious; it is just a sophisticated but deterministic program, or expert system, with motors attached, robot-like. We assume from quantum-mechanical calculations that the room contains a state function which is a 50-50 superposition of live cat and dead cat. When we open the room we expect to find one of the following:

- Live cat, with AI having recorded an observation of opening the smaller box and finding a live cat.
- Dead cat, with AI having recorded an observation of opening the smaller box and finding a dead cat.

There is nothing to collapse the wave function until you and I open the box, according to von Neumann, as I understand him. The AI is a physical process, just as the cat's internal biological processes are physical, and if the cat itself doesn't collapse the wave function, neither can the AI.

However, notice that the AI has the same subjective experiences that we do. There is no cross-state mixing between the AI and the cat: the AI which found the live cat never mixes with the dead-cat state, and vice versa.

Then, in an interview, the AI will insist that it never found any contradiction to the notion that it collapsed the wave function, even though our mathematics informs us otherwise.

In loop quantum gravity the eigenvalues of the volume operator are discrete. Is it correct to conclude that, for fermions, the particle-number density is always finite (according to the Pauli exclusion principle)?

Atom optics has typically been done in recent years with gratings of light. These gratings successfully implement mirrors and beam splitters. Two counter-propagating laser beams cross one another, and in the crossing region fringes appear (see figure).

For the case when the beams have the same wavelength, it is no problem to calculate the grating constant: it is equal to 1/(2k cos θ), where θ is half the angle between the directions of propagation of the two beams.

However, "moving gratings" have recently appeared in the literature. That means the two laser beams differ slightly in wavelength. Such an arrangement is said to produce the effect of a moving grating.

How does one calculate the grating constant of this moving lattice, and with what velocity does it move with respect to the static emitting lasers?
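One way to see the answer: with wavevector magnitude k = 2π/λ and half-angle θ between the beam directions, the interference term is cos(Δk·r − Δω t), so the fringe period is Λ = 2π/(2k cos θ) = λ/(2 cos θ) (the quoted 1/(2k cos θ) up to the 2π convention), and the pattern drifts one period per beat cycle, v = Λ·Δf. A minimal sketch with illustrative (assumed) numbers:

```python
# Fringe period and drift velocity of a "moving" optical lattice formed by
# two beams of slightly different frequency crossing at half-angle theta.
# Interference term ~ cos(dk.r - dw*t)  =>  pattern speed v = dw/|dk| = period * df
import math

def lattice_period(wavelength, half_angle):
    """Fringe period = lambda / (2 cos theta) for this geometry (m, rad)."""
    return wavelength / (2 * math.cos(half_angle))

def lattice_velocity(wavelength, half_angle, freq_difference):
    """The pattern advances one period per beat cycle: v = period * df."""
    return lattice_period(wavelength, half_angle) * freq_difference

# Assumed illustrative numbers: 780 nm light, theta = 10 deg, df = 100 kHz
lam, theta, df = 780e-9, math.radians(10), 1e5
print(f"period   ~ {lattice_period(lam, theta)*1e9:.1f} nm")    # ~ 396.0 nm
print(f"velocity ~ {lattice_velocity(lam, theta, df)*1e3:.1f} mm/s")  # ~ 39.6 mm/s
```

For equal wavelengths (Δf = 0) the velocity vanishes and the static-grating result is recovered.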

As a theoretical cosmologist, the spectral index, the running of the spectral index, and the tensor-to-scalar ratio are some of the quantities one can easily calculate for a devised model. But it is very handy to plot them on top of the distribution that is usually obtained by running CosmoMC on Planck data. Can someone describe how these plots are obtained, and what is the quickest way to reproduce one such as fig. 9 in the attached link, without delving into the details of the Planck data analysis?
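These contour figures are typically built from the released MCMC chains (e.g. with the GetDist package). Absent the chains, here is a minimal sketch of the underlying recipe, histogram the posterior samples, find the density levels enclosing 68% and 95% of the probability, and overplot the model's prediction, using mock Gaussian samples whose moments and the plotted model point are purely illustrative assumptions:

```python
# Sketch of how n_s - r contour plots are obtained from posterior samples.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
ns = rng.normal(0.965, 0.004, 100_000)       # mock n_s samples (assumed moments)
r = np.abs(rng.normal(0.0, 0.03, 100_000))   # mock tensor-to-scalar samples

# 2D histogram as a crude posterior density estimate
H, xedges, yedges = np.histogram2d(ns, r, bins=60, density=True)

def enclosing_level(H, frac):
    """Density threshold whose super-level set encloses `frac` of the mass."""
    h = np.sort(H.ravel())[::-1]
    csum = np.cumsum(h)
    return h[np.searchsorted(csum, frac * csum[-1])]

lev95, lev68 = enclosing_level(H, 0.95), enclosing_level(H, 0.68)

xc = 0.5 * (xedges[:-1] + xedges[1:])
yc = 0.5 * (yedges[:-1] + yedges[1:])
plt.contour(xc, yc, H.T, levels=[lev95, lev68])  # outer 95%, inner 68% contour
plt.plot(0.967, 0.004, "r*", label="model prediction (illustrative)")
plt.xlabel(r"$n_s$")
plt.ylabel(r"$r$")
plt.legend()
plt.savefig("ns_r_contours.png")
```

With the real chains, one would instead load them via GetDist (`loadMCSamples` on the released chain root, then a 2D plot of the n_s and r parameters); the exact parameter names and file roots depend on the Planck release used.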

The harmonic coordinates, often used by Vladimir Fock when treating GRT, derive from the d'Alembertian, the Laplacian of Minkowski space. The wave equation for the electromagnetic field in vacuum is □A^μ = 0, where A^μ is the electromagnetic four-potential (vector plus scalar potentials). Should GR, in order to be in agreement with EM, use this kind of coordinates?

Since 3D graphics uses such coordinate transformations to handle the motion of 3D objects, and these are **very successful in modelling the proper deformation** of solid figures in motion, what is the implication and deep meaning of these coordinates?

The gravitational wave has now become a reality. It is analogous to the electromagnetic wave emitted by an accelerating charged object. What type of wave is it: a tensor, a vector or a scalar wave? With what speed does it travel? What fields does it comprise, and how are they related to each other? How does a tensor wave travel? Does it arise from dipole or quadrupole radiation? What is its frequency range? Does it interfere with light? What other physical properties does it carry?

Can we apply extreme conditions in order to create an artificial nucleus just made out of neutrons?

Different physicists disagree on whether there is such a thing as the wave function of the universe.

- In favor of its existence is the fact that, in the Big Bang picture, all particles (and hence downstream objects) were correlated at the inception of the Universe, and a correlation that has existed at some point in the past ever so loosely continues thereafter since full decoherence never truly sets in. A number of pictures - Ghirardi-Rimini-Weber, Bohm, even Hugh Everett, et al., - require the existence of the wave function of the universe, denoted Ψ(U).

- Two main categories of objections however belie its existence.

The first category ultimately boils down to a not very solid rejection of non-separability, i.e. to an argument that a full separation between an observer and an observed must always be upheld if any observation or measure is to be objectively valid, and a wave function ascertainable.

The second argument is more compelling: if Ψ exists, then Ψ(U) = Ψ(Ψ, U) in a closed, self-referential loop. Ψ thereby becomes non-observable and unknowable, and as such is better relegated to the realm of metaphysics than physics.

What say you?

Cosmic inflation is usually considered to have ended at about 10^-37 seconds after the initiation of the Big Bang. Given this, what is the meaning of the term "eternal inflation"?

We know that Classical Mechanics can be shown as special and approximated case of General Relativity and Quantum Mechanics.

Is the case similar here?

This question is addressed to researchers in the fields of cosmology and gravitation who are also experts in high-energy physics.

While scientific cosmology rarely occurs in the work of Karl Popper, it is nevertheless a subject that interested him. The problem now is whether the falsifiability criterion can be used for cosmological theories.

For instance, there are certain ideas in cosmology which have never been refuted, yet the same methods are used over and over despite their lack of observational support: for instance the multiverse idea (often used in string theory) and the Wheeler-DeWitt equation (often used in quantum cosmology).

So do you think Popperian falsifiability can be applied to cosmology as a science too? Your comments are welcome.

There are AIs using time-domain echo gratings for g measurements, so I was wondering if anyone happens to know whether there is any information on frequency-domain echo-grating AIs for g measurements. Is it even possible? Any published articles would be helpful.

In most inflationary models, or quintessence models, a scalar field is used to describe the dynamics. Why is this? I know that some models use vector fields, but if you want to use scalars, what are the arguments for doing so?

Does it imply that if the theory did not allow calculating values of the given quantity in reasonable time, then this theoretical quantity would not have a counterpart in physical reality? Particularly, does this imply that the wave functions of the Universe do not correspond to any element of physical reality, inasmuch as they cannot be calculated in any reasonable time? Furthermore, if the ‘computational amendment’ (mentioned in the paper http://arxiv.org/abs/1410.3664v1) to the EPR definition of an element of physical reality is important and physically meaningful, should we then exclude infeasible, i.e., practically useless, solutions from all the equations of physical theories?

Can the dark-energy momentum tensor be of any help in f(R,T) theory?

Cosmological interactions are often related to the energy density of the interacting system or to their time derivatives. Is it possible to relate the interaction between spins of a fluid with its energy density? Or with an arbitrary function of its time derivatives?

In other words, can a cosmological interaction be derived from the interaction between spins of a cosmological fluid ?

A quantum gravity theory based on the concept that a unified field of consciousness (UFC) permeates all of creation provides solutions to many of the puzzles. The biggest of them is the origin of energy. The UFC is a perfectly motionless field which does not interact with energy. When the UFC becomes active, it gives out discrete quanta of life, which physicists call energy. The process is reversible: neither the UFC nor energy can ever be destroyed. More details on the UFC are in the article at the following link.

Article Unified field of consciousness

I am not at all an expert in cosmology, and I have always been puzzled by the following question. To my knowledge, everything started from a singularity (the Big Bang), then inflation followed, then space continued to expand. However, it is now believed that after the end of inflation the whole universe was on the scale of centimeters (still vastly larger than a single point). My question is this: suppose that at the end of the inflationary period (i.e. ~10^-32 s after the Big Bang), one shines a photon. What will happen to it? Will it encounter a "boundary" of space? Or is space still expanding faster than the speed of light, making the question meaningless? And, finally, can we say that the universe itself had a finite volume?

Thanks!

Is it, in principle, possible that the redshift observed by Hubble can be explained by Compton scattering of light off particles such as dark matter, neutrinos, etc.?

This would also mean that if a light-emitting object is far away, its light will be shifted more than the light of a close object, since more particles are in the way.

I once read an article proposing this mechanism, but I don't remember the source. Is this a widely known problem? Where can I read more about it? Are there other possible explanations of the redshift besides the expansion of the universe?

A quantum gravity theory based on the equivalence of gravitational and relativistic mass is developed from Newton's inverse-square law, which takes the form of a Schrödinger-like wave equation. Solution of this wave equation generates the entire table of Standard Model particles. Is there a comparable theory that can do the same thing?

I read somewhere that Einstein's equations may be expressed in terms of the Klein-Fock-Gordon equation, but I am not yet sure how to do that.

In a paper, Fiziev and Shirkov discuss solutions of the Klein-Fock-Gordon equation and their implications for Einstein's equations. In effect, this may imply that Einstein's equations have wave-type solutions.

What do you think? Your comments are welcome.

The origin of structure in the universe is one of the greatest cosmological mysteries even today. Extended topological objects such as monopoles, strings and domain walls may play a fundamental role in the formation of our universe. Phase transitions in the early universe can give rise to these topological defects. A topological defect is a discontinuity in the vacuum, and can be classified according to the topology of the vacuum manifold. Monopoles are point-like topological defects, formed where M contains surfaces which cannot be continuously shrunk to a point, i.e. when π2(M) ≠ I (M is the vacuum manifold) [1]. One of the most important works on Abelian gauge theories is due to P. A. M. Dirac, who many years ago proposed a new solution to the Maxwell equations. His new solution for the vector potential corresponds to a point-like magnetic monopole with a singularity string running from the particle's position to infinity [2].

[1] F. Rahaman, S. Mal and P. Ghosh, "A study of global monopole in Lyra geometry"

[2] A. L. Cavalcanti de Oliveira and E. R. Bezerra de Mello, "Kaluza-Klein Magnetic Monopole in Five-Dimensional Global Monopole Spacetime"
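As a concrete illustration of the Dirac construction mentioned above, the standard string potential and the resulting charge quantization condition can be written as follows (conventions vary between texts; this sketch uses Gaussian units with the string along the negative z-axis):

```latex
% Vector potential of a magnetic monopole of charge g; the expression is
% singular along \theta = \pi, the location of the Dirac string:
\vec{A} = \frac{g\,(1-\cos\theta)}{r\sin\theta}\,\hat{\phi},
\qquad
\vec{B} = \nabla\times\vec{A} = \frac{g}{r^{2}}\,\hat{r}.

% Demanding that the string be unobservable in quantum mechanics yields the
% Dirac quantization condition:
e\,g = \frac{n\hbar c}{2}, \qquad n \in \mathbb{Z}.
```

The quantization condition is the celebrated consequence: the existence of even one monopole anywhere would explain why electric charge comes in discrete units.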

If the universe appeared out of the vacuum due to vacuum fluctuations, is entropy not reduced? And if entropy is reduced, does this not require an external source of energy?

As far as I know, two major theories have been put forward to explain the horizon problem in cosmology: inflation and VSL (varying speed of light). My question is this: if the detection of primordial gravitational waves by BICEP2 confirms the inflation theory, what happens to the VSL theory? Does the VSL theory also predict B-mode polarization? Sorry if my question is amateurish; I would appreciate it if you could enlighten me on this topic.

If the positron is real, then, if we bombard positrons with electrons, is a black hole formed, or into which form of energy are they transformed?
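A quick order-of-magnitude check is helpful here (illustrative assumptions: annihilation at rest, textbook constants). An electron-positron pair at rest annihilates into two photons of roughly 511 keV each, and the mass involved is so small that its Schwarzschild radius is dozens of orders of magnitude below the Planck length, ruling out black hole formation:

```python
# Annihilation energetics and gravitational scale of an e+ e- pair.
M_E = 9.109e-31        # electron mass, kg
C = 2.998e8            # speed of light, m/s
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
EV = 1.602e-19         # joules per electron-volt

# Each photon carries the electron rest energy: E = m_e c^2 (~511 keV).
photon_energy_kev = M_E * C**2 / EV / 1e3

# Schwarzschild radius r_s = 2 G M / c^2 for the pair's total mass 2 m_e.
schwarzschild_m = 2 * G * (2 * M_E) / C**2

print(f"photon energy ≈ {photon_energy_kev:.0f} keV")
print(f"Schwarzschild radius of the pair ≈ {schwarzschild_m:.2e} m")
```

The Schwarzschild radius comes out around 10^-57 m, far below any scale at which gravitational collapse is meaningful, so the energy simply appears as gamma-ray photons.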

For those who take that view: can you then please walk us through what happens in the wake of a massive supernova burst? All the scenarios I see lead back to a black hole, and I'd be really interested in a cogent alternative scenario.

A preference for spiral galaxies in one sector of the sky to be left-handed or right-handed spirals has indicated a parity-violating asymmetry in the overall universe and a preferred axis.

Could the large-scale magnetic field be related to the predominant left-handed neutrinos in our cosmic sphere as well?

How does gravity affect everything regardless of its mass? Or is there a smallest particle, common to everything, that is affected by gravity?

Assuming that the Big Bang theory reflects the actual chronology of "The Birth of the Universe" raises the question of whether the Big Bang makes sense without the existence of the Initial Singularity. If not, then the Initial Singularity theory grows into the fundamental problem of "The Birth of the Universe".

Thus, what is the place of the Initial Singularity theory in the chronology of "The Birth of the Universe"? Did the Initial Singularity exist before the Big Bang only as a philosophical idea, or only as a mathematical boundary condition, in other words, only as the zero point which arose just in order to explode at that same moment?

If the Initial Singularity is not only the zero point, then I ask the question: how long did the Initial Singularity exist before the Big Bang?

Each time a new class of lasers made it possible to produce shorter light pulses, people wondered whether a time quantization would be observed. The smallest possible quantization is the Planck time, ~10^-44 s. Some nuclear resonances have been reported with an energy width suggesting a Fourier-transform-limited duration of 10^-26 s.
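The two timescales mentioned above can be checked from first principles (illustrative sketch using textbook constants): the Planck time follows from t_P = sqrt(ħG/c^5), and the time-energy uncertainty relation ΔE·Δt ~ ħ converts a Fourier-limited duration into an energy width.

```python
# Order-of-magnitude check of the Planck time and the Fourier-limit relation.
import math

HBAR = 1.0546e-34      # reduced Planck constant, J*s
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8            # speed of light, m/s
EV = 1.602e-19         # joules per electron-volt

# Planck time: t_P = sqrt(hbar * G / c^5) ~ 5.4e-44 s.
t_planck = math.sqrt(HBAR * G / C**5)

def energy_width_ev(duration_s):
    """Energy width (in eV) implied by a Fourier-limited duration via dE ~ hbar/dt."""
    return HBAR / duration_s / EV

print(f"Planck time ≈ {t_planck:.2e} s")
print(f"a 10^-26 s duration implies a width ≈ {energy_width_ev(1e-26):.2e} eV")
```

A 10^-26 s duration corresponds to an energy width of order tens of GeV, so any claimed resonance width can be checked for consistency against this relation.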

When Roger Penrose wrote his first book on consciousness (The Emperor's New Mind, 1989), he lacked a detailed proposal for how quantum processing could be implemented in the brain. Nevertheless, he suggested that objective reduction represented neither randomness, nor the algorithm-based processing of most physics, but instead a non-computable influence embedded in the fundamental level of space-time geometry, from which mathematical understanding and, by later extension of the theory, consciousness derived (http://en.wikipedia.org/wiki/Quantum_mind).

Moreover, he gave a talk in those years On the Origins of Twistor Theory, elaborating that the role of complex numbers in quantum theory had long struck him as a quite crucial one. "If the 'correct' geometry for the world is to be a closely quantum one, then these same complex numbers must be an essential part of this geometry. My training as a (largely pure) mathematician had taught me something of the power, subtlety and elegance of complex (holomorphic) geometry. It had seemed fitting that this might be the geometry most basic to the structure of the physical world. Yet in its most obvious manifestations, physical geometry seems to be geometry over R, not C." (http://users.ox.ac.uk/~tweb/00001/#04)

And further down he continued: "Of course the possibility of simply describing things in terms of complexified (compactified) Minkowski space CM had occurred to me but - for reasons which are still not entirely clear to me - I had (correctly)* rejected this as insufficiently subtle for Nature. I think that one reason for being unhappy with CM as playing a primary role in physics was that the complexification is far too gross. As many additional "unseen" dimensions (namely four) would need to be adjoined as are already directly physically interpretable." (http://users.ox.ac.uk/~tweb/00001/#07)

(=> perhaps, in the present climate of eleven-dimensional generalized Kaluza-Klein theories, this objection would carry little weight with most people. However, to me it was, and still is, a fundamental drawback).

In this respect, our views differ from those of Roger Penrose: