Discussion
Started 27th Apr, 2021

Are Dr. Hans-Otto Carmesin Models supported by the Supernova Observations?

Dr. Hans-Otto Carmesin is a prolific theoretician who wrote, among other things, these two books:
Modeling SN1a data:
That said, he leads a field where a lot of unsupported claims are tossed around without anything to support them. That is why they are unsupported. :)
As Dr. Carmesin professed, scientists should follow the teachings of Aristotle and always use the simplest possible model that is consistent with Reality.
Dr. Carmesin's model has nonlocality, dimensional transitions, the usual suspects (Dark Matter and Dark Energy), and an epoch-dependent Dark Energy (figure 8.15 in the first book above).
It is a fantastic work and, from my point of view, unnecessary and incorrect.
Unnecessary because there is HU, which is capable of explaining everything Dr. Carmesin explained without the need for a Big Bang, Dark Energy, Dark Matter, epoch-dependent Dark Energy, or a Polychromatic Vacuum. Because of that, Aristotle and Occam's Razor would support HU and rebut Dr. Carmesin's work.
Attached is my summary of the problems I found in Dr. Carmesin's claims that SN1a distances support his work.
#########################################
This is an ongoing discussion.
Dr. Carmesin provided a reply to my objections and confirmed that he is not sure if his model can predict the SN1a distances.
In fact, he said: "My theory does not fail to predict these distances. I just did not calculate these distances yet for a good reason: I tested my full theory by calculating the measured Hubble constants of the Hubble tension."
First, that is not a good reason. Second, I calculated the distances according to his model and the model failed. See the plot and the attached python script.
#########################################
My plot of his model shows that it fails to predict the observed distances.
I also drive home the fact that Dr. Carmesin's model modifies the meaning of H0 (the Hubble Constant). Because of that, comparisons of results are not straightforward, and this point seems not to have been considered before.
The plots also show that the HU model predicts the observed distances without any parameters.


All replies (4)

27th Apr, 2021
Hans-Otto Carmesin
Universität Bremen
Dear Marco,
I reply to your three objections:
In your FIRST objection you criticize that I use the value H0 = 67.36 km/(s*Mpc) of the Hubble constant at z = 1090 (see e. g. Planck Collaboration 2020).
My reply to it is as follows: in my theory I derive properties of the universe from the beginning at the Planck scale as a function of time.
In order to compare with observation, my theory must describe the present state of the universe. Correspondingly, I need an indicator for the present time after the Big Bang. For it I use that value of the Hubble constant, H_0(z=1090). (Equivalently, I could of course use the age of the universe as an input.)
In your FIRST objection you also criticize that I use a definition of the Hubble constant that can describe the OBSERVATIONAL FACT of the HUBBLE TENSION.
(For that fact see e. g. Riess, Adam G. and others (2018): TYPE IA SUPERNOVA DISTANCES AT REDSHIFT > 1.5 FROM THE HUBBLE SPACE TELESCOPE MULTI-CYCLE TREASURY PROGRAMS: THE EARLY EXPANSION RATE. ApJ, 853, 126. Another reference is: Scolnic, D. M. and others (2018): THE COMPLETE LIGHT-CURVE SAMPLE OF SPECTROSCOPICALLY CONFIRMED TYPE IA SUPERNOVAE FROM PAN-STARRS1 AND COSMOLOGICAL CONSTRAINTS FROM THE COMBINED PANTHEON SAMPLE. ApJ, 859, 101.)
The definition that I use is in full agreement with the usual theory.
(See e. g. Riess, Adam G. and others (2018) above. Another reference is Amendola, Luca (2021): Lecture notes: Cosmology. University of Heidelberg, http://www.thphys.uni-heidelberg.de/~amendola/teaching.html.(download 2021). Moreover I explained it in my books, see below and see in https://www.researchgate.net/publication/351093848_H0_20210424pdf).
In your SECOND objection you suggest my full quantum mechanical and relativistic theory would fail to predict SN1a distances.
My reply is as follows: My theory does not fail to predict these distances. I just did not calculate these distances yet for a good reason:
I tested my full theory by calculating the measured Hubble constants of the Hubble tension.
(See my new book https://www.researchgate.net/publication/350373240_Quanta_of_Spacetime_Explain_Observations_Dark_Energy_Graviton_and_Nonlocality, see the cover for instance. See also my books below and my H0_20210424 – material above.)
As a result, I find precise agreement with recent (2017-2021) observations of H0, including the SN1a-based values of H0.
I emphasize that this test is much more critical and meaningful than the test with the SN1a distance moduli (from the year 2011) that you suggested to me. Indeed, I already showed, for the case of my semiclassical theory, that it is easier to obtain agreement with those distance-modulus values that you suggest than to obtain agreement with the Hubble-tension values.
My reply to your THIRD objection is as follows:
Indeed, my theory predicts essential parameters. Everybody can read it in my DIGITAL OPEN SOURCE book (English Part):
You mentioned sigma_8. I did not publish any derivation of sigma_8 yet, as I clearly wrote in our discussion on April 17th, 2021.
If you are interested in more details of my theory, please read my books or papers, see e.g.:
27th Apr, 2021
Marco Pereira
Rutgers, The State University of New Jersey
Dear Hans-Otto,
Please let me make this point: when reviewing a theory, one should, if possible, take a step back (preferably outside of the theory). That is necessary to properly evaluate the hypotheses used and the quality of the predictions or fittings.
###########################################
PARAMETER INTERCHANGEABILITY
###########################################
You said: "The definition that I use is in full agreement with the usual theory."
My answer: NO, if you consider L-CDM as the usual theory.
###########################################
You, Amendola, and the gazillion theorists working on this problem wove a wide web of connected models and might have created a House of Cards. From my point of view, the data does not warrant the level of speculation since I provided BETTER predictions or fittings with just inertial motion and a better topology.
I am an outsider, and I can see that H0 is a short-distance quantity (a linearization for small z). It comes directly from
$d(z) = \frac{c}{H_0}\int_0^z \frac{dz'}{E(z')}$
with
$E(z) = \sqrt{\Omega_r (1+z)^4 + \Omega_m (1+z)^3 + \Omega_K (1+z)^2 + \Omega_\Lambda}$
For the Friedmann model, E(z=0) is ONE.
That is a requirement; otherwise, instead of having v = H0*d, you will have v = E(0)*H0*d.
Your model has an E(0)=1.12 (if I am correct).
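For concreteness, here is a minimal Python sketch of that computation: it builds the Friedmann E(z), checks that E(0) = 1 when the Omegas sum to one, and evaluates d(z) and the distance modulus by numerical integration. The Omega values and the test redshift are illustrative placeholders (Planck-like numbers), not parameters of either model in this discussion.

import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458      # speed of light in km/s
H0 = 67.36               # km/(s*Mpc), the value quoted in this thread

def E(z, Om_r=9.0e-5, Om_m=0.315, Om_k=0.0):
    # Dimensionless Hubble rate; Om_L is fixed so that the Omegas sum to 1 and E(0) = 1
    Om_L = 1.0 - Om_r - Om_m - Om_k
    return np.sqrt(Om_r*(1+z)**4 + Om_m*(1+z)**3 + Om_k*(1+z)**2 + Om_L)

def comoving_distance_mpc(z):
    # d(z) = (c/H0) * Integral_0^z dz'/E(z'), in Mpc (flat case)
    integral, _ = quad(lambda zp: 1.0/E(zp), 0.0, z)
    return (C_KM_S/H0)*integral

def distance_modulus(z):
    # mu = 5*log10(d_L/10 pc), with d_L = (1+z)*d(z) in a flat universe
    d_l_mpc = (1.0 + z)*comoving_distance_mpc(z)
    return 5.0*np.log10(d_l_mpc) + 25.0

print("E(0) =", E(0.0))                      # 1.0 by construction
print("mu(z=0.5) =", distance_modulus(0.5))

At small z this gives d(z) ≈ c*z/(E(0)*H0), which is exactly why a model with E(0) ≠ 1 rescales what the Hubble law calls H0.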
I still need to know where Omega_{lambda, corr}(z) is defined. I tried to find it and could not.
I don't know who lifted the requirement that E(0)=1, or when, and that is not important.
What is important is that it changes the definition of the Hubble parameter. Hence you cannot compare H0 from Friedmann fittings (L-CDM) with your predictions or fittings.
If you insist that parameters with the same name can be used interchangeably across models, then you have to accept that your model will always fail at short distances.
Compare that with my theory, which has only two parameters (H0 and a G-factor that expresses the epoch dependence of G).
Even by Aristotle's precept, a model with a single factor (the G-factor, predicted to within 11%) is preferable to a model that depends upon Omega_m, Omega_L, sigma_8, H0, and a very large number of assumptions (including epoch-dependent Dark Energy, dimensional transitions, a False Vacuum, etc.).
###########################################
WHY HU IS BETTER
The theory also explains the creation of the Universe with just the Heisenberg Principle; no explosions are required. The Universe's expansion does not care about its contents.
In my theory, all Dark Matter observations are explained just with the topology (Lightspeed Expanding Hyperspherical Universe topology) and the receding galaxies are explained as INERTIAL MOTION in 4D (no forces are required).
In addition, having an epoch-dependent G is consistent with astronomical observations.
###########################################
WHY THE H0 TENSION TELLS YOU THAT THE MODELS ARE JUST BAD
In addition, the fact that different observations (SN1a, CMB, BAO) require different parameters and different values of H0 should be enough to tell you that the model is wrong.
###########################################
WHY CARMESIN MODEL IS PROBLEMATIC
Expanding space, a Multichromatic False Vacuum, and epoch-dependent Dark Energy are not supported by anything other than the model itself.
###########################################
QUESTION:
I asked more than once whether your main model fails (as shown in my picture). I really need to know the answer to this question. That is important: if the model fails to predict distances, your whole theory has no support of any kind.
If I made a mistake, I should correct it, so please answer this question.
###########################################
###########################################
WHAT WOULD ARISTOTLE DO?
In other words, HU is cleaner, simpler, and consistent with observations and makes better and broader predictions.
###########################################
###########################################
With respect to your second answer: "In your SECOND objection you suggest my full quantum mechanical and relativistic theory would fail to predict SN1a distances.
My reply is as follows: My theory does not fail to predict these distances. I just did not calculate these distances yet for a good reason: I tested my full theory by calculating the measured Hubble constants of the Hubble tension."
MP Answer: I calculated the distances, plotted the results, and the theory fails.
Your measured Hubble constants are not properly comparable, since they do not share the same definition (the Hubble constant as defined by the Hubble Law).
###########################################
With respect to the answer to the third question: you provided the wide web you wove. Instead of reading n books and m papers, I would rather focus on what supports the theory and on whether there is a simpler theory that can explain what your theory explains.
Best Regards,
Marco
PS - Where is Omega_{lambda, corr}(z) defined? Please provide me with that answer. It is crucial for me to know whether I have properly implemented your full model.
Here are the Python notebooks:
Using your H0 of 67.36:
Showcasing my theory (HU):
28th Apr, 2021
Hans-Otto Carmesin
Universität Bremen
Dear Marco,
Let me remind you of what I offer to you and to each interested reader:
Based on quantum physics and general relativity, I derived a theory of quantum gravity.
As a test, I applied that theory to observation: for observed values such as mu_observed and H0_observed, I used my theory to derive the corresponding values mu_theoretical and H0_theoretical. The comparison showed that these theoretical values lie within the measurement errors of the corresponding observed values. I carried out similar tests for many other observations, of course.
(I documented the whole procedure in my papers and books including supplementary resources, such as graphs and program codes. For instance see the following:
My OPEN SOURCE book
with the supplementary resources for distance moduli
and the program code for these distance moduli
moreover with the supplementary resources for H0 values
and the program code for these H0 values
See also my OPEN SOURCE book (English Part)
You can find further material in my profile at ResearchGate and in additional books.)
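In outline, such a test amounts to the following check. This is a generic sketch with placeholder numbers, not the program code referenced above: it flags whether each theoretical value lies within the quoted 1-sigma error of the corresponding observation and computes a reduced chi-square.

import numpy as np

def within_errors(observed, sigma, predicted):
    # True where the prediction lies within the 1-sigma measurement error
    observed, sigma, predicted = map(np.asarray, (observed, sigma, predicted))
    return np.abs(predicted - observed) <= sigma

def reduced_chi_square(observed, sigma, predicted, n_params=0):
    observed, sigma, predicted = map(np.asarray, (observed, sigma, predicted))
    chi2 = np.sum(((predicted - observed)/sigma)**2)
    return chi2/(observed.size - n_params)

# Placeholder numbers for illustration only (not measured or predicted H0 values)
H0_obs  = np.array([67.4, 73.2])
H0_sig  = np.array([0.5, 1.3])
H0_theo = np.array([67.5, 72.6])
print(within_errors(H0_obs, H0_sig, H0_theo))       # [ True  True]
print(reduced_chi_square(H0_obs, H0_sig, H0_theo))  # about 0.13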
Thank you for your interest in my research.
Kind regards,
Hans-Otto
28th Apr, 2021
Marco Pereira
Rutgers, The State University of New Jersey
Dear Hans-Otto,
I thank you, in the name of all the readers, for your books and wisdom.
I also derived Quantum Gravity and offered everyone these articles.
I remind you that my work has no parameters and that my prediction for the G-dependence of the Absolute Luminosity yielded a G-factor that was off by just 11% from the observed value.
My Quantum Gravity theory predicts the maximum density inside a Black Hole and creates Matter directly from deformed space.
Here is the maximum density inside a Black Hole:
I also predicted the position of Earth in the Hyperspherical Universe and replicated the CMB observations (together with the spherical harmonic spectral decomposition). I did that using interdimensional hyperspherical harmonic spectral decomposition, after a grid search for the best location. Here is the grid search:
Here is Planck's CMB observation:
and here is the hyperspherical harmonic acoustic spectral simulation of the same:
at Earth's position:
χ = 339.46 degrees
θ = 341.1 degrees
ϕ = 104.08 degrees
More details here:
Here is the Equation of State of the Universe:
Here is the 3D Map of the Observable Universe:
CENSORSHIP
My theory has been published since 2007 and it has been censored at Los Alamos archives and mainstream journals (including the one where Dr. Amendola is the editor)!
You have your voice. You are allowed to publish your work. I am not.
I have a story to tell, one that is distinct from the story you tell and that everyone wants to hear.
Can Scientists handle that? Science should be able to do so.
I would like you to offer to be my endorser at Los Alamos Archives.
Best Regards,
Marco Pereira
PS- Please confirm that your theory failed to predict the SN1a distances and please provide me with its E(z).
When reading my work, please disregard the SDSS data analysis. I retracted that part. I have no problem accepting my mistakes when I recognize them as mistakes.

Similar questions and discussions

How can I get this simple discovery independently verified?
Question
40 answers
  • Marco Pereira
PROOF OF AN EXTRA SPATIAL DIMENSION
I studied the SDSS BOSS dataset and created a galaxy density map of the current universe.
The details of how to create a galaxy density map of the current universe are not relevant to the discovery. I just created a trivial d(z) and mapped all the 1.3 million objects in the dataset.
Then I created a cross-section of the globe, integrating over one of the angles (e.g., Declination), and plotted all the remaining point densities irrespective of their Right Ascension.
This creates profiles that are consistent with the Galaxies being seeded by 36 Density Oscillations. In my theory those are mapped to Neutronium Acoustic Oscillations, but that is irrelevant.
The discovery that I want to be independently verified is that:
  1. In the SDSS BOSS dataset, there is information on the cross-section of the Galaxy Density Map that indicates SPHERICAL SEEDING OF GALAXIES (SSG).
  2. The SSG distribution coalesces into 36 clusters (the exact number is irrelevant, as long as it is more than 1).
I simplified the request: just confirm the existence of a spherical galaxy density distribution and that the distribution clusters itself into roughly 36 profiles (the number is not relevant; it just needs to be larger than one).
Basically, the request requires you to reproduce the plot and to realize that the mapping d(z) onto the current hypersphere (or epoch) is such that, in normalized radius, distance is equal to alpha.
Also, no matter what mapping one uses, the spherical and clustering natures will not change, nor will they depend upon the topology. Say d(z) is the corresponding L-CDM function: that only changes the distance and thus preserves the spherical nature of the distribution. An L-CDM d(z) (if they had one) will also not change the clustering pattern, since it only changes the distance d. This means that, qualitatively, the spherical nature and the clustering are not model dependent.
DATA ANALYSIS
The data, Python scripts, and a video to help with setting up the Anaconda environment are provided here:
The creation of the map entails:
  1. Reading the FITS files using the astropy module.
  2. A FIXBOSS method, which bins angular space to 0.1 degree and the normalized radius, x, y, z by rounding them to n=3 significant figures (a minimal sketch of these steps follows this list).
  3. Notice that I am not doing anything to the number density NZ nor to the proximity of objects. Galaxy density is the sum of objects times their NZ within a volume of 0.001 radius x 0.1 DEC degree x 0.1 RA degree. Notice that the radius range is [0,1], RA [0,360], DEC [-90,90].
  4. Notice that I mapped alpha to distance (the radius associated with the objects).
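A minimal sketch of steps 1-3 follows. The FITS file name, the column names ('RA', 'DEC', 'Z', 'NZ'), and the placeholder mapping d(z) = z/(1+z) are assumptions for illustration only; the actual FIXBOSS implementation and the d(z) used are in the provided scripts.

import numpy as np
from astropy.io import fits

def load_catalog(path):
    # BOSS-style catalogs keep the galaxy table in HDU 1
    with fits.open(path) as hdul:
        data = hdul[1].data
    return data['RA'], data['DEC'], data['Z'], data['NZ']

def bin_density(ra, dec, r_norm, nz, ang_step=0.1, r_step=0.001):
    # Galaxy density = sum of NZ per cell of 0.001 normalized radius x 0.1 deg DEC x 0.1 deg RA
    keys = np.stack([np.round(r_norm/r_step),
                     np.round(dec/ang_step),
                     np.round(ra/ang_step)], axis=1).astype(int)
    cells, inverse = np.unique(keys, axis=0, return_inverse=True)
    return cells, np.bincount(inverse.ravel(), weights=nz)

def dec_profile(cells, density):
    # Collapse over RA to get density profiles versus (normalized radius, DEC)
    rd_cells, inverse = np.unique(cells[:, :2], axis=0, return_inverse=True)
    return rd_cells, np.bincount(inverse.ravel(), weights=density)

# Usage sketch (file name and d(z) mapping are placeholders):
# ra, dec, z, nz = load_catalog('galaxy_DR12v5_CMASS_North.fits')
# r_norm = z/(1.0 + z)              # replace with the d(z) used in the scripts
# cells, density = bin_density(ra, dec, r_norm, nz)
# profile_cells, profile = dec_profile(cells, density)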
The work is published here:
Map of the Universe here:
Is it ethically acceptable for editors to just yank your article from the review process without providing a single reason?
Question
40 answers
  • Marco Pereira
The paper in question is here:
This is a rhetorical question since, in my mind, this is utterly unacceptable.
I say that while accepting the reality that it takes time to write a few paragraphs in a rejection letter.
That said, it might take years to polish the arguments contained in a paper.
In my case, it took 16 years.
My issue is that I chose, on purpose, to tackle the Big Bang Theory first. It is the weakest model in all of Physics. There are "Crisis in Cosmology" articles written by everyone and their cats. There is the Hubble Tension, the S8 tension... Missing Dark Matter, the Early Galaxy Formation Conundrum...
Not to mention the lack of any evidence of a False Vacuum, Inflaton Field or Inflaton Particle, etc, etc.
My theory starts with a new model for matter, where matter is made of shapeshifting deformations of the metric (so, it is not Mass Deforms Metric, but modulated metric is mass).
It cannot be simpler. It allows the Universe to have just space, deformed space and time - the simplest possible model.
Occam's Razor will tell you that this model should be part of the conversation.
The Universe starts from a Heisenberg-Dictated Metric Hyperspherical Fluctuation, which after partial recombination is left with an Inner Dilation Layer (IDL) and the Outermost Contraction Layer (OCL).
As one would expect, the OCL breaks apart when it starts to move, pushed by the IDL. This process has a physical analogy in the Prince Rupert's Drop.
So, the model is disappointingly simple. No metrics, nothing for you to polish... just a simple model that explains EVERYTHING.
It also debunks General Relativity (Einstein's equations do not describe the Universe's expansion) and replicates all of Einstein's successes, while providing simpler explanations (instead of time dilation, we have the weakening of forces with absolute velocity).
What about ABSOLUTE VELOCITY? Well, we all know we can define absolute velocity using the CMB. Period. So, absolute velocity (and the breakdown of Relativity) shouldn't be a surprise.
So, my theory also challenges the current Cosmic Distance Ladder and, in doing so (using an epoch-dependent law of Gravitation), it predicts the distances without parameters, using just the redshifts. The predictions are attached.
In doing so, it attacks Dark Matter and Dark Energy and all the sordid interests behind them. I say sordid in the sense that I believe all these entrenched interests are at play in the summary rejection of my work.
Why would I say that? There is a simple reason. If an editor (and all the other editors) does not bother to justify their actions, one is left with nothing to do other than speculate on the WHY.
Why is it OK for preprint repositories to block my already published work? That is happening (and has happened for the last 16 years) at the Los Alamos Archives.
Why would it be ethical for an editor not to write a single paragraph pointing to a specific scientific reason for yanking a paper out of the review process?
How callous can these people be with respect to Science and Mankind's Future? Science is the key to the Future. It shouldn't be at the mercy of unconfessable motivations.
