Article

Zur Elektrodynamik bewegter Körper

... Einstein's theories of Special Relativity (SR) [1] and General Relativity (GR) [2] are built on the foundation of observer-dependent simultaneity, where the perception of time and events is shaped by the reception of electromagnetic radiation, typically photons, as carriers of information. This led to the idea that events only become real once observed, through what Einstein termed the "light agency" [3]. ...
... • Absolutivity theory offers an observer-independent description of spacetime that fundamentally differs from the observer-oriented relativity theory developed by Einstein [1,2]. • The model redefines time as a vector, not a scalar. ...
... This conclusion is consistent with the elimination of singularities in recent gravity-unification work [10]. • The so-called "time dilation" observed in relativity theory is reinterpreted here as clock-time dilation, a property of local measurement systems subject to motion, temperature, material composition, and gravity, not of time itself [1,6]. This model ultimately proposes a self-consistent, scalable, and predictive structure for spacetime that integrates causal dynamics, observer-independent time, and vector mathematics into a single framework. ...
Article
Full-text available
This paper introduces a conceptual four-dimensional spacetime model that departs fundamentally from Einstein’s relativity. Unlike observer-dependent systems, this framework emphasizes true simultaneity, distinguishing objective reality from perceived events. It defines a 4D orthogonal vector coordinate system, combining 3D Cartesian space with a fourth dimension of virtual time surfaces, which represent instantaneous temporal slices across space. These time surfaces are curved within an otherwise flat 3D space, forming a dynamic, evolving “present” enclosed at the boundary of the spacetime system. Time is modeled as an independent scalar magnitude, making it fully orthogonal to spatial dimensions and immune to external influences like gravity. This redefines the role of time as a pure sequence of events, without the curvature or distortion proposed in general relativity. By projecting 3D space onto this new virtual topology, the model offers a unique geometric view of spacetime. It challenges conventional gravitational spacetime interactions and repositions time as an unlinked, autonomous coordinate within a unified but orthogonal framework.
... • The electromagnetic force, responsible for all electromagnetic interactions (de Coulomb 1785; Ørsted 1820; Ampère 1827; Faraday 1831; Maxwell 1873). Notably, according to Faraday's law (Faraday 1831), the magnetic field arises from a moving electric field, and it is well established that the magnetic field is an effect of the special relativity of the electric field (Maxwell 1865; Einstein 1905a; Feynman et al. 1964, or the most recent New Millennium Edition of 2011; Landau & Lifshitz 1971; French 1971; Schwinger et al. 1973; Heald & Marion 1987; Jackson 1999a); ...
... However, at the dawn of the 20th century, the wave model faced a profound challenge. In 1905, Albert Einstein, building on Max Planck's work on blackbody radiation (Planck 1900, 1901), explained the photoelectric effect by proposing that light also behaves as discrete packets of energy, later called photons (Einstein 1905a). This quantization of light marked the birth of quantum physics, revealing that light could exhibit both wave-like and particle-like properties, depending on the experiment. ...
... The conical topology, moving forward with a translational velocity equal to the speed of light (v_t = c) and simultaneously rotating with a rotational velocity also equal to the speed of light (v_r = c), does not violate any physical principles. Indeed, the combination of these two velocities is not a classical vector sum but follows the well-established relativistic velocity composition rule (Einstein 1905a), given by: ...
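The excerpt above truncates the formula it introduces. For reference, the standard relativistic velocity-composition rule from Einstein (1905a), for collinear speeds $v_1$ and $v_2$, is:

```latex
v \;=\; \frac{v_1 + v_2}{1 + \dfrac{v_1 v_2}{c^2}}
```

Setting $v_1 = v_2 = c$ gives $v = 2c/(1+1) = c$, so composing two speed-of-light motions never exceeds $c$, which is presumably the property the authors invoke for the conical topology.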
Preprint
Full-text available
Understanding the fundamental interactions governing our universe has long been a central challenge in theoretical physics. In this work, we introduce the AETHER framework (Approach to Explaining Topological Harmonization in Empirical Research), a unified topological model that establishes deep connections between electromagnetism, gravity, and quantum phenomena. By extending Maxwell's equations to incorporate gravitational effects, we derive a novel set of equations that seamlessly link gravitational and electromagnetic fields without the need for additional dimensions. Our model proposes that fundamental particles, including photons and electrons, arise from specific topological deformations of space-time, providing a reconciliation of wave-particle duality and the emergence of mass. We revisit the structure of the photon, suggesting a dual nature as both an energy carrier and a fundamental topological excitation. This leads to new insights into light propagation, pair production, and charge quantization. Expanding on this, we explore the electron-positron system's role in shaping atomic structure. Furthermore, we analyze how the Standard Model particles fit within this topological framework, offering potential explanations for neutrino behavior, dark matter, and the fine-structure constant. Finally, we discuss the broader implications of AETHER for cosmology, suggesting that the expansion of the universe and large-scale structure may stem from the intrinsic self-interaction of a four-dimensional hyperspherical topology. This perspective provides testable predictions, particularly in regard to interference phenomena and the charge residuals in dark matter candidates, thus paving the way for experimental verification.
... By its own description, it is created and edited by volunteers around the world and hosted by the Wikimedia Foundation. In this work attention is centered on its treatment of Einstein's theory of relativity, which was published in 1905 [1]. The centerpiece of his theory is the Lorentz transformation (LT), which was introduced by Larmor [2] and Lorentz [3] at the end of the 19th century. ...
... Since Wikipedia accepts the viability of the LT, it necessarily adheres to a number of false conclusions that follow directly from it. To begin with, consider Einstein's view that time dilation is symmetric [1]. This is illustrated by the application of Galileo's Relativity Principle (RP). ...
... They found that the measured frequency was higher than that of the emitted radiation. In other words, they observed a blue shift, in clear violation of Einstein's Symmetry Principle [1] described above. Virtually the same experiment was carried out by Kündig [14], and Champeney et al., [15], with very similar results. ...
Article
There is a litany of fundamental problems with Einstein’s theory of relativity. The Lorentz transformation (LT), which is the centerpiece of his theory, is not internally consistent, thereby proving that it is not valid. It claims that lengths can contract while times increase without changing the value of their ratio, the speed of light. It argues in favor of the symmetric character of time dilation, even though the experiments with circumnavigating clocks carried out in 1970 by Hafele and Keating contradict this claim. It also claims that events which are simultaneous for one observer may not be so for another, even though the Global Positioning System relies on the assumption that the emission of light pulses occurs at exactly the same time for both the atomic clock on an orbiting satellite and its counterpart on the Earth’s surface. Wikipedia supports Einstein’s theory at every turn. The Law of Causality, which Newton relied upon in formulating his First Law of Kinetics, is never mentioned by Wikipedia in the context of Einstein’s theory. The fact that an alternative (Newton-Voigt) space-time transformation has been introduced which is consistent with the Law of Causality and Galileo’s Relativity Principle (RP) is totally ignored by Wikipedia. The Uniform Scaling Method of physical properties is ignored as well, despite its usefulness in comparisons of the results of experiments from the vantage points of different rest frames. All of these inadequacies of Wikipedia are discussed in detail below.
... Today, several theories enjoy the status of paradigms, among them mechanics (such as Newtonian, Lagrangian, Hamiltonian and statistical mechanics [1-5]), special relativity (SR) [6,7], general relativity (GR) [8,9], the probability approach to quantum mechanics (QM) known as the Copenhagen interpretation [10,11], quantum field theories (QFT) [12,13], and the Higgs field theory [14]. Other theories, such as quantum gravity [15-17], relativistic thermodynamics [18-20] and quantum thermodynamics [21-24], attempt to reconcile conflicting theories. ...
... Velocity as a kinematic quantity, Newton's concepts of an acting force and of mass as quantity of matter [1], Einstein's equivalence of energy and mass [7,25,26], and Mach's and Einstein's kinematic idea of relativity [6,27] were crucial for the development of physics. On this basis, theoretical physics describes interaction in terms of force fields or uses relational approaches. ...
... The process-energy principle (PEP) in Eq. (6) suggests that the kinematic definition of the velocity v obscures the correct dynamic content of v and its role in a process. ...
Article
Full-text available
This series presents the advantages of momentum work, a form of work that is missing in classical and quantum mechanics. In mechanics, forces are assumed to mediate interactions and actively change the properties of objects—an idea that is based on the kinematic concept of velocity and has led to indeterminacy at the quantum level and to spacetime theories at the macrolevel, according to which space and time are relative. A different picture emerges when only processes with energy transfer, including a form of work that changes the momentum of objects, can change the properties of objects, even at the quantum level. The aim of this series is to show that the priority of processes over forces leads to a vivid and deterministic description of the temporal behavior of matter in Euclidean space. In this first paper of the series, we address the conceptual content of basic variables, with a focus on velocity. When velocity is treated as a dynamic state variable of a material object, rather than a kinematic quantity, absolute time and absolute simultaneity are an imperative logical consequence.
... This process of transforming an internal representation into an external one is generally referred to as "externalization". However, Strasser (2010) emphasizes that the term does not reflect what really occurs, as it is the use of an external representation to refer to a mental representation. This process can occur through speech, writing, or drawing, for example, in situations where the individual seeks and is able to express their knowledge about something adequately. ...
... Therefore, identifying mental representations (through "externalizations") is highly useful for improving the quality of teaching and learning. Spector (2010) emphasizes that, considering that internal representations are fundamental for problem-solving, the development of forms to assess external representations of these mental structures is critical for supporting effective learning. ...
... For example, consider the Special Relativity Theory (SRT). The "scholarly knowledge" includes Einstein's original paper, "On the electrodynamics of moving bodies," with the two postulates and mathematical formulation (Einstein, 1905). This knowledge is adapted for teaching materials, focusing on the main consequences and simplified formulas, using analogies and highlighting its practical and technological applications; it is thereby transformed into "knowledge to be taught". ...
Thesis
Full-text available
This doctoral thesis investigates the mental representations and conceptions of high school students about Relativity Theory, focusing on General Relativity (GR). The research, grounded in the Cognitive Mediation Networks Theory (CMNT), explores how different levels of external mediation influence the development of mental representations and students’ understanding of complex and counterintuitive concepts, such as curved spacetime and gravitational time dilation. The research consists of three experiments: Cohort 1, derived from the master’s research, analyses students’ mental representations of Special Relativity, serving as a basis for the subsequent experiments. Cohorts 2 and 3, conducted in the format of extracurricular mini-courses, investigate how students construct mental representations of GR concepts. The data collected through pre-tests, post-tests, interviews, gesture analysis, and student-produced materials reveal a strong connection between the consistency of mental representations and conceptual understanding. Students with well-defined pictorial mental representations, often developed through interactions with computer simulations and the rubber-sheet model, demonstrated better understanding of curved spacetime. On the other hand, gravitational time dilation was more frequently understood through propositional representations, formed through classroom discussions and exercises. The research shows the influence of different levels of external mediation on the construction of drivers, which manifest as mental representations. Computer simulations, the rubber-sheet model, group discussions, teacher explanations, cultural references such as the movie Interstellar, and Generative Artificial Intelligence tools contributed variably to students’ understanding. The conclusion is that a multimodal approach, recognizing the value of diverse resources working collaboratively, is essential for effective learning of GR.
The CMNT proves to be an effective theoretical reference for analysing and understanding the dynamic interaction of external mediations in the formation of internal representations, providing valuable insights for curriculum planning and pedagogical practices, and the integration of AI tools in physics education.
... Both effects represent different aspects of how entanglement networks constrain and regulate information flow through spacetime. This unified perspective resolves the seemingly arbitrary nature of the light speed limit, revealing it as a natural consequence of the same entanglement dynamics that produce gravity [5]. ...
... Special relativity fundamentally altered our understanding of space and time by showing they are not absolute but relative to the observer's reference frame [5]. While conventionally interpreted through geometric arguments about spacetime, our entanglement-based framework offers a deeper explanation for why special relativity works as it does. ...
Preprint
Full-text available
This paper extends our entanglement-based theory of gravity into a unified framework for relativity, causality, and quantum mechanics. We demonstrate mathematically that gravitational time dilation and an effective variable speed of light emerge naturally from variations in entanglement entropy density. Reformulating the Einstein field equations, we show that curvature arises from entanglement gradients, leading to a natural explanation for gravitational coupling constants. From a photon's perspective all journeys are instantaneous, which provides a conceptual bridge between relativistic time dilation and quantum non-locality, and suggests that spatial distance is emergent rather than fundamental. We propose that entanglement dropping below its maximal value on the other side of black hole event horizons "allows" spacetime to emerge, and consider Hawking radiation to be quantum tunneling across this entanglement boundary. We derive a precise relationship between black hole mass, Hawking radiation, and inflationary dynamics in the universe nested within, and thereby a quantitative basis for the asymptotic meta-temporal structure proposed in our previous work. Our formalism ultimately demonstrates that entanglement could be the substrate of reality itself, and posits a cycle of cosmic expansion modifying entanglement density, in turn determining gravitational effects and causal structure. This unification of quantum mechanics and relativity through a shared informational foundation provides testable predictions for cosmology, black hole physics, and quantum gravity.
... Five of these equivalences are already known. [1, formula (123)] Based on the conclusion of a deductive construction built on the axiomatic method [2] and on the five known equivalences of physical quantities, a conclusion was drawn about the equivalence of all physical quantities. [1] A picture of the equivalences of physical quantities is presented in the form of sequences of formulas of these equivalences and a sequence of coefficients of these equivalences, which in turn made it possible to describe the equivalences of physical constants, expressing some constants through others in the form of sequences of formulas. ...
... Albert Einstein first introduced the equivalence of mass and energy. [2] 1921: Wolfgang Pauli introduced an extension of Einstein's equivalence (and vice versa) in the form of energy-mass equivalence. ...
Preprint
Full-text available
The principle of equivalence of physical quantities was demonstrated in the equivalences of the basic physical quantities: energy, space, time, electric charge, magnetic flux, and mass, as well as the electrical quantities capacitance and inductance. The series of physical quantities and the equivalences of each of these quantities are presented, together with the coefficients of those equivalences. Sequences of equivalences of physical quantities will expand our understanding of the physics of nature: equivalences of energy, mass, space, and other physical quantities point to the nature of dark energy, dark matter, and dark space, as do equivalences of physical constants. Equivalences of physical constants make it possible to calculate some physical constants through others, and the precision of some constants can be increased at the expense of others that are more precise (under rest conditions).
... Since the age of classical Greek philosophy, paradoxes have been arising regarding physics and motion. One may refer to wave-particle duality (Huygens, 1690; Newton, 1704), which, even though it first swayed toward the latter due to Newton's prominence and Poisson's positivist perspective, was finally proven, settling that paradox (see the Poisson spot: Arago & Fresnel, 1818; Poisson, 1818); to the infamous Einstein-Podolsky-Rosen (EPR) paradox (Einstein, Podolsky, & Rosen, 1935); or to the irreconcilability of Einstein's (1915; 1916) general relativity (GR) and quantum mechanics (QM) theory (Aspect, Dalibard, & Roger, 1982; Bell, 1976; Bell et al., 1985; Davisson & Germer, 1927; de Broglie, 1923; Einstein, 1905; Heisenberg, 1927; Planck, 1901ab; Schrödinger, 1926; 1935a; 1935b; Wheeler, 1978; Young, 1804; Zeilinger, 1986), an irreconcilability that has already been pointed out (Ehrenfest, 1909; Grøn, 1979; Kumar, 2024); or, distinguishing it in this paper, Zeno's paradoxes, especially Zeno's paradox of motion regarding Achilles and the continuum problem that was tackled by his master Parmenides (see Chen, 2021; Gale, 1968; Grünbaum, 1968; 1973, to expand into the specific paradox). ...
... For the influential ideas of Greek philosophy and the counterintuitive yet proven physical properties of QM (Aspect et al., 1982; 2022; Bell et al., 1985; Einstein, 1905; Zeilinger, 1986), one may introduce this section, devoted to a problem related to the purpose of this paper and referred to as "logical atomism" (see Democritus, 450 B.C.E., and Russell, 1924), as well as to an effort to review research on physics, even though the math can sometimes be harsh and the understanding of it may seem rough, just to get a sense of the inconsistencies and contradictions, and to avoid missing experimental and empirical research. Especially, one may highlight Prigogine's works (Prigogine, 1977; 1989); following arguments and fruitful debate (Bache, 2023ab; Hamada, 2022), a limit is set to length (ℓp), at least concerning fermions, and a limit may also be set to length regarding space (Mead, 1964, p. B857). ...
Preprint
Full-text available
In this paper, a solution to Zeno's paradox of motion was offered, together with a possible interpretation based on observations and an empirical viewpoint. A gedanken experiment was proposed in accordance with our empirical and experimental paradigm, offering solutions from pendular energy and geometric gravity theory, the standard view, and classical physics, as a reply to Zeno's paradox (of motion), departing from the logical and overly mathematical frameworks, which offered either a solution detached from experiment and observations (the former) or a solution with neither explanation nor interpretation (the latter). It takes on the problem of unmeasurable regions, Zeno's paradoxes (measure and motion), and infinitesimals, offering some insights that could be of interest for both metaphysics and the foundations of physics.
... The diffusion phenomenon has been studied for over two centuries but gained significant momentum after Einstein's landmark paper in 1905.1,2 The diffusing species, i.e., the tracer, is assumed to perform Brownian motion (BM), which is described by the Einstein-Smoluchowski equation. Fickianity, time-averaged mean squared displacement (TA-MSD), constancy in the values of diffusion coefficients, stationarity, ergodicity, and Gaussianity are the main features of BM.3-6 The conditions assumed for deriving Fickian diffusion parameters are as follows: (i) the tracer is an independent individual, indicating independence with respect to time and distance, (ii) the concentration profile or the probability distribution function varies symmetrically in both positive and negative directions, and (iii) the tracer does not chemically interact with the diffusion medium. ...
... This indirectly pointed to the non-Fickian or anomalous diffusion of the dye in the agar gel. We then evaluated the anomalous diffusion coefficients (K_α, cm² s^−α) using eqn (2) from the trajectory data, and then modelled them with a power-law fit (Fig. 4). The classical diffusion coefficient, D_A, and the anomalous diffusion coefficient, K_α, are very different from each other and should not be compared. ...
Article
The transport of material, particularly crystal violet dye, in the heterogeneous environment of agar gel does not adhere to Fick's law; rather, it exhibits anomalous behavior that is influenced by the tracer's concentration.
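The power-law analysis described in the excerpts above can be illustrated with a minimal sketch (not from the paper; the function name and synthetic data are illustrative): fitting MSD(t) = K·t^α by linear regression in log-log space, where α = 1 recovers Fickian (Brownian) diffusion and α ≠ 1 signals anomalous transport.

```python
import numpy as np

def fit_msd_power_law(t, msd):
    """Fit MSD(t) = K * t**alpha by linear regression in log-log space.
    alpha == 1 recovers Fickian (Brownian) diffusion; alpha != 1 signals
    anomalous diffusion with generalized coefficient K (units cm^2 s^-alpha)."""
    alpha, log_k = np.polyfit(np.log(t), np.log(msd), 1)
    return np.exp(log_k), alpha

# Synthetic subdiffusive data with MSD = 0.01 * t**0.7 (alpha < 1):
t = np.linspace(1.0, 100.0, 50)
k, alpha = fit_msd_power_law(t, 0.01 * t**0.7)
# recovers k ≈ 0.01 and alpha ≈ 0.7
```

Note that, as the excerpt stresses, the recovered K has α-dependent units, so it cannot be compared directly with a classical diffusion coefficient.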
... In 1905 [5], Albert Einstein proposed his theory of Special Relativity (SR), to explain the constancy of the speed of light required by Maxwell's field equations and remove the unwanted ether. This introduced an apparent contradiction with Newton's theory of gravitation [6]. ...
... The boundary observable in this case is the entanglement entropy S_A(l) in an infinitely long one-dimensional system, where A is an interval of length l. This was combined with bulk-cone singularities in null boundary probes to show that one can extract both. (Footnote 5: In this thesis, we restrict our analysis to static spacetimes where ∂t is Killing. However, one can extend this relationship to time-dependent cases such that the area/entropy formula is fully covariant [63].) ...
Preprint
Full-text available
Motivated by the holographic principle, within the context of the AdS/CFT Correspondence in the large 't Hooft limit, we investigate how the geometry of certain highly symmetric bulk spacetimes can be recovered given information of physical quantities in the dual boundary CFT. In particular, we use the existence of bulk-cone singularities (relating the location of operator insertion points of singular boundary correlation functions to the endpoints of boundary-to-boundary null geodesics in the bulk spacetime) and the holographic entanglement entropy proposal (relating the entanglement entropy of certain subsystems on the boundary to the area of static minimal surfaces) to recover the bulk metric. Using null and zero-energy spacelike boundary-to-boundary geodesic probes, we show that for classes of static, spherically symmetric, asymptotically AdS spacetimes, one can find analytic expressions for extracting the metric functions given boundary data. We find that if the spacetime admits null circular orbits, the bulk geometry can only be recovered from the boundary, down to the radius of null circular orbits. We illustrate this for various analytic and numerical boundary functions of endpoint separation of null and spacelike geodesics. We then extend our analysis to higher dimensional minimal surface probes within a class of static, planar symmetric, asymptotically AdS spacetimes. We again find analytic and perturbative expressions for the metric function in terms of the entanglement entropy of straight belt and circular disk subsystems of the boundary theory respectively. Finally, we discuss how such extractions can be generalised.
... Radiation pressure can be explained and characterized in the three canonical pictures of light: ray optics, wave optics, and photon optics. In all pictures, Einstein's special relativity [34] predicts that light carries energy E and thus momentum p through the energy-momentum relation E = pc. In the quantum picture, each photon carries momentum ℏω/c. ...
... The change in photon energy between reference frames is the well-known relativistic Doppler shift, but it is just one of two phenomena arising from the Lorentz transformation between two reference frames. The second phenomenon is the lesser-known relativistic aberration [34], whereby photon propagation directions change with an object's motion relative to a light source. Altogether, the Lorentz transformation between laser-source and sail reference frames transforms both the temporal (Doppler effect) and spatial (relativistic aberration) components of photon four-momenta, and both should be included in lightsail modeling. ...
Preprint
Full-text available
Lightsails are a highly promising spacecraft concept that has attracted interest in recent years due to its potential to travel at near-relativistic speeds. Such speeds, which current conventional crafts cannot reach, offer tantalizing opportunities to probe nearby stellar systems within a human lifetime. Recent advancements in photonics and metamaterials have created novel solutions to challenges in propulsion and stability facing lightsail missions. This review introduces the physical principles underpinning lightsail spacecrafts and discusses how photonics coupled with inverse design substantially enhance lightsail performance compared to plain reflectors. These developments pave the way through a previously inaccessible frontier of space exploration.
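The two effects named in the excerpts above follow directly from the Lorentz transformation of a photon four-momentum. A minimal sketch (not from the review; the function name is ours) using the standard special-relativity formulas:

```python
import math

def doppler_aberration(beta, cos_theta):
    """Transform a photon's frequency ratio (nu'/nu) and direction cosine
    from the source frame to a frame receding at speed beta*c, using the
    standard relativistic Doppler-shift and aberration formulas."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    freq_ratio = gamma * (1.0 - beta * cos_theta)            # Doppler shift
    cos_theta_prime = (cos_theta - beta) / (1.0 - beta * cos_theta)  # aberration
    return freq_ratio, cos_theta_prime

# Head-on photon (cos_theta = 1) seen from a sail receding at 0.6c:
ratio, cosp = doppler_aberration(0.6, 1.0)
# ratio = sqrt((1-0.6)/(1+0.6)) = 0.5 (redshift); head-on direction unchanged
```

For off-axis photons (cos_theta < 1) both the frequency and the propagation direction change, which is why a lightsail model must transform the full four-momentum rather than the energy alone.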
... Albert Einstein had first developed the special theory of relativity (Einstein, 1905). It is true that Einstein himself was not a logician. ...
Book
Kurt Gödel was not just a brilliant logician but also contributed important results to the general theory of relativity. In the 1940s he found a solution to Einstein's field equations which (roughly speaking) allows for time travel, i.e. compromises our very notion of time. This book brings together the contributions to the Kurt Gödel essay competition on the question: "What does it mean for our world view if, according to Gödel, we also assume the non-existence of time?"
... The Euclidean Model of Space and Time (EMST) is a parallel theory to Special Relativity (SR) [1]. It is based on the assumption that physical space is four-dimensional space with Euclidean metrics. ...
... The quantity u is the velocity at which system S′ moves relative to system S. The formula is identical to the formula derived by Einstein in the SR framework, see [1]. Formula 14 derived earlier is a special case of Equation 15 for φ = π/2. ...
Article
Full-text available
The aim of the paper is to show the fundamental advantage of the Euclidean Model of Space and Time (EMST) over Special Relativity (SR) in the field of wave description of matter. The EMST offers a unified description of all particles of matter as waves moving through four-dimensional Euclidean space at the speed of light. Unlike the usual description in three dimensions, where the group and phase velocities of a particle differ, in four-dimensional space the wave and the associated particle can be treated as a single object. The paper deepens understanding of the EMST as a viable alternative to SR. The EMST clarifies the origin of relativistic phenomena and at the same time explains the apparent mysteries associated with the wave nature of matter. From the broader perspective, the EMST has all the prerequisites to become the starting point for the mutual combination of “relativistic” and “quantum” physics into a single physical theory.
... Let us assume an atom A with mass m and an energy of E₀ = m c₀² in the ground state, located at the gravitational potential U₀ = 0. With an energy difference ΔE₀ from the ground state to the excited atom A*, the mass in this state is: (Einstein 1905a; von Laue 1911, 1920). The rest energy of an excited atom at a distance r from the centre of mass M is, cf. ...
... ² ν is the frequency of the light. ³ The Planck constant (exact): h = 6.626 070 15 × 10⁻³⁴ J Hz⁻¹. ⁴ The speed of light in vacuum (exact): c₀ = 299 792 458 m s⁻¹. We write c₀ without gravitational fields, otherwise c. ⁵ These and other physical constants are taken from the CODATA internationally recommended 2022 values, cf. BIPM (2019). ⁶ Einstein (1905a) used the expressions „Energiequanten" (energy quanta) and „Lichtquant" (light quantum). The name "photon" was later coined by Lewis (1926). ...
Preprint
Ortiz and Ibarra-Castor (2024) have presented a "Generalized redshift formula" taking account only of energy-conservation considerations. Contrary to their claim, we emphasize that both energy and momentum considerations must be invoked in order to deduce all three types of redshift (Doppler, gravitational and cosmological). We formulate our views on the three physical effects in a consistent manner, in addition to addressing the lack of relevant references in Ref. (Ortiz and Ibarra-Castor, 2024).
... The hardest thing to understand in the Theory of Relativity is why so many people want to believe it is wrong. The Lorentz transformation is a cornerstone of Special Relativity [1], describing how time and space coordinates change between inertial frames in relative motion. Despite its well-established foundation, some antirelativist critics argue that the Lorentz transformation, and therefore both Special and General Relativity, are flawed. ...
Preprint
Full-text available
When faced with overwhelming evidence supporting the reality of time dilation, confirmed in particular by the Hafele-Keating experiment, some anti-relativists reluctantly concede that time dilation applies to light clocks. However, they argue that the Theory of Relativity remains flawed, claiming that time dilation applies to light clocks only, not to massive objects. They assert that atomic clocks, which operate based on microwave radiation, merely create the illusion that the Hafele-Keating experiment confirms the theory. To refute this misconception, we introduce a thought experiment inspired by Schrödinger's cat, in which the fate of Einstein's cat depends on a "Sync-or-Die clock", an imaginary device that tests the synchronization between a light clock and a mechanical clock, potentially triggering the release of poison. By analyzing this scenario from both the inertial frame where the device is at rest and another in which it moves at constant velocity, we demonstrate that time dilation must apply to the mechanical clock in exactly the same way as it does to the light clock, highlighting the universality of relativistic time dilation.
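The magnitude of the dilation at stake in Hafele-Keating-type experiments is easy to estimate. A back-of-the-envelope sketch (not from the preprint; the series cutoff is our choice, used because the direct formula loses floating-point precision at such small β):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s (exact)

def gamma_minus_one(v):
    """Time-dilation factor gamma - 1 for speed v. For v << c the direct
    formula 1/sqrt(1 - beta^2) - 1 loses floating-point precision, so we
    switch to the Taylor series beta^2/2 + 3*beta^4/8 at small beta."""
    beta2 = (v / C) ** 2
    if beta2 < 1e-6:
        return 0.5 * beta2 + 0.375 * beta2**2
    return 1.0 / math.sqrt(1.0 - beta2) - 1.0

# A clock flown at airliner speed (~250 m/s) runs slow by ~3.5e-13,
# i.e. a few tens of nanoseconds per day -- the order of magnitude
# probed by Hafele and Keating with circumnavigating atomic clocks.
dilation = gamma_minus_one(250.0)
ns_per_day = dilation * 86_400 * 1e9
```

The point of the thought experiment above is that this factor is the same whatever the clock's mechanism, which this kinematic estimate takes as given.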
... After Maxwell's equations and the Michelson-Morley experiment [2], Henri Poincaré proposed that all interactions, including gravity, should be relativistic [3]. In the same year, Einstein published the paper that established the postulates of special relativity [4]. Ten years later, he presented the General Theory of Relativity, the most extraordinary achievement of modern physics [5]. ...
Preprint
We consider a model where the Standard Model is added to the Einstein Lagrangian together with a Jordan-Brans-Dicke (JBD) coupling. The time-dependent Higgs field plays an important role in interpreting the effective gravitational constant, G_{eff}. This may lead to two Big Bangs; at the first Big Bang the size of the universe is zero, the value of the effective gravitational constant is zero, and it starts decreasing in time through negative values. During this era, the JBD term is important. At the second Big Bang, the effective gravitational constant passes through infinity to positive values. The negative gravitational constant is interpreted as repulsive gravity. The Lagrangian density provides effective potentials leading to spontaneous symmetry breaking, which gives the cosmological expectation value of the Higgs field and a Higgs mass that depends on the curvature and the Brans-Dicke parameter.
... High harmonics obtained from relativistic oscillating mirrors [16,27,142-144] enable breaking through this limitation, as the plasma electrons conducting laser-driven oscillations are already ionized. Fig. 8-a shows a coherent attosecond pulse (white and orange colors depict the positive and negative parts of the magnetic field) produced from a relativistic plasma mirror (blue-to-yellow density scale) moving with constant velocity [145], which can be driven either by charged particle beams [146] or intense laser pulses [25,147]. ...
Preprint
We present the Virtual Beamline (VBL) application, an interactive web-based platform for visualizing high-intensity laser-matter simulations (and experimental data in the future). Developed at the ELI Beamlines facility, VBL integrates a custom-built WebGL engine with WebXR-based Virtual Reality (VR) support, allowing users to explore complex plasma dynamics in non-VR mode on a computer screen or in fully immersive VR mode using a head-mounted display. The application runs directly in a standard web browser, ensuring broad accessibility. VBL enhances the visualization of particle-in-cell simulations by efficiently processing and rendering four main data types: point particles, 1D lines, 2D textures, and 3D volumes. By utilizing interactive 3D visualization, it overcomes the limitations of traditional 2D representations, offering enhanced spatial understanding and real-time manipulation of visualization parameters such as time steps, data layers, and colormaps. The user can interactively explore the visualized data by moving their body or using a controller for navigation, zooming, and rotation. These interactive capabilities improve data exploration and interpretation, making the platform valuable for both scientific analysis and educational outreach. We demonstrate the application of VBL in visualizing various high-intensity laser-matter interaction scenarios, including ion acceleration, electron acceleration, γ-flash generation, electron-positron pair production, attosecond and spiral pulse generation. The visualizations are hosted online and freely accessible on our server. These studies highlight VBL's ability to provide an intuitive and dynamic approach to exploring large-scale simulation datasets, enhancing research capabilities and knowledge dissemination in high-intensity laser-matter physics.
... Going one step further, metric learning can be embedded in a more general framework called Riemannian metric learning [143], in the same way that Galilean relativity is embedded in General Relativity [57]. In Riemannian metric learning, the input space (X, g) is a Riemannian manifold, i.e. a kind of smooth geometric shape (see Figure 2) with a Riemannian metric g: X → R^{d×d} that makes it possible to locally measure lengths and angles via a matrix field. ...
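The local-measurement operation described above is easy to sketch numerically; the metric field `g` below is a hypothetical example for illustration, not one from the cited review:

```python
import numpy as np

def local_length(g, x, v):
    """Length of tangent vector v at point x under metric field g:
    sqrt(v^T g(x) v). Here g maps a point to a symmetric
    positive-definite d x d matrix."""
    G = g(x)
    return float(np.sqrt(v @ G @ v))

# Hypothetical metric on R^2 that stretches the first axis
# by a factor growing with |x[0]|:
def g(x):
    return np.diag([1.0 + x[0] ** 2, 1.0])

e1 = np.array([1.0, 0.0])
print(local_length(g, np.array([0.0, 0.0]), e1))  # 1.0: Euclidean at the origin
print(local_length(g, np.array([2.0, 0.0]), e1))  # sqrt(5): stretched away from it
```

The same unit vector thus has different lengths at different points, which is exactly what a position-dependent matrix field buys over a single global Euclidean metric.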
Preprint
Full-text available
Riemannian metric learning is an emerging field in machine learning, unlocking new ways to encode complex data structures beyond traditional distance metric learning. While classical approaches rely on global distances in Euclidean space, they often fall short in capturing intrinsic data geometry. Enter Riemannian metric learning: a powerful generalization that leverages differential geometry to model the data according to their underlying Riemannian manifold. This approach has demonstrated remarkable success across diverse domains, from causal inference and optimal transport to generative modeling and representation learning. In this review, we bridge the gap between classical metric learning and Riemannian geometry, providing a structured and accessible overview of key methods, applications, and recent advances. We argue that Riemannian metric learning is not merely a technical refinement but a fundamental shift in how we think about data representations. Thus, this review should serve as a valuable resource for researchers and practitioners interested in exploring Riemannian metric learning and convince them that it is closer to them than they might imagine, both in theory and in practice.
... However, a closer inspection of both theoretical predictions and more precise experiments caused cracks in this picture, vividly discussed to this day. On the one hand, the properties of electromagnetic waves gave rise to special relativity (SR) [14], which reduced the distinction between the space and time dimensions. Later, general relativity (GR) was introduced, defying the absoluteness of spacetime, which becomes curved in the presence of energy and matter, now placed on an equal footing. ...
Article
Full-text available
The measurement problem in quantum mechanics (QM) is related to the inability to include learning about the properties of a quantum system by an agent in the formalism of quantum theory. It includes questions about the physical processes behind the measurement, uniqueness, and randomness of obtained outcomes and an ontic or epistemic role of the state. These issues have triggered various interpretations of quantum theory. They vary from refusing any connection between physical reality and a measurement process to insisting that a collapse of the wave-function is real and possibly involves consciousness. On the other hand, the actual mechanism of a measurement is not extensively discussed in these interpretations. This essay attempts to investigate the quantum measurement problem from the position of the scientific consensus. We begin with a short overview of the development of sensing in living organisms. This is performed for the purpose of stressing the relation between reality and our experience. We then briefly present different approaches to the measurement problem in chosen interpretations. We then state three philosophical assumptions for further consideration and present a decomposition of the measurement act into four stages: transformation, conversion, amplification and broadcasting, and, finally, perception. Each of these stages provides an intuition about the physical processes contributing to it. These conclusions are then used in a discussion about, e.g., objectivity, the implausibility of reversing a measurement, or the epistemic status of the wave-function. Finally, we argue that those in favor of some of the most popular interpretations can find an overlap between their beliefs and the consequences of considerations presented here.
... Example 1. For instance, Einstein's equation E = mc² [5] is a well-defined DK in the field of physics. However, knowledge does not always have such an explicit mathematical expression. ...
Preprint
Full-text available
The exponential growth of scientific publications has made the exploration and comparative analysis of scientific literature increasingly complex and difficult. For instance, eliciting two scientific publications that diverge on widely accepted concepts within their domain turns out to be more and more difficult despite its great interest. We are interested in the automatic detection of these discrepancies using the latest artificial intelligence (AI) techniques. Given a particular scientific domain, we focus on large-scale analysis of the tables present in related scientific publications and propose to capture domain knowledge with arbitrary functions. In this setting, we propose a five-step method, called CCASL: (1) Modeling the domain knowledge with functions expressed as approximate functional dependencies (FDs), (2) Acquiring a corpus of scientific documents related to the proposed functions, (3) Analysing all tables occurring in the PDF documents and producing a consolidated table from them, (4) Detecting counterexamples of the FDs in the consolidated table, and (5) Conducting a comparative analysis of the pairs of papers containing the detected counterexamples. We have applied CCASL to a subfield of polymer research, known as Epoxy-Amine networks (EA). In collaboration with material scientists, we have identified an intuitive function f_{EA} that relates the storage modulus (SM), the structure of the polymer (V_{EA}), and its glass transition temperature (T_g). Based on this function, we have implemented all 5 steps of CCASL. First results show that CCASL is proving to be a powerful approach for bibliographic confrontation in the field of polymers.
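Step (4), detecting counterexamples of an approximate FD in a consolidated table, can be sketched as follows (the column names, values, and tolerance below are hypothetical, not CCASL's actual implementation):

```python
from collections import defaultdict
from itertools import combinations

def fd_counterexamples(rows, lhs, rhs, tol=0.0):
    """Return pairs of rows that agree on the lhs columns but differ
    on the rhs column by more than tol -- i.e. violations of the
    (approximate) functional dependency lhs -> rhs."""
    groups = defaultdict(list)
    for row in rows:
        groups[tuple(row[c] for c in lhs)].append(row)
    violations = []
    for same_lhs in groups.values():
        for a, b in combinations(same_lhs, 2):
            if abs(a[rhs] - b[rhs]) > tol:
                violations.append((a, b))
    return violations

# Hypothetical polymer table: same structure, divergent Tg values
table = [
    {"structure": "EA-1", "Tg": 120.0},
    {"structure": "EA-1", "Tg": 121.0},
    {"structure": "EA-1", "Tg": 160.0},
    {"structure": "EA-2", "Tg": 95.0},
]
print(len(fd_counterexamples(table, ["structure"], "Tg", tol=5.0)))  # 2
```

Each returned pair points at two table rows, and hence two papers, whose reported values conflict under the domain function — the raw material for step (5).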
... In his time at the patent office, Einstein had already published a series of trailblazing papers. Besides his "revolutionary" light-quantum hypothesis [4] and the subsequent papers on energy quanta, he had developed the stochastic theory of Brownian motion, the fundamentals of special relativity [5], the derivation of E = mc² [6], and even an ansatz for generalizing relativity to uniformly accelerated systems [7, sect. V]. ...
Article
Full-text available
This is the first part of a pair of companion papers devoted to an analysis of Einstein's fusion picture for light-quanta (subsequently also referred to as "photons") with classical electromagnetic fields, which he formulated in 1909 via a purely verbal description. There, the overlap of many of the hypothetical local force fields surrounding the photonic energy points should produce the well-known classical Maxwell fields. We claim that from this logical connection alone, which apparently did not receive sufficient attention in the literature, one may draw essential conclusions about the photon states. In the present paper, we give a historical overview of Einstein's photon conceptions until 1927. We extract hints from his formulations that lead from his local force fields to Schrödinger wave functions, which reproduce the conserved integrals for energy, momentum, and angular momentum of Maxwell theory in the form of Hilbert space expectations. Also a preliminary multi-photon theory, including Glauber states, is shown to be motivated by Einstein's composite photons with possibly (incoherently!) overlapping force fields. In the resulting Fock space theory, obtained by symmetrizing the many-photon states, the goal of a fusion theory is, however, not realizable. In the subsequent companion paper "On sectorial photon theories reconstructing Einstein's early fusion picture for light" (referred to as Rieckers 2025), our conclusions are extended to photon states that surpass Fock space theory by describing actually infinitely many photons. These are shown to cover also classical light states in terms of collective structures. Appealing to the more recent mathematical tools of the convex state space approach and algebraic quantum field theory, we suggest there mathematical realizations of Einstein's fusion program.
... Since the intensity of the driving laser is relativistic (I > 1.38 × 10^18 W/cm² for an 800 nm laser), the electron sheet will oscillate along the normal direction of the surface with a speed close to the speed of light c in vacuum. This gives rise to the emission of extreme ultraviolet (EUV) or soft x-ray photons in the specular direction, due to the relativistic Doppler effect [6-8]. As these emissions occur periodically, their spectrum exhibits a high-harmonic structure, similar to gas high-harmonic generation [9-11]. SHHG is characterized by a high conversion efficiency and excellent spatiotemporal coherence [1,12]. ...
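The specular upshift follows the relativistic Doppler factor for reflection off a moving mirror; a small sketch under the idealization of a normal-incidence mirror at constant velocity (not the full oscillating-mirror model of the cited papers):

```python
import math

def mirror_upshift(beta):
    """Frequency ratio for light reflected at normal incidence from a
    mirror moving toward the source at speed beta = v/c.
    Exact: (1 + beta) / (1 - beta); tends to 4*gamma^2 as beta -> 1."""
    return (1.0 + beta) / (1.0 - beta)

beta = 0.99
gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
print(mirror_upshift(beta))   # ~199: an 800 nm photon lands at ~4 nm
print(4.0 * gamma ** 2)       # ~201: the ultrarelativistic estimate
```

This is why electron sheets oscillating near c convert near-infrared laser light into EUV/soft-x-ray photons.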
Article
Full-text available
Plasma surface high-order harmonic generation (SHHG) driven by intense laser pulses on plasma targets enables a high-quality extreme ultraviolet (EUV) source with high pulse energy and outstanding spatiotemporal coherence. Optimizing the performance of SHHG is important for its applications in single-shot imaging and EUV absorption spectroscopy. In this work, we demonstrate the optimization of laser-driven SHHG by an improved Bayesian strategy. A traditional Bayesian algorithm is first employed to optimize the SHHG intensity in a two-dimensional parameter space. Then, an improved Bayesian strategy, using the Latin hypercube sampling technique and a dynamic acquisition strategy, is developed to overcome the curse of dimensionality and the risk of local optima in a high-dimensional optimization. The improved Bayesian optimization approach converges efficiently and robustly to a stable condition when optimizing the harmonic ellipticity and intensity in multiple dimensions. This paves the way for generating a high-quality coherent EUV source with a high repetition rate and promoting further EUV applications.
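Latin hypercube sampling, the initialization technique named above, can be sketched in a few lines (a generic implementation, not the authors' code):

```python
import random

def latin_hypercube(n, d, seed=0):
    """n samples in [0, 1)^d: each dimension is split into n equal
    strata and every stratum is hit exactly once, with strata paired
    across dimensions by independent random permutations. This spreads
    even a small initial design across the whole parameter space."""
    rng = random.Random(seed)
    samples = [[0.0] * d for _ in range(n)]
    for j in range(d):
        strata = list(range(n))
        rng.shuffle(strata)  # random pairing of strata across dimensions
        for i in range(n):
            samples[i][j] = (strata[i] + rng.random()) / n
    return samples

pts = latin_hypercube(5, 2)
# Each dimension has exactly one point per fifth of [0, 1):
for j in range(2):
    print(sorted(int(p[j] * 5) for p in pts))  # [0, 1, 2, 3, 4]
```

Compared with uniform random initialization, this guarantees coverage of every marginal stratum, which matters when each sample is an expensive laser shot or simulation.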
... They exhibit a wide range of unique properties, including reflection and transmission Doppler shifting, magnetless nonreciprocity, motion-induced bianisotropy, "Fizeau dragging" or space-time weighted averaging, oblique crystal gaps and superluminality [32,33]. Rooted in the theory of special relativity [39,40] and enhanced by novel computational tools [41,42], these properties enable groundbreaking applications such as magnetless isolators [43], space-time crystals [44], reflectionless parametric amplification [45], space-time Fresnel prisms [46], enhanced light squeezing [47] and space-time wedges [48]. Figure 2 illustrates the space-time wedge application, with Figs. ...
Article
Full-text available
We introduce space-time metamaterials as the natural evolution of time-varying metamaterials, highlighting their enhanced properties and potential advantages. These metamaterials offer virtually limitless diversity, driven by their dynamic levels, velocity regimes, and space-time architectures. Notably, they unlock extensive possibilities for transition engineering—the precise control of classical and quantum state transitions through tuning modulation velocity, potential, or dispersion.
... The problem of a moving dielectric has a fundamental appeal because it gave rise to special relativity [10-12]. Progress in this field was steady and impressive: the Fresnel-Fizeau drag, the Doppler effect(s), the relativistic Snell-Descartes law, Cherenkov radiation [13], light amplification via moving mirrors [14], etc. Quantization of the electromagnetic field in the presence of moving dielectric media [15-17] is among the more recent results; e.g., the quantum friction phenomenon [17-19]. ...
Article
Full-text available
Detection of scattered light can determine the susceptibility of dielectrics. Such imaging is normally subject to Rayleigh's limit: details finer than the wavelength of the incident light cannot be determined from the far-field zone. We show that time-modulation of an inhomogeneous dielectric can be used to determine its susceptibility. To this end, we focus on the inverse quantum optics problem for a spatially and temporally modulated metamaterial, whose dielectric susceptibility is similar to that of moving dielectrics. We show that the vacuum contribution to the photodetection signal is nonzero due to the negative frequencies, even in the far-field zone. Hence, certain dielectric features can be determined without radiating any incident field on the dielectric. When the incident light is scattered (or reradiated), the determination of the dielectric susceptibility is enhanced and overcomes Rayleigh's limit in the far-field zone. We study similar effects for an inhomogeneous dielectric moving with a constant speed, a problem we consider within relativistic optics. Now the vacuum contribution to the photodetection signal reflects dielectric features and can be long-range in space, but it is not a far-field effect. Published by the American Physical Society 2025
... • Einstein hypothesized that the speed of light might be anisotropic in space [21], which could influence the behavior of these systems. ...
Preprint
Full-text available
The speed of light is widely regarded as the ultimate limit for information transmission, a cornerstone of modern physics enshrined in special relativity and the no-communication theorem. This work challenges the universality of this limit by proposing and analyzing two experimental setups that explore the role of quantum entanglement and indistinguishability in information transfer. By introducing modifications to the Mach-Zehnder interferometer and leveraging concepts from induced coherence, I demonstrate that the photon statistics observed in one subsystem are influenced by changes in a spatially separated subsystem. Through wavefunction calculations and quantum channel formalism, it is shown that under certain conditions, the system appears to bypass the constraints of the no-communication theorem. These findings raise the possibility of faster-than-light information transfer. Broader implications of these results are discussed, inviting further exploration into the interplay of quantum mechanics, information theory, and relativistic constraints, and offering a provocative perspective on the nature of communication and causality in the quantum realm.
Preprint
Full-text available
This article explores the concept of simultaneity in our reality by analyzing two central scenarios: (i) two events observed by a single observer, and (ii) one event observed by two different observers. The focus is on light events, such as a lamp turning on and off. We explore the geometric and physical implications of the nature of simultaneity in these cases.
Article
Full-text available
Scientific principles can undergo various developments. While philosophers of science have acknowledged that such changes occur, there is no systematic account of the development of scientific principles. Here we propose a template for analyzing the development of scientific principles called the ‘life cycle’ of principles. It includes a series of processes that principles can go through: prehistory, elevation, formalization, generalization, and challenge. The life cycle, we argue, is a useful heuristic for the analysis of the development of scientific principles. We illustrate this by discussing examples from foundational physics including Lorentz invariance, Mach’s principle, the naturalness principle, and the perfect cosmological principle. We also explore two applications of the template. First, we propose that the template can be employed to diagnose the quality of scientific principles. Second, we discuss the ramifications of the life cycle’s processes for the empirical testability of principles.
Preprint
Full-text available
Standard special relativity postulates that the speed of light in vacuum is constant and isotropic. However, direct measurement of the one-way speed of light presents significant challenges due to clock synchronization issues at remote locations. In this paper, we propose an experiment based on the ratio of time dilation of two clocks launched in opposite directions from a common starting point on Earth. The clocks continuously record their proper time during their journey and transmit the data back to Earth. By comparing the recorded proper times, the effective speeds of light in each direction can be calculated. If the speed of light is isotropic, the ratio of time dilation will conform to the predictions of special relativity; any deviation may indicate anisotropy. This approach enables testing the one-way speed of light without the classical synchronization problems.
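Under the isotropic assumption of standard special relativity, the proposed observable reduces to a ratio of identical Lorentz factors; a toy calculation (the symmetric launch speed is an assumption made here for illustration):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def proper_time(t, v):
    """Proper time accumulated by a clock moving at constant speed v
    during coordinate time t, under isotropic (standard SR) dilation."""
    return t * math.sqrt(1.0 - (v / C) ** 2)

# Two clocks launched in opposite directions at the same speed:
t, v = 1e5, 1e4  # coordinate flight time (s), clock speed (m/s)
ratio = proper_time(t, +v) / proper_time(t, -v)
print(ratio)  # 1.0: isotropic dilation predicts a unit ratio
```

Any measured deviation of this ratio from unity would be the anisotropy signature the proposal is after, since the dilation factor depends only on v², not on direction, when light speed is isotropic.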
Article
Full-text available
At the beginning of the twentieth century, Henri Poincaré and Albert Einstein established the principle of relativity, stating that the laws of physics must be the same for all inertial observers. One implication of this principle is that information cannot propagate faster than the speed of light in vacuum. In 1825, however, the physicist and mathematician Pierre Simon Laplace had demonstrated that gravitational action must propagate a million times faster than the speed of light in vacuum. This fact raised doubts and objections to the principle of relativity. Although Einstein only addressed this question from 1912 onward, Poincaré studied it carefully and presented the results of his reflection in 1905, in two essays entitled "Sur la dynamique de l'électron". Poincaré showed that it was possible to reconcile the principle of relativity with the law of universal gravitation. In this essay, we carry out a historical study of Poincaré's hypotheses on gravitation, a relativistic alternative that emerged in 1905 but ended up being superseded by the general theory of relativity. The analysis of Poincaré's hypotheses on gravitation offers insight into the process of construction of scientific knowledge and the various factors that influence the choice of a scientific paradigm.
Article
Full-text available
In this article, I have tried to measure the distance that separates us from Kafka by trying to register both the things that we still have in common with him and his time, and the many things that have changed in between. The first section is an analysis of the story “A Visit to a Mine” in terms of the new accident prevention techniques instituted by the welfare state at the turn of the nineteenth to the twentieth centuries. The second section deals with the new concepts of time and space that emerged in the age of electric and electro-magnetic media. And the third section is an attempt to write a short history of imitation from Descartes to Darwin, Kafka, Turing, and, finally, to the Large Language Models that we now call Artificial Intelligence.
Article
Full-text available
We study bouncing cosmology in the f(R, L_m) theory of gravity. In this model, we consider a specific form of f(R, L_m) which contains a higher-order curvature term along with the matter Lagrangian. We parameterize the Hubble parameter so as to create a model with bouncing behavior that avoids the initial singularity problem during the universe's early evolution. Instead of a singularity, this model achieves a bounce, where the universe rapidly contracts and then quickly expands again. The EoS parameter crosses the quintom line in the vicinity of the bouncing position, indicating significant changes in the energy density, isotropic pressure, and temperature in the model. The model is highly unstable near the bouncing point. Finally, we analyze the model's dynamic stability in the neighborhood of the bouncing point by examining the sound velocity and the adiabatic index.
Chapter
This chapter presents the basics of quantum mechanics, one of the main branches of physical science, which studies the laws of the micro- and nanoworld. These laws find effective application in medical physics in the creation of modern precision methods of medical diagnostics. This concerns, first of all, such widely used tomographic methods as X-ray computed tomography (CT), which uses layer-by-layer scanning to obtain medical images from the gradual passage of an X-ray beam through thin layers of tissue of the human body; magnetic resonance imaging (MRI), based on the phenomenon of nuclear magnetic resonance (NMR); and positron emission tomography (PET), a method of medical radioisotope diagnostics based on the phenomenon of electron-positron annihilation. As in all previous chapters, the reader is offered not only a brief general description of the basic ideas underlying the tomographic methods of medical diagnostics (see Sect. 7.2), but also a rigorous mathematical and physical justification of the basic theoretical provisions of quantum mechanics, as well as of the mentioned tomographic methods (see Sect. 7.3). In addition, in connection with the growing interest in using PET not only as a diagnostic method but also as a therapeutic radioisotope method for treating oncological diseases, the final Sect. 7.3.4 discusses the intriguing problem of the law of conservation of mass in the annihilation of matter and antimatter (in particular, of the electron and positron in positron emission tomography), as well as the most mysterious general problem of natural science: the existence of dark energy and dark matter.
Preprint
This research claims that Absolute Rest is a unique state of motion of a subclass of inertial systems, not of Absolute Space, which was once described as nonsense by Poincaré. Absolute Velocity measurements, or effects of absolute motion, can be demonstrated experimentally. The unsolvable problem of instantaneous synchronization has successfully been circumvented. This allows the direct demonstration or falsification of proposed conjectures about the detection and measurement of Absolute Velocity and the necessity of the existence of the absolute rest state. The Absolute Velocity of an inertial system is calculated assuming coordinates conforming to the x-boost configuration, and plausible experiments are proposed. For completeness, this introductory contribution to the theory of absolute motion needs a generalization for an arbitrarily oriented Absolute Velocity vector. The demonstration of the absolute-motion effects, however, is possible without this generalization. The impossibility of detecting absolute motion by experiments, claimed as a general law of nature by Poincaré, was an unproven inference. However, in usual measurement scenarios the Absolute Velocity effects generally cancel out, and it is rather difficult to find the ones that do not follow the cancellation rule. The Special Relativity Theory (STR) is not greatly affected by these findings, because it is not possible to define Absolute Velocity given the isotropy of the one-way velocity of light set up by Einstein's assumption, and the theory cannot contradict itself. The round-trip average speed-of-light isotropy law is the most consequential foundation for theories of relativity, including the STR, which implicitly includes this law. Another theory of relativity resulted from the re-derived Tangherlini Transformation (TT). It is a proven complementary theory but is ignored as non-compliant with Lorentz invariance of the one-way velocity of light.
By deriving the TT from first principles, as presented in this paper, using the postulate of round-trip average speed-of-light isotropy and adding a new postulate of causal invariance of superluminal signals, absolute motion can be proven and, coincidentally, the Tolman Paradox is also resolved.
Book
Full-text available
Scientific research has been the driving force behind technological advancements over the past five centuries. Yet, many people perceive the study of scientific methodology and philosophy as overly theoretical, causing them to lose interest in exploring this vital field. This book aims to bridge that gap, presenting science and its methods in a straightforward, accessible format, enriched with relatable, real-life examples. Whether you are a postgraduate student or an early-career researcher, this book is crafted specifically to guide and inspire you on your scientific journey.
Article
Purpose: The purpose of this article is to analyze how to make changes to prescribed curriculum, pedagogy, activities, learning environments, and assessments to loosen control over the time and space of learning and increase time for student autonomy. Design/Approach/Methods: A theoretical analysis of the functions of the five elements (curriculum, pedagogy, activities, learning environments, and assessments) of schooling in controlling time and space in learning. Findings: Students' time available for autonomy is limited by the prescribed curriculum, which occupies all of students' school time. Pedagogical practices that aim to efficiently implement the prescribed curriculum force and tempt students to spend all available time on it and extend work on it beyond classrooms through time concentration. Furthermore, the design and implementation of learning activities alter students' perception of time, making activities long and tedious. Moreover, students' time occupation by the prescribed curriculum compresses their space in learning environments. Finally, assessments and evaluations are typically limited to a distorted picture of students due to misplaced observers on different time scales and from an observation perspective. All of these can be changed, and we propose possible directions of change. Originality/Value: This article furthers the positions in our previous article published in ECNU Review of Education, "Paradigm Shifting in Education: An Ecological Analysis." Here we provide a more detailed analysis of schooling from spatiotemporal perspectives and give a uniquely fresh perspective on the changes needed in the age of AI to increase the time available for autonomy.
Preprint
Full-text available
This article critiques the homogenizing, colonial-modern conception of time as a passive, linear metric (relativist chronology) and proposes kālaśakti, a relationalist temporality rooted in South Asian philosophy, as an alternative. Kālaśakti redefines time as an agentive force shaped by the interplay of memory, karma, and creativity. Through mathematical models and decolonial critique, we argue that relativist time, exemplified by Einstein's equations, reduces temporality to a disenchanted coordinate system, divorced from ethical agency. In contrast, kālaśakti is modeled as a nonlinear field (∂ω/∂t = ∇·(vω) + εω²) in which past actions (karma) generate momentum (K(t) = ∫₀ᵗ k(ϑ)·M(ϑ) dϑ) to reshape present and future trajectories. By centering embodiment and ethical reciprocity, this framework challenges the colonial fragmentation of time from meaning and reclaims temporality as a transformative, lived experience.
Article
Understanding how time perception adapts to cognitive demands remains a significant challenge. In some contexts, the brain encodes time categorically (as “long” or “short”), while in others, it encodes precise time intervals on a continuous scale. Although the ventral premotor cortex (VPC) is known for its role in complex temporal processes, such as speech, its specific involvement in time estimation remains underexplored. In this study, we investigated how the VPC processes temporal information during a time interval comparison task (TICT) and a time interval categorization task (TCT) in primates. We found a notable heterogeneity in neuronal responses associated with time perception across both tasks. While most neurons responded during time interval presentation, a smaller subset retained this information during the working memory periods. Population-level analysis revealed distinct dynamics between tasks: In the TICT, population activity exhibited a linear and parametric relationship with interval duration, whereas in the TCT, neuronal activity diverged into two distinct dynamics corresponding to the interval categories. During delay periods, these categorical or parametric representations remained consistent within each task context. This contextual shift underscores the VPC’s adaptive role in interval estimation and highlights how temporal representations are modulated by cognitive demands.
Article
The core task of this paper is to demonstrate the heuristic merits of the Aristotelian philosophy of science, as compared with strict empiricism, in constructing and justifying a unified theory of physics. The impetus for the study was the question of whether the success of the Dynamic Universe (DU) theory as a candidate for such a unification could be explained by energy as its basic notion (Suntola, T. 2018a, 2018b, 2020, 2021, 2022, 2024), while the other unificatory attempts (string theories, inflation theory, and loop quantum gravity), all based on the notion of force, appear to fail. DU's reliance on Aristotle's methodology of first principles and his potentiality–actuality metaphysics soon invited us to explore DU's Aristotelian presuppositions as an explanatory ground for its seeming success. A major weakness of empiricism is its rejection of metaphysical reflection, which is necessary for the revolutionary paradigm change in the unification project. While the empiricist negative stance towards metaphysics rests on a narrow conception of the basis of knowledge and logical reasoning and on the principle of methodological unity, the Aristotelian solutions to these problems are difficult to refute. The Stagirite's methodology of Saving the Appearances (SA), little known outside Aristotle scholarship, exposes ways of expanding the knowledge basis to make room for metaphysical knowledge. SA is also valuable for our purposes here in yielding a heuristic model for the discovery and justification of a unified theory of physics. Aristotle's argument for the reality of potentiality, in the form of an inference from a fact of life to its necessary presuppositions, illustrates how to expand the empiricist premises–conclusion notion of logic. To specify the object of physics, the Aristotelian genus–species structure of reality shows that the definition of the genus proximum constitutes the highest first principle of a theory. Applying Aristotle's metaphysical notions of change and motion (dunamis, substance–attribute, matter–form), the genus proximum in DU turns out to be mass as prime substance, mass defined as the substance for the expression of energy. To conclude, I point to the need to modify the Aristotelian metaphysical categories to make room for the holism in DU. By studying the heuristic principles underlying the DU theory, the paper contributes both to the emerging studies of meta-empirical argument forms in physics and to the recent Neo-Aristotelian approach in the philosophy of physics.
Article
Global geometry and the shape of the physical universe may be revealed by observing objects at large cosmological redshift z, since for small z the universe seems almost flat. Recent infrared measurements of the James Webb Space Telescope (JWST) indicate that very luminous galaxies exist at distances z ≥ 13 that should not exist according to the standard ΛCDM cosmological model for a flat universe with curvature index k = 0. We introduce a spacetime-lens principle that could explain why these very distant galaxies shine so brightly. We show that the observed large flux luminosities may be mere optical effects due to the positive curvature index k = 1 of an expanding 3-sphere modeling our physical universe in time. For Euclidean or hyperbolic geometries such large flux luminosities seem implausible. This suggests that the right model of a homogeneous and isotropic physical universe at each fixed time instant is a 3-sphere. The standard cosmological model is based on the normalized Friedmann equation Ω_M + Ω_Λ + Ω_k = 1, where Ω_M + Ω_Λ = 1 by measurements. We also show that this does not imply that Ω_k = 0 and k = 0, as is often claimed.
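The closing claim, that measuring Ω_M + Ω_Λ ≈ 1 does not force Ω_k = 0, can be illustrated with a minimal error-propagation sketch; the central values and 1σ uncertainties below are illustrative, not the paper's.

```python
# Ω_k = 1 - Ω_M - Ω_Λ inherits the measurement errors of Ω_M and Ω_Λ,
# so "Ω_M + Ω_Λ = 1" only pins Ω_k to an interval around zero.
# Central values and 1σ uncertainties below are illustrative.

def omega_k_interval(omega_m, omega_l, sigma_m, sigma_l):
    """Naive 1σ interval for Ω_k = 1 - Ω_M - Ω_Λ (errors in quadrature)."""
    center = 1.0 - omega_m - omega_l
    sigma = (sigma_m**2 + sigma_l**2) ** 0.5
    return center - sigma, center + sigma

lo, hi = omega_k_interval(0.31, 0.69, 0.01, 0.01)
print(lo < 0.0 < hi)  # True: small positive curvature (k = 1) is not excluded
```

Any Ω_k inside the interval, including small positive values corresponding to a closed 3-sphere, is compatible with the measurement.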
Article
Full-text available
In this paper, we show the existence of an invariant minimum speed in space–time, forming the basis of a deformed special relativity, so-called symmetrical special relativity (SSR). Such an observer-independent minimum speed emerges from Dirac's large number hypothesis (LNH), which leads us to build SSR theory. This allows us to understand the hydrogen atom as the most stable bound state in the universe, a fundamental structure of the symmetrical space–time with two speed limits, namely the speed of light c and a minimum speed V. The minimum speed is associated with a background reference frame representing the vacuum energy related to the cosmological constant. The symmetry in space–time given by the invariant minimum speed, which originates in the electrical and gravitational bound states of the hydrogen atom, is thus able to provide a deeper understanding of the proton–electron mass ratio, m_p/m_e = 1836.15267343(11), given in terms of the universal minimum speed V. Furthermore, we investigate how the minimum speed in space–time plays a fundamental role in the quantization of energy in the hydrogen atom.
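For orientation, the textbook (non-SSR) Bohr spectrum of hydrogen can be sketched as the baseline that an SSR minimum-speed correction would modify; the paper's own corrected formula is not reproduced here.

```python
# Textbook Bohr energy levels of hydrogen, E_n = -E_R / n², with the
# Rydberg energy E_R ≈ 13.605693 eV. This is the standard baseline, not
# the SSR-corrected spectrum discussed in the paper.
RYDBERG_EV = 13.605693

def bohr_energy(n):
    """Energy of hydrogen level n, in eV."""
    return -RYDBERG_EV / n**2

print(round(bohr_energy(1), 3))  # -13.606
print(round(bohr_energy(2), 3))  # -3.401
```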
Chapter
“Dear Colleague, I would be very pleased if next time you would write to me in Italian.
Article
One of the main issues in measuring the speed of light when it travels only from one spatial position to another, known as the one-way speed of light, is that the clocks at the two separated spatial positions are not, and in principle cannot be, synchronized. Here, we show that it is possible, in principle, to measure the velocity of particles that travel at the speed of light without assuming a round trip, once we adopt a quantum mechanical description with two boundary conditions on the state of the quantum system, following the two-state-vector formalism, while assuming non-synchronized quantum clocks with unknown time dilation. The weak value of velocity can be measured for a test particle whose clock is not synchronized with the clock of the quantum particle. In the proposed set-up, when the weak value of the velocity is known even without knowing the time states of the system, such a weak velocity is the two-way speed of light. We further explore some fundamental implications of the set-up. The proposed approach opens a new avenue toward measuring the velocities of quantum particles while overcoming relativistic issues regarding the synchronization of clocks.
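The quantity at the heart of the two-state-vector formalism, the weak value A_w = ⟨φ|A|ψ⟩/⟨φ|ψ⟩, can be sketched for a single qubit; the observable and the pre- and post-selected states below are illustrative choices, not the paper's velocity operator.

```python
# Weak value A_w = <phi|A|psi> / <phi|psi> for a qubit observable.
# States and observable are illustrative; the paper applies the same
# formalism to a velocity observable with non-synchronized clocks.

def weak_value(phi, A, psi):
    """phi, psi: 2-component state vectors; A: 2x2 matrix as nested lists."""
    Apsi = [A[0][0] * psi[0] + A[0][1] * psi[1],
            A[1][0] * psi[0] + A[1][1] * psi[1]]
    num = phi[0].conjugate() * Apsi[0] + phi[1].conjugate() * Apsi[1]
    den = phi[0].conjugate() * psi[0] + phi[1].conjugate() * psi[1]
    return num / den

sigma_z = [[1, 0], [0, -1]]
psi = [1 / 2**0.5, 1 / 2**0.5]  # pre-selected |+x>
phi = [0.8, -0.6]               # nearly orthogonal post-selection
print(round(weak_value(phi, sigma_z, psi), 6))  # 7.0
```

With a nearly orthogonal post-selection the weak value (here 7.0) lies far outside the ±1 eigenvalue range of σ_z, the amplification effect that makes weak measurement schemes like this one useful.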
Article
Full-text available
In the context of superluminal physics, this article explores the latest advancements, including the primary theoretical developments in the special and general theories of relativity (STR & GTR). We review the significant strides made so far and address the difficulties encountered, along with their purported interpretations. The proposed or derived transformation factors for superluminal particles fall into two categories, i.e., imaginary and real transformation factors. The transformation relations between the two classes of frames are scrutinized using the metric expressions. The velocity addition and the transformation relations of mass, momentum, and energy between frames for superluminal particles are also analyzed. Analyses of tachyons using the Klein–Gordon equation, the relativistic energy–momentum equation, are performed. In the brane–antibrane system, the universality of the tachyon potential via tachyon condensation is articulated; the unstable brane system, by virtue of tachyon condensation, supports the cosmological evidence. Superluminal physics is one of the most astonishing fields advancing the pace of cosmological research. Turning to GTR for cosmological applications, tachyons have been scrutinized in Schwarzschild, Kerr–Newman, and FLRW metric spaces. Tachyons in the Friedmann universe correspond to different applications, such as measurement of the Hubble constant, big-bang theory, and the expansion of the universe. Some of the experimental evidence provided by GRS 1915+105, SN 1987A, GRB 030329, GW 170817, etc., for the existence of tachyons is also analyzed. Einstein's postulate of the constancy of the speed of light is commented on with respect to the highest approachable speed observed by a particular observer.
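As a baseline for the transformation relations the review examines, the standard subluminal velocity-addition law can be sketched (in units with c = 1); the superluminal generalizations discussed in the review are not reproduced here.

```python
# Standard relativistic velocity addition, w = (u + v) / (1 + u v / c²),
# in units with c = 1. This is the subluminal baseline that the reviewed
# superluminal transformation factors generalize.

def add_velocities(u, v, c=1.0):
    return (u + v) / (1.0 + u * v / c**2)

w = add_velocities(0.9, 0.9)
print(round(w, 4))  # 0.9945: composing subluminal speeds stays below c
print(w < 1.0)      # True
```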
Preprint
Full-text available
The analysis of the reflection of light from a moving mirror, as derived by Einstein, also applies to a corner reflector. This paper presents a Taylor–Fourier series expansion of Einstein's formulas and identifies a discrepancy in Dashchuk's second-order analysis, as described in "Transverse Doppler effect with laser light in a reflection system," Proc. IEEE, 57(12), 2148–2149 (1969). A geometric derivation of the reflection law, avoiding Lorentz transformations, is provided. Finally, a coordinate-independent construction for graphing the reflected beam's direction, given the incident beam and corner reflector velocity, is introduced.
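The zeroth-order ingredient of such an expansion, the textbook frequency shift for light reflected at normal incidence from a mirror approaching with speed β = v/c, can be sketched as follows; the paper's second-order corner-reflector formulas are not reproduced here.

```python
# Textbook double-Doppler shift for normal-incidence reflection from a
# mirror approaching the source at beta = v/c: f' = f (1 + β) / (1 - β).
# Illustrative baseline only, not the paper's corner-reflector expansion.

def reflected_frequency(f, beta):
    return f * (1.0 + beta) / (1.0 - beta)

print(round(reflected_frequency(1.0, 0.1), 4))  # 1.2222
```

The shift is doubled relative to a moving source because the mirror both receives and re-emits Doppler-shifted light.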
Chapter
We show the basic principle of celestial mechanics. First, we state Newton’s three laws of motion. Next, we list various conserved quantities derived from the laws of motion. Then, we mention the law of universal attraction, which plays a key role in describing the motion of celestial bodies. Also, we rewrite the law as a gravitational field equation. Finally, we explain the transition from Newtonian mechanics to Einstein’s general theory of relativity.
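The law of universal attraction mentioned above can be sketched numerically; the masses and distance below are approximate Earth–Moon values used purely for illustration.

```python
# Newton's law of universal attraction, F = G m1 m2 / r², in SI units.
G = 6.674e-11  # gravitational constant, m³ kg⁻¹ s⁻²

def gravitational_force(m1, m2, r):
    return G * m1 * m2 / r**2

# Approximate Earth and Moon masses and mean separation:
F = gravitational_force(5.972e24, 7.348e22, 3.844e8)
print(f"{F:.3e}")  # ≈ 1.982e+20 N
```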
If, for example, X = Y = Z = L = M = 0 and N ≠ 0, it is clear from symmetry considerations that when v changes sign without change of its numerical value, Y′ must also change its sign without changing its numerical …