International Journal of Cosmology, Astronomy and Astrophysics

ISSN: 2641-886X

Review Article

Everything is Probabilistic Spacetime: An Integrative Theory

Dennis M. Doren* and James Harasymiw

Lake Mills, Wisconsin 53551 USA

*Corresponding author: Dennis M. Doren, Retired, Lake Mills, Wisconsin 53551, USA, E-mail: dmdoren@yahoo.com

Received: September 25, 2021 Accepted: October 21, 2021 Published: October 30, 2021

Citation: Doren DM, Harasymiw J. Everything is Probabilistic Spacetime: An Integrative Theory. Int J Cosmol Astron Astrophys. 2021; 3(1): 130-144. doi: 10.18689/ijcaa-1000127

Copyright: © 2021 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Abstract

This article describes a new theory formulated to improve our understanding of cosmological phenomena. First, an enumeration is offered of shortcomings of current cosmological theories. Second, the components of the new theory are delineated. Third, the theoryʼs explanations of poorly understood cosmological phenomena are presented. Finally, numerous testable predictions are described that differentiate the theory from existing theories.

Keywords: Cosmology theory; spacetime; dark matter; dark energy; multiverse; probabilities

1. Introduction

All useful scientific theories are designed to summarize current knowledge and offer testable predictions in the pursuit of furthering our knowledge of a certain topic. Any failure within either role of a theory diminishes its utility. Even well-accepted theories are rightfully abandoned over time as new observations remain unexplained or even contradict predictions. Changes to the theories are then required, either through a patchwork of modifications to the theory or through a complete change of theoretical paradigm.

This article begins by delineating numerous failures of current cosmological theories and constructs. This collective set of findings serves as the rationale for the development of a new paradigm, one that more accurately and comprehensively describes the universe as it has been most recently observed.

Following the introductory enumeration of failures is the presentation of such a new theory. The new theoryʼs five principles are described. Explications of numerous recently discovered and poorly understood cosmological phenomena are then presented, demonstrating that the theory meets the first requirement of a useful theory. A set of testable predictions from the theory is then offered, demonstrating that it meets the other requirement for utility.

2. Failures of Current Theory and Theoretical Constructs

The following describes shortcomings of currently popular theories and theoretical constructs. This review serves as the rationale for why a new cosmological theory should be considered.

2.1. Theories: String theory and M-theory

String theory, which has been studied for about 40 years, is an attempt to reconcile general relativity (and its formulation of gravity) with quantum physics. The theory states that all objects in the universe are composed of vibrating filaments (one-dimensional strings) of energy. M-theory (first presented in 1995 by Edward Witten) postulates a similar idea, that the most fundamental entity in the universe is a membrane of energy (typically thought of as a two- or five-dimensional entity, though a “brane” can be of any dimension). Both of these theories view their smallest entity as sitting in a background of spacetime.

String theory and M-theory both require several (usually hidden) dimensions beyond our usual three plus time. Most string theories, as well as M-theory, also state that there is a connection between bosons and fermions called supersymmetry. Supersymmetry is the idea that for every known boson or fermion there is a corresponding “superpartner” whose spin differs by a half-integer.

String theory attempts to account for gravity by postulating that certain string vibrations correspond to the “graviton”, effectively a quantum entity that transmits the force of gravity. Depending on the permutations in the attributes considered, string theory can also accommodate parallel universes, the holographic principle (where the information in a volume of space can be related to information on the surface of that volume), and the anthropic principle (where the fact that humans exist is used to explain certain physical properties of the universe).

Although these theories have received a great deal of attention, research has not been supportive. Experiments at the Large Hadron Collider have failed to find evidence of supersymmetry [1][2]. To date, there is no evidence for the existence of a particle that transmits the force of gravity (the “graviton”). Likewise, there is no research evidence supporting M-theory [3]. While both types of theories have some mathematical backing, neither has to date gained empirical support.

Of relevance to the new theory presented later in this article, both string theory and M-theory describe the most basic entity in the universe (strings or branes) as existing within the background of spacetime. Spacetime plays a passive “container” role. Not all current theories view spacetime in this way.

2.2. Theory: Loop quantum gravity

The theory of loop quantum gravity (LQG) is an attempt to bridge the gap between general relativity and quantum mechanics. Within LQG, spacetime is not a container but a granular structure of tightly intersecting loops. This quantum structure of spacetime is what gives rise to gravity in the presence of mass.

From the perspective of this review, a weakness of LQG is that it is not a full cosmological theory. It was designed for the far more specific purpose stated in the first sentence of this section. Hence, at best, it is quite limited in its ability to explain the vast number of cosmological phenomena beyond gravity. To its credit, the theory requires a positive cosmological constant, and studies have indicated the appropriateness of one [4][5], a requirement consistent with the observation that the universe is expanding.

2.3. Theoretical construct: Dark matter

Research findings are consistent in indicating there is more gravity in the universe than we can account for based on observable matter. Stars on the outer portions of spiral galaxies orbit too quickly around their galactic centers; they should be flung away from their galaxies if held in place only by the gravity of observable matter. The degree to which gravitational lensing occurs is also greater than known sources of gravity can support. The same shortfall in sources of gravity appears in the formation and evolution of galaxies, their motion within galaxy clusters, and even the location of mass during galactic collisions [6]. The idea that there is more gravity in the universe than can be accounted for by observable mass is substantially undisputed.
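To make the scale of that discrepancy concrete, the sketch below contrasts the orbital speeds that observable matter alone would permit with the roughly flat speeds actually measured in spiral galaxy disks. It is an illustration only; the enclosed luminous mass and the radii are hypothetical round numbers, not values from any particular galaxy.

```python
# Illustrative only: why flat rotation curves imply "missing" gravity.
# All numbers are round, hypothetical values chosen for the example.
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # kg
KPC = 3.086e19           # m

M_visible = 1e11 * M_SUN # hypothetical luminous mass enclosed within the inner disk

for r_kpc in (10, 20, 40, 80):
    r = r_kpc * KPC
    v_kepler = math.sqrt(G * M_visible / r)   # expected speed if only visible mass acts
    print(f"r = {r_kpc:3d} kpc  ->  v_Keplerian ~ {v_kepler/1e3:5.0f} km/s")

# Observed disk speeds typically stay near ~200 km/s far beyond the visible disk,
# instead of falling off as 1/sqrt(r) as the numbers above do.
```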

Multiple conceptualizations of “dark matter”, a hypothesized type of matter that interacts with everything else only through gravity, have been theorized as the source of this extra gravity. The most common candidates for dark matter particles are weakly interacting massive particles (WIMPs) [7] and axions [8]. Many experiments have been undertaken to detect dark matter particles directly, but none has succeeded [9][10]. Null findings are not proof that something does not exist, but two decades of null empirical outcomes would seem to suggest it is time to move on.

There is also evidence contrary to various conceptualizations of dark matter. One study examined synchrotron (radio) and thermal (X-ray) emanations from mass filaments between galaxy clusters, with filaments up to 50 million light years in length [11]. The findings indicated that none of five models of dark matter could account for more than a small portion of the observed emanations, whereas the more mundane phenomenon of magnetism could account for far more. The researchers concluded that dark matter could not be completely ruled out as related to the emanations, but the lack of support for any of the tested conceptualizations was dramatic.

2.3.1. Modified Newtonian Dynamics (MOND) theories as an alternative to “dark matter”: The main alternative to dark matter formulations in explaining where the “extra” gravity comes from is the family of modified Newtonian dynamics (MOND) theories, more generally known as modified gravity theories [12]. The essence of these theories is that the mathematical relationship among mass, distance, and acceleration needs to be altered for very small accelerations. Some success was initially found with proposed formulaic alterations (such as using the square of the centripetal acceleration, rather than the centripetal acceleration itself as in Newtonʼs second law) when measuring the gravitational effect on a star near the edge of a galaxy. Recent findings involving gravitational waves, however, have been contrary to many (though not all) MOND theories. MOND theories typically predict that the speed of light and the speed of gravity should differ, but those speeds were found to be the same [13]. MOND theories also seem successful when applied to galaxies but not to galaxy clusters, and hence need further theoretical modifications to be viable [14].
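For reference, the flavor of the MOND approach can be seen in its low-acceleration ("deep-MOND") limit, sketched below. This is only the standard textbook relation attributed to Milgromʼs formulation; the specific interpolation functions used in the literature vary, and no particular paperʼs alteration is reproduced here.

```latex
% Deep-MOND limit (accelerations a << a_0): Newton's a = GM/r^2 is replaced by
% a^2 / a_0 = GM / r^2. For a circular orbit, a = v^2 / r, so
\[
  \frac{\left(v^{2}/r\right)^{2}}{a_{0}} = \frac{G M}{r^{2}}
  \;\;\Longrightarrow\;\;
  v^{4} = G M a_{0},
\]
% i.e., the orbital speed becomes independent of radius (a flat rotation curve),
% with a_0 ~ 1.2 x 10^-10 m/s^2 the empirically fitted acceleration scale.
```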

2.4. Theoretical construct: Dark energy

The universe is expanding, and at an increasing rate. The name commonly given to the cause of this accelerating expansion is dark energy, though there is no consensus and no empirical evidence about what dark energy is or how it works. Its nature is completely unknown.

Theory and empirical evidence strongly indicate the expansion of the universe occurs through a metric change (i.e., the expansion occurs through ongoing changes in distances between points rather than through any type of acceleration of objects themselves). Theory also mandates that the expansion remains consistent with the cosmological principle (i.e., the expansion occurs evenly across the universe when viewed on a large enough scale). How a “repellent force” (the usual descriptor of what dark energy is) can exist throughout the universe and exist so evenly is completely unknown. “Dark energy” is simply the name given to describe “whatever it is” that drives the expansion of the universe.

3. Unexplained Empirically Documented Phenomena

The following are examples of recent empirically determined observations that were not anticipated by mainstream cosmological theory. These examples therefore illustrate further shortcomings of current theory.

3.1. The Hubble tension

For much of the past decade, empirical findings concerning the rate of expansion of the universe (called the Hubble constant, H₀) have not supported the cosmological principle. H₀ has repeatedly been measured at one of two different values, depending on where and how the measurement was made, rather than the single value supposedly mandated by the cosmological principle [15]. The idea that both values could be correct while maintaining the cosmological principle has not been incorporated into current cosmological theory.

3.2. Supermassive black holes in the early universe

Supermassive black holes (SMBHs) are believed to be common in the universe; current theory holds that most galaxies have an SMBH at their center. These SMBHs are thought to have developed initially through the collapse of a large star, followed by the absorption of surrounding mass and collisions with other black holes. The universe is old enough to allow for the growth of most SMBHs through the combination of these phenomena.

That time frame is insufficient, however, for explaining the SMBHs that have been found in the early universe. There simply was not enough time to grow these early-universe SMBHs using only the previously stated mechanisms, despite the fact that the early universe was significantly denser than it is today. Some of the discovered very early SMBHs have a mass equivalent of over 1 billion suns [16], with the largest known early-universe SMBH (in the quasar J0313-1806) having a mass equivalent of 1.6 billion solar masses. It has been calculated that J0313-1806 would have needed a starting mass of at least 10,000 suns [17], but the largest seed black hole from the collapse of an early massive star or star cluster can only be up to a few thousand suns in mass. As the discoverers of this massive black hole wrote, “the existence of such a massive SMBH just ∼670 million years after the big bang challenges significantly theoretical models of SMBH growth.” These SMBHs in the early universe are too massive to be explained using the star collapse, mass absorption, and black hole collision mechanisms [18].
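To illustrate why that time frame is so tight, the sketch below assumes uninterrupted Eddington-limited growth with a radiative efficiency of 0.1 (an e-folding time near 50 million years) and asks what seed mass would have been needed by roughly 670 million years after the big bang. The accretion start times are hypothetical simplifications; the point is only that even generous growth histories demand seeds of thousands to tens of thousands of solar masses.

```python
# Rough sketch of the timing problem for early SMBHs such as the one in J0313-1806.
# Assumes continuous Eddington-limited growth with radiative efficiency 0.1;
# the accretion start times below are hypothetical simplifications.
import math

T_EFOLD_YR = 4.5e8 * (0.1 / 0.9)      # Salpeter e-folding time, ~5e7 yr for efficiency 0.1
M_FINAL_SUNS = 1.6e9                  # reported mass of the J0313-1806 black hole
AGE_AT_OBSERVATION_YR = 6.7e8         # ~670 million years after the big bang

for start_myr in (50, 100, 200):      # hypothetical times when accretion begins
    growth_yr = AGE_AT_OBSERVATION_YR - start_myr * 1e6
    seed_suns = M_FINAL_SUNS / math.exp(growth_yr / T_EFOLD_YR)
    print(f"accretion from {start_myr:3d} Myr -> seed of ~{seed_suns:,.0f} solar masses")

# Even uninterrupted Eddington growth implies seeds at or above the few-thousand-solar-mass
# ceiling cited for star or star-cluster collapse, and far above it if accretion starts
# late or is ever interrupted.
```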

To address this shortcoming in our understanding of very early SMBHs, a different mechanism altogether is needed. One such mechanism has been suggested: the direct collapse of a massive nebula into a black hole, skipping the usual intervening steps of a star forming, burning its fuel, collapsing to a black hole, and only then beginning its growth to supermassive size. Two different mechanisms have been hypothesized for such a direct collapse to occur.

The first involves a single very large nebula that, due to its own mass and gravity, collapses directly to a black hole. The problem with this concept is that the hypothesized nebula needs to have had three specific conditions that, in combination, are thought quite rare [19][20], though a few candidates have been discovered [21]. The finding that the necessary factors rarely occurred together seems to make this mechanism quite unlikely as a general explanation for very early SMBHs.

A second formulation of the direct collapse hypothesis avoids the rare requirements. This formulation involves two early very large nebulae, one with stars and one without, with the latter being accreted very rapidly into the former. Based on simulations, such a process causes the former nebula to collapse into a massive black hole [22]. To date, however, no observations have been made suggesting this process ever occurred, let alone many times.

Hence, in total, there is no generally accepted understanding of the development of SMBHs in the early universe. But we know they were there.

3.3. Filaments have angular momentum

We have known for quite some time that at least most things in the universe have angular momentum (spin). That general understanding has recently been extended to the largest structures in the universe: cosmic filaments. Research found that galaxies both orbit the centers of their filaments in corkscrew-like helical orbits and fall towards the galaxy clusters at the end of each strand. Additionally, filaments that ended at more massive clumps of galaxies seemed to rotate faster [23]. There is no generally accepted explanation as to why it seems all massive objects in the universe rotate.

3.4. Magnetism in intergalactic space

As described above, evidence now exists of magnetic field lines stretching between galaxy clusters [11], with that research involving filaments up to 50 million light years in length. (This discovery followed, by about a year, the discovery of what had been the largest known magnetic field, one spanning the entire 10-million-light-year length of a filament [24].) The researchers postulated that this magnetism is a remnant from the big bang.

Very recently, mathematical physicists were able to show that Maxwellʼs equations describing classical electromagnetism and Einsteinʼs general relativity field equation could be portrayed by a single equation encompassing both electromagnetism and gravitation [25]. Their conclusion was that electromagnetism is a property of spacetime itself, a conclusion that seems to indicate that magnetism exists everywhere in the universe (though we may not yet have the technology to detect it).

Both the empirical findings involving filaments and the mathematical determination indicate that magnetism is quite prevalent in the universe, perhaps everywhere. The conclusions drawn by the observational and the mathematical researchers differ, however: the first set of researchers expressed the idea that magnetism in spacetime is likely primordial in origin (i.e., caused by events very soon after the big bang, and now residual), while the second set viewed spacetime magnetism as inherently present everywhere no matter the specific events immediately following the big bang.

Given the recency of these empirical results, there has not yet been resolution concerning these different views of the source of the universeʼs magnetism. Theories concerning the universeʼs magnetogenesis date back to 1973 [26], though our ability to demonstrate the pervasiveness of magnetism in the universe has only very recently begun to open the door for testing different theories. It would appear, though, that any useful cosmological theory would need to account for the prevalence of magnetism throughout the universe.

4. Addressing Current Theoryʼs Failures: A New Theory

The following describes a theory, termed the “probabilistic spacetime theory” (PST), that attempts to address the failures of theory and theoretical constructs enumerated above, to incorporate very recent empirical discoveries, and to make predictions that will be useful for future research endeavors (i.e., that show the theory is testable). There are five central principles to the PST, presented below with brief accompanying explanations. More expansive discussion of how the components work together is offered in the subsequent section, where the theory is used to explicate the existence and mechanisms of various cosmological phenomena. Because the PST integrates findings from many empirical, observational, and analytic sources, the mathematical bases for various assertions of the theory stem from other researchersʼ work and are referenced as such throughout this article.

4.1. Principle 1: Spacetime is the fundamental entity of the universe

String theory and M-theory presume that spacetime is the container for strings or branes, and that strings or branes are the fundamental entities that give rise to our macroscopic reality. From the perspective of the PST, that shared assumption, that the fundamental entities are separate from spacetime, is a significant flaw in those theories. The PST rejects the presumption that spacetime is just a container for what is truly fundamental.

The primary principle of the PST is that spacetime is an energy field composed of the most fundamental entities in the universe. These entities are amorphous segments of energy of all types. The energy that is spacetime is fundamental in that everything else in the universe derives from it (as will be described further below). Any single quantum of spacetime, a single “probability”, is essentially a probabilistic portion of anything we can observe. A “probability” can be the fundamental essence of a virtual particle, a charge, a spin, or mass. (That the characteristics of charge and spin can exist separately from elementary particles has been empirically demonstrated [27][28][29]. Sometimes described as ghost particles, chargons and spinons seem to exist independently of the particles to which their characteristics are usually attributed. All that is being said here is that this separation exists at the most fundamental level in the universe.) Probabilities have no specific form (unlike a string or a brane), and are perhaps better conceptualized as a mathematical (wave) function than as a measurable entity. A probability never involves zero energy (see principle 2 below), but its energy can be as low as asymptotically approaching zero. At a high enough energy level, a probability can phase into mass (as described in principle 4 below).

As stated above, each probability is only a fragment of the types of things we find observable. Probabilities do not exist in the same way we think of even the elementary particles as existing. Analogous to how the strings of string theory are conceptualized as being of a lesser type than the rest of our reality (for strings, by being of a smaller dimension), probabilities are of a lesser type than the rest of our reality. That is why they were given the label “probabilities”: to indicate their “greater than zero but less than whole” nature. They are only fundamental segments that compose the rest of what we know and are not accurately conceptualized as smaller forms of our larger reality.

Spacetime, herein reconceptualized as the probability field, is theorized to be of a dual nature: it is both quantum and wave function. This is meant as completely analogous to what has been empirically demonstrated concerning the duality of photons [30]. Unless the local probability field goes through a phase change (sometimes inclusive of symmetry breaking, as described under principle 4 below), the field remains probabilistic in nature. Since probabilities are wave functions (as well as quanta), the probability field necessarily involves a constant exchange of probabilistic energy across the quanta of the field; each wave function is constantly exchanging energy across the rest of the field. (Analytical research has shown that a spacetime plane wave cannot have uniform energy density [31].) As with an electron cloud, the exact location of a probability is always nebulous, with scattered likelihoods spread across the wave function. The overlapping of each probability wave function with its neighbors is essentially constant, though the degree of overlap (which equates to the degree of energy sharing) constantly varies, given that all the energy involved is probabilistic. Spacetime is a constant sharing and swirling of this probabilistic energy. (The mathematical justification for this statement is referenced near the end of section 4.4.3 below.)

4.1.1. Comparisons to other spacetime conceptualizations

4.1.1.1. Wheelerʼs quantum foam: The above description may sound the same as Wheelerʼs view that over sufficiently small distances and sufficiently brief intervals of time the “very geometry of spacetime fluctuates” [32]. The PST and Wheelerʼs hypothesized “quantum foam” of virtual particles agree on the existence of a universal sea of virtual particles. That is where the similarity ends, however. Wheelerʼs conceptualization was that spacetime is composed of quanta, and that the quantum foam served to bring the observable universe into existence. The PST views virtual particles as derived from a still more fundamental energy of spacetime that is both quantum and wave function in nature. Moreover, virtual particles are seen neither as the cause of nor a necessary step in the formation of the observable universe.

4.1.1.2. Silverberg and Eischenʼs field theory: The PST concept of spacetime as fundamental and composed of energy segments has similarities to the field theory postulated and investigated by Silverberg and Eischen [33]. Those researchers explored a field theory that (a) recognized vector continuity as a general principle, (b) conceptualized space-time as a 4D energy vector field, (c) found that the vector continuity equations reduce to wave functions, and (d) treated “fragments of energy” as exciting the local vector field. The researchers found utility in their field theory in that it successfully predicted both the precession of Mercuryʼs orbit and the same bending of light as predicted by general relativity.

The PST overlaps with the foundation laid by the Silverberg and Eischen field theory (FT). Their fundamental “fragment of energy” is described quite similarly to the PSTʼs probability: “We refer to the whole as the 4D energy vector field or just as energy. We shall refer to the component parts of the whole as fragments of energy” (p. 490)…. “As building blocks, the fragments of energy depart from the particle and the wave conceptions. The particle is a source located along a space-time line and not elsewhere, whereas the wave is missing the source and it exists everywhere else. In contrast, the fragment of energy has both a source point and it exists everywhere else” (p. 497). As stated above, the PST posits that probabilities (which are energy) are both quantum and wave function in character.

Silverberg and Eischenʼs “fragment of energy” is described as having no shape. Again, as stated above, the PST quite specifically posits no shape for its probability. Both are indefinite because of their nature, being simultaneously quantum and wave function.

Likewise, the Silverberg and Eischen field theory uses a 4D flat metric (as opposed to general relativityʼs 4D curved metric or the Newtonian 3D flat metric). The PST agrees with the description of their theoryʼs metric (quoted from p. 498):

“ds² = dx₁² + dx₂² + dx₃² + dx₄²   (dx₄² = −c²dt²)
[rectangular coordinates]

ds² = dr² + r²(dθ² + sin²θ dϕ²) − c²dt²
[spherical coordinates]”.

There are fundamental differences between the theories, however. First, the PST does not view the relevant field as something that “blankets space-time” (p. 490) or “drape(s)… over the space-time domain” (p. 491), but as spacetime itself. The PST views the “fragments of energy” (what the PST calls probabilities) as truly fundamental to everything in the universe, while Silverberg and Eischenʼs FT sees such fragments as sitting within a background of spacetime. Second, the reduction of the vector continuity equations to wave functions was quite important to demonstrating the utility of Silverberg and Eischenʼs FT, but from the PSTʼs perspective this is only part of the full description of the quantum universe. As will be described in detail below, the quantum nature of spacetime (what Silverberg and Eischen called “a source point”), as distinct from its wave function nature, is of high relevance in explaining phenomena such as the expansion of the universe, the rotation (angular momentum) of large bodies of mass, and why black holes cannot be singularities.

In summary, although the PST and the Silverberg and Eischen FT overlap, the PST goes further in describing the fundamental essence and mechanics of spacetime as well as explicating a host of cosmological phenomena to which the FT has not been applied.

4.1.1.3. Loop quantum gravity: Spacetime as quantum is also postulated by loop quantum gravity. The PST requires spacetime also to be treated as composed of wave functions, and not necessarily looped. Loop quantum gravity, being limited to its purpose of bridging general relativity with quantum mechanics, does not involve fundamental entities with charge, spin, or any other consideration within the Standard Model.

4.2. Principle 2: Once a quantum of the probability field exists, it cannot be destroyed

Once a quantum of the probability field exists, the first law of thermodynamics applies. Its energy cannot be destroyed. Since each quantum of the field is energy, and does not just contain it, no quantum of the field can be destroyed once it comes into existence. The energy in any given quantum of the field can increase and decrease, but never be brought to zero. The principle that quanta of spacetime cannot be destroyed is of importance in addressing issues such as black holes and the future of the universe, as described later in this article.

4.3. Principle 3: All fields are derivative from the probability field

As described above, the probability field is a constantly swirling field of energy. Each component of that field, each probability, is probabilistically related to our larger reality, but not composed of any complete form of our larger reality.

As mentioned above, the charge of a particle can be a characteristic independent of the particle itself (as can the spin) [27][29]. Therefore, the field, as a swirling set of probabilistic wave functions, necessarily involves the constant churning of some charged entities (among all other types of entities).

This constant swirling of probabilistic wave functions with a charge causes electromagnetism everywhere there is spacetime. Put another way, the PST posits there is an electromagnetic field everywhere in the universe, directly derived from the nature of spacetime itself. (This article was already being written when Lindgren and Liukkonen published their finding that Maxwellʼs equations concerning electromagnetism and Einsteinʼs equations from general relativity are linked [25]. Those investigators concluded that “our research shows how electromagnetism is an inherent property of spacetime itself…. Electric and magnetic fields represent certain local tensions or twists in the spacetime fabric”. That conclusion is very close to what the PST states.) This magnetism does not require charged particles moving through space, just the movement of charged probabilistic energy that reflects the swirling of spacetime itself.

That all fields are derivative from the probability field is true of the Higgs field as well. The existence of the Higgs boson, a requirement of the Standard Model, was demonstrated about 50 years after it was hypothesized by Peter Higgs and colleagues. The empirical discovery of the Higgs particle was a major accomplishment, with the presumption regularly being made that the particle (or field) is something contained in spacetime, as opposed to being a characteristic of it or directly derived from it. The PST interprets the empirical support for the Higgs field otherwise. The PST shares the idea that a critical process involving the local system energy causes mass (described relative to the Higgs field as symmetry breaking). Where the PST differs from descriptions of the Higgs field is in whether this process happens distinctly from the characteristics of spacetime itself. The generation of the property of mass associated with the Higgs field is instead seen as resulting from sufficient energy in one local volume of the probability field. In other words, the empirically supported workings of the “Higgs field” are accepted, but the underlying mechanism is not seen as ultimately distinct from what spacetime does itself. The PST states there is no separate Higgs field except as derived from the local probability field. The symmetry breaking attributed to the Higgs particle reflects the phase change of the probability field: the local fieldʼs becoming singularly defined as compared to its baseline state of probabilistic energy. See principle 4 below for more detail about, and the mathematical underpinning of, phases of the probability field.

4.4. Principle 4: The probability field has phases

As indicated above, the probability field is always dynamic. Its degree of energy in any given location constantly varies. Like macro systems, the probability field also has phases depending on the energy within a given volume of the field.

4.4.1. Baseline: Its least energetic, baseline phase is what we typically think of as “space”. As in macro systems, there is a range in the energy of a volume of the field in which it remains in a baseline state.

The swirling nature of that energy, resulting from the overlap of probabilistic wave functions, guarantees that the higher end of that baseline range occurs with great frequency throughout the probability field. These higher energy variations of the local field are the fluctuations we describe as virtual particles. These fluctuations are all part of the fieldʼs baseline state.

4.4.2. Massless gauge bosons: In contrast, a complete phase change occurs at a higher intensity of probabilistic energy within a local volume of the probability field. With sufficient probabilistic energy, the massless gauge bosons (photons and gluons) form from the fieldʼs energy itself.

With the formation of those bosons comes a more complete mechanism (compared to the baselineʼs swirling probabilistic energy) for the transmission of the electromagnetic force and the strong force, in keeping with the nature of the massless gauge bosons. Photons only come into existence at greater local energy than is needed for the development of the electromagnetic field (as the electromagnetic field develops while the probability field is still in its baseline state), and so photons act to increase the electromagnetic fieldʼs ability to transmit its electricity and magnetism.

In contrast, the strong force transmitters (gluons) come into existence through a phase change involving less energy than the fermions between which the force is typically transmitted (because fermions have mass while gluons do not). Due to this, gluons can exist without the presence of those fermions. Since gluons interact with each other, they have been hypothesized to form what has been called “glueballs” when not interacting with fermions. The existence of glueballs was initially thought to have been demonstrated in 2015 [34], though empirical evidence of glueballs was first reported in August 2021 [35]. The idea that gluons can form glueballs is incorporated in the Standard Model, but the PSTʼs conceptualization of gluons being derived from a phase change of spacetime itself is new.

4.4.3. Mass: Another set of phase changes occurs at still higher concentrations of probabilistic energy. These phases involve the probability field forming the gauge bosons with mass, and all the fermions. To accomplish the formation of mass, the field must use what has been termed the Higgs field or boson. From the perspective of the PST, the formation of mass is the result of a phase change in the probability fieldʼs energy. When enough probabilistic energy is within a local volume of the field, that energy (being at least equal to that of the Higgs boson) phases into an object with mass. (This is a different emphasis, but conceptually not different from the theory of symmetry breaking associated with the Higgs; symmetry breaking is viewed as a type of phase change, where an amorphous state becomes uniquely defined.) It is presumed that different amounts of probabilistic energy (coupled with varying interactions with available gluons) are necessary to form the various particles with mass in the Standard Model, that energy always being above the energy of the Higgs.

As a basis for the above assertions, the PST borrows from the analytical work that demonstrated a spacetime wave can reproduce the observed properties of quantum matter [31]. That analysis starts with a wave with quantized vibrations of spacetime, finds that a spacetime plane wave cannot have a uniform energy density, and concludes that, because of this, matter appears in the spacetime wave as point particles. The concept of phase changes was not specifically mentioned in that analysis, but the mathematical representations used are in keeping with the PST and hence are referenced here. Specifically, the PST phases are indicated in that analysis when it references energy thresholds for the materialization of mass:

[if] the wave does not have sufficient energy for a particle of mass m₀ to materialize in V₀. We can say that within V₀ this wave has a potential point-like vibration in proper time with amplitude Δt₀ < 1/ω₀, but the particle cannot materialize fully because of the quantization rule (4.7). In order to observe one full quantum of mass m₀, we need a volume V containing sufficient energy: V = V₀/(ω₀²Δt₀²)” (p. 9). And the… “probability of [quantum mass] materializing at any given point in V₀ depends on the coefficient a₀ of the field function”, where a₀ = Δt₀ω₀ [ω₀ is the angular frequency of a proper time vibration]…. “only space and time are needed to describe the propagation of a free particle. Energy and momentum are no longer separate quantities from space-time” (p. 10).

That studyʼs finding that energy and momentum are not separate quantities from spacetime has been replicated numerous times. Examples include investigations of teleparallel gravity as an alternative to general relativity [36][37], and of energy and momentum as properties that only exist in relationship to spacetime structure [38].

Phrased in PST terms, the above findings support the assertions that (a) spacetime necessarily involves energy, (b) particles cannot materialize out of spacetime without sufficient local energy, (c) quantum mass can materialize when enough energy is present, and (d) the amount of energy at any given location constantly varies. Additionally, spacetime necessarily involves the characteristic of momentum along with its energy, a characteristic the PST operationalizes as the constant swirling of wave functions.

4.4.4. Superfluid?: The most energy-intense phase of the probability field likely occurs only within black holes and neutron stars. The common description of what happens within a black hole is that the gravitational force becomes so great that nothing can stop the progression towards a singularity. From the perspective of the PST (when coupled with the first law of thermodynamics), however, nothing can destroy a probability once it has come to exist. Hence, the PST mandates that the “bottom” of a black hole must still be finite. Every probability that was within the gravity well (i.e., the black hole) must still exist.

Their phase, however, would be different from any of the others already described. Due to the crushing of the probability fieldʼs energy (i.e., wave functions) into smaller and smaller volumes, the threshold to phase into mass would have been reached long before the fall into the black hole ceased. At the same time, as general relativity indicates, no mass can withstand such pressure. The very highly concentrated probabilities instead become nearly completely overlapping wave functions. As such, all independence becomes extremely blurred but never eliminated. What remains would seem to be in keeping with the phenomenon proposed by Migdal [39] as being at the center of neutron stars: a superfluid. Even if this is not an accurate description of this final phase state of the probability field, it seems clear there is a phase transition of the probability field in black holes much beyond what we typically see as mass.

4.5. Principle 5: Derivatives of the probability field cause it to be self-attractive

The probability field is composed of fundamental bits of energy. Directly derived from the probability field are an electromagnetic field and virtual gauge bosons. The electromagnetic field brings magnetism to everything in the universe, and the generated (virtual) photons facilitate the transmission of that magnetism.

Both derivatives of the probability field attract the charged energy of the field from which they are generated. Through this mechanism, from the probability field to its derivatives and back to the probability field, the field can be described as self-attractive; the probability field is self-cohesive. (This characteristic has been described in research investigating “dark matter” as its being self-interacting [40]. The PST does not accept the existence of dark matter but posits a relatively high energy volume of the probability field working in a similar way. This substitution is described in the next section.)

The greater the local energy of the probability field, the more likely is the generation of gauge bosons and electromagnetism, and the stronger is the cohesion across the local field. And when that local energy is great enough to generate or maintain the phase of mass, the surrounding field generates increased cohesion close to that found in mass as well.

5. Utility of this Theory: Ability to Explain Phenomena

5.1. High energy clumps of the probability field instead of dark matter

The PST finds no reason to hypothesize a new entity, dark matter, to explain the “additional gravity” observed in various astronomic phenomena. Instead, the additional gravity seen in galaxies and galaxy clusters beyond what is associated with their mass is the result of the cohesiveness of spacetime itself, and not WIMPs, axions, or any other construct thought to constitute “dark matter”.

General relativity describes gravity as the effect of the curvature of spacetime caused by mass and energy. The PST fully agrees but expands on the possible sources of such energy. The PST posits that the curvature of spacetime is caused not only by mass and radiation but also by the very nature of spacetime itself: probabilistic energy. Spacetime is not just a container that warps in shape due to mass and energy but is an active part of every system involving mass and radiation.

As described in principle 5 above, there is an ongoing interaction between the probability field and itself, and this is especially true when it is in its higher energy phases. This quite commonly results in a halo of relatively high energy probability field surrounding mass. That surrounding field shares a cohesion with the proximate mass that is greater than is typical within lower energy portions of the field. And because the field surrounding mass involves a relatively high degree of energy (or, perhaps better phrased, energy density), that portion of the field adds a further degree of curvature to spacetime, which equates to additional gravity. Hence, there is more gravity in most any system than can be accounted for just by the observable mass.

Put simply, the energy of spacetime tends to be greatest (densest) around mass. And that “clumping” of energy equates to more gravity than can be accounted for by the mass alone.
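As an illustration of this general claim (not a formula taken from the PST), the sketch below adds a toy halo, whose effective enclosed gravitating content grows linearly with radius, to a hypothetical galaxy and shows that the circular speed stops falling off and levels out, as observed. All numbers are illustrative round values.

```python
# Illustrative sketch: any extended distribution of gravitating energy around a galaxy
# adds to the enclosed "mass" and flattens the rotation curve at large radii.
# The halo profile here (effective mass growing linearly with radius) and all numbers
# are hypothetical toy choices, not formulas given by the PST.
import math

G, M_SUN, KPC = 6.674e-11, 1.989e30, 3.086e19

M_visible = 1e11 * M_SUN              # hypothetical luminous mass (concentrated centrally)
halo_per_kpc = 1e10 * M_SUN           # hypothetical effective halo "mass" per kpc of radius

def v_circ(m_enclosed, r):
    return math.sqrt(G * m_enclosed / r)

for r_kpc in (10, 40, 160):
    r = r_kpc * KPC
    m_halo = halo_per_kpc * r_kpc
    v_no_halo = v_circ(M_visible, r)
    v_with_halo = v_circ(M_visible + m_halo, r)
    print(f"r = {r_kpc:3d} kpc: visible only {v_no_halo/1e3:4.0f} km/s, "
          f"with halo {v_with_halo/1e3:4.0f} km/s")

# Visible-only speeds fall off as 1/sqrt(r); with the halo term they level off near
# sqrt(G * halo mass per unit radius), roughly 200 km/s here, mimicking flat curves.
```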

For halos around large bodies of mass, the gravitational curvature of spacetime from both the mass and the halo serves to increase the cohesiveness of the two. That mutual gravitational attraction works simultaneously with the virtual gauge bosonic and electromagnetic factors to ensure that nearly all large bodies of mass will have a surrounding halo consisting of high energy probability field. (To be clear, the PST posits that there is a probability field halo surrounding all mass, not just large bodies. We just have yet to develop the methodology or technology to see the effects of “the spacetime halo” surrounding smaller bodies of mass).

This cohesiveness to mass does not mean a halo is glued in place. In keeping with the concept (derived from Einsteinʼs field equations) that gravity has momentum, the probability field halo energy has momentum. (This assertion also reflects the analytic findings about the lack of independence of spacetime and energy-momentum cited in section 4.4.3 above.) This has been observed multiple times when galaxies collide [41][42][43], though the observation has been labeled as proof of “dark matter”. After galaxies travel into one another, their “dark matter” halos continue to travel in the galaxiesʼ original directions of movement. The “dark matter” halos pass through one another (without being slowed by the impact) even after the observable portions of the galaxies themselves have already changed direction in the process of falling towards each other [42][43]. What were termed “clumps of dark matter” have even been discovered quite distant from the observable galaxies from which they presumably came [41].

How “dark matter” can exist independently of its galaxies remains a mystery for dark matter theorists. From the perspective of the PST, the high energy probabilistic field halos demonstrate the same momentum described by Einsteinʼs field equations. The halos moved away from the observable galaxies they surrounded once the masses of the galaxies collided. Becoming separate from their original galaxies only required their momentum to be greater than their cohesion to the proximate mass. These high energy portions of the probability field then continue to exist as cohesive clumps of the probability field independent of mass, due to each clumpʼs own self-cohesiveness as described previously.

The Harvey et al. study [43] also concluded that “dark matter” was not in particle form. This latter conclusion was based on how the halos traveled through one another. That studyʼs finding is consistent with the PST when the theory states that there is no particle of “dark matter” – just spacetime itself.

Overall, the PST offers a source and mechanism for the additional gravity needed to explain various astronomic phenomena, does so without positing a yet-to-be-discovered substance beyond spacetime itself, and is more consistent with empirical findings than is at least the particle concept of dark matter.

5.2. Inpouring of the field instead of “dark energy”

The principles of the PST dramatically narrow the options for explaining the ongoing expansion of the universe. As the universe expands (increases in volume), there necessarily is more spacetime. And because the PST states that each quantum of spacetime is energy, adding new spacetime necessarily means adding energy to the universe. Since the first law of thermodynamics forbids the creation of energy out of nothing, any explanation of the expansion of the universe means explaining where the added spacetime and its energy come from.

Examining the expansion of the universe by starting with an investigation of the added energy avoids the issue of a repellent force called dark energy. Even if such a repellent force exists, the issue would remain of how the new spacetime energy comes to exist. (The well accepted and empirically supported idea that spacetime anywhere can develop virtual particles speaks to there being energy everywhere, even if the concept of a probability field is rejected in favor of something else such as loop quantum gravity. Likewise, the above-described consistent finding that spacetime and the energy-momentum tensor are inseparable suggests that the creation of new spacetime must involve additional energy.) The energy of new spacetime must come from somewhere.

5.2.1. E = mc²: There are only three options. One is that the additional energy is derived by converting existing mass. There is no evidence for this. This option would, in fact, seem contraindicated by the finding that the expansion of the universe is metric, occurring everywhere including in voids. Likewise, the fact that the expansion is occurring at an increasing rate despite the distance between large bodies of matter growing ever greater speaks to the implausibility of mass being the source of the new energy during the expansion.

5.2.2. Original energy spread very thin: The second option is that the existing probability field creates new probabilities (spacetime quanta) by spreading its limited energy thinner and thinner (and doing so at an accelerating rate over time) to create the new portions of the field. For this to be true, the field would need to have a mechanism to self-propagate new probabilities and to do so with incredible efficiency. The probability field for even just the observable universe went from something measured in cubic millimeters at most (immediately after inflation) to todayʼs observable spacetime volume of 3.566×10⁸⁰ m³ [44] using only the original amount of spacetime energy. And this would need to have occurred without leaving any significantly different feature in spacetime over the eons. During those eons, increasingly less energetic wave functions would have needed to generate an increasing number of new wave functions. To the contrary, as the existing wave functions became less and less energetic, it would seem inevitable that their ability to propagate more and more new wave functions would become lesser, not greater. The idea that probabilistic entities have been increasing their rate of self-propagation everywhere in the universe for billions of years by persistently acting contrary to their steadily decreasing energy seems beyond highly implausible.
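As a check on the quoted figure, the cited volume corresponds closely to a sphere with the commonly quoted comoving radius of the observable universe, roughly 46.5 billion light years; the short sketch below does that arithmetic under that assumption.

```python
# Back-of-envelope check of the quoted present-day observable volume (~3.566e80 m^3),
# assuming a sphere with a comoving radius of ~46.5 billion light years.
import math

LIGHT_YEAR_M = 9.461e15
radius_m = 46.5e9 * LIGHT_YEAR_M                 # ~4.4e26 m
volume_m3 = (4.0 / 3.0) * math.pi * radius_m**3
print(f"{volume_m3:.3e} m^3")                    # ~3.57e80 m^3, matching the cited figure
```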

5.2.3. From outside: If the energy for the new spacetime does not come from existing mass or energy, and the first law of thermodynamics has always been true throughout the universe, there is only one remaining option to explain from where the new spacetime energy comes. The additional energy comes from outside our universe. Interestingly, as described in multiple sections below, an external source of spacetime also explains cosmological phenomena besides the expansion of the universe.

What can we tell about an external source of spacetime? The expansion of our universe being metric seems of relevance: the new energy needs to be arriving in our universe at essentially all points. For that to occur, the external source of probabilistic energy field must surround our universe and be of greater dimension. A 5+-dimensional multiverse (or a single other universe of 5+ dimensions) that surrounds and contains all of our 4-dimensional universe would remain in contact with all points of our universe, no matter how much our universe expands. That would be true indefinitely, as no 4-dimensional volume once contained by a completely overlapping 5+-dimensional volume can fill that 5+-dimensional volume.

Given the external source is in contact with all points of our universe, and inpouring of new spacetime (probability field) occurs everywhere in our universe, it would seem the rate of expansion would remain the same everywhere throughout time. This is not what the PST predicts, however.

The PST acknowledges one intervening factor affecting the rate of energy inpouring. As with any inpouring process, obstructions change the direction and ultimate speed of the inflow. When the inpouring of probability field is obstructed, the local inpouring is slowed, which directly translates to a slower local rate of expansion. Such obstructions to the incoming field are the obvious ones: mass, radiation, and high energy clumps in our own universeʼs probability field.

The PST therefore predicts different expansion rates (measures of the Hubble “constant”) depending on the degree to which mass and high energy portions of the field are involved in the measurement of that rate. This is in fact what we observe [15]. Measurements involving large bodies of mass (galaxies and galaxy clusters) show slower expansion rates than measurements involving much smaller amounts of mass (e.g., a star or two).

5.2.3.1. Expansion in the early universe: In the early universe, the effect of “obstructions” was very pronounced. The universe was far denser than it is today, though dominated by radiation as opposed to matter. The probability field clearly involved far more energy per volume than it does today. That early radiation and probability field density served to obstruct the inpouring of new field. That is the PSTʼs explanation for why the rate of expansion in the early universe slowed during the initial few billion years: the inpouring of new probability field was slowed by substantial obstruction and was actively counteracted by the universe being what it was at the time (both in its density and its gravity).

As the density of the universe decreased over the initial billions of years, however, the inpouring (on average) had less interference and flowed more easily. The resultant expansion decreased the amount of resistance (i.e., the relative volume of obstructions compared to lack of obstructions) even further, such that the rate of expansion of the universe increased over time after the initial slowing period.

5.2.3.2. The long-term future of the universe: There is nothing known in this universe that would ever stop this increasing expansion. Unless something occurs in the surrounding multiverse/universe to stop the flow of new spacetime, the future of our universe is an ever-increasing expansion. This is not the same thing, however, as what has been described as the “big rip”. The PST does not predict, as do some dark energy theorists, that the expansion of the universe will eventually tear apart everything that exists, including particles, quarks, etc. The PST just says that the metric expansion will continue forever, or at least until a multiverse being “turns off its faucet”.

5.2.4. Summary of this section: The concept of dark energy should be discarded. It does nothing to explain the expansion of the universe, and more importantly it ignores the concomitant process of the ongoing creation of new spacetime energy. When we explain the expansion of the universe (including its different rates across the eons and in different locales) by starting with the question of where the additional energy comes from, both the source of the additional energy and the process of expansion can be explicated without the need to hypothesize something beyond spacetime itself.

5.3. The Hubble constant is not constant, but is predictable: Much has already been stated above concerning the expansion rate of the universe. As indicated, the PST specifically predicts that the expansion rate of the universe varies depending on the proximity of mass, radiation, and clumps of high energy probability field to where the rate is being measured. The rate is expected to be slower near large bodies of mass and will be measured as fastest in large voids [15].

A very large long-term study found such results. The latest findings from the Sloan Digital Sky Surveyʼs (SDSSʼs) two-decade mapping effort were announced during July 2020 [45]. Those results included a measurement of the expansion rate of the universe based on galaxy cluster patterns (due to baryon acoustic oscillations) across a large set of galaxy clusters (involving 4 million galaxies), a method that had not been used previously. Based on the involvement of the huge amount of mass in the assessment technique, the PST would have anticipated that the SDSS results would show a significantly slower rate of expansion compared to measurements involving a star or two. Such a prediction would have been accurate.

As summarized elsewhere, the expansion rate determined using single stars and binary star systems (as standard candles for measurement purposes, a methodology which shows an expansion rate of about 73 km/s/Mpc) is approximately 10% greater than both the SDSS result (of about 67 km/s/Mpc) and the result stemming from the use of the cosmic microwave background [15]. The latter measurements involve huge amounts of mass compared to one or two stars. That a study such as the SDSS survey would find the slower rate was predictable based on a publication referencing the PST one month prior to the announced SDSS results [15].
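The arithmetic behind the quoted gap is simple; the sketch below converts the two cited values into Hubble times and shows the fractional difference. Only the two H₀ values from the text are used, along with standard unit conversions.

```python
# Quick arithmetic on the two cited expansion rates: the ~10% gap and the
# corresponding Hubble times (1/H0).
KM_PER_MPC = 3.086e19        # kilometres per megaparsec
SEC_PER_GYR = 3.156e16       # seconds per billion years

for label, h0 in (("standard candles (stars)", 73.0), ("SDSS / CMB", 67.0)):
    h0_per_s = h0 / KM_PER_MPC                      # convert km/s/Mpc to 1/s
    hubble_time_gyr = 1.0 / (h0_per_s * SEC_PER_GYR)
    print(f"{label}: H0 = {h0} km/s/Mpc, Hubble time ~ {hubble_time_gyr:.1f} Gyr")

print(f"fractional difference: {(73.0 - 67.0) / 67.0:.1%}")    # ~9%
```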

The significant difference in empirical estimates of the Hubble constant, fully acknowledged and even given a moniker (“Hubble tension”), has not been explained by the researchers involved. In contrast, the PST sees no tension at all, as the PST does not presume that the expansion rate of the universe has been and still is a constant everywhere in the universe. The expansion rate of the universe follows a predictable pattern but is not the same everywhere. And the PST offers an explanation as to why: the amount of obstruction to the inpouring of new probability field energy affects the local rate of expansion.

5.4. Explaining supermassive black holes in the very early universe: As described above, there is no currently accepted understanding as to how supermassive black holes (SMBHs) came to exist in the early universe, though we have observed such SMBHs. The PST offers a new explanation by starting with the rejection of a common assumption.

A black hole is simply a gravity well where the gravity is strong enough to have an apparent horizon. We assign a mass to a black hole of a given size to describe the amount of mass needed to account for its gravity. Problems develop, however, when we presume this comparability means that same amount of mass was needed to form the well. Accounting for the presumed very early large amounts of mass is where other theories break down in explaining early-universe SMBHs.

From the perspective of the PST, no such presumption is made. Instead, the formation of very early SMBHs is thought to go back to the very beginning of the probability field itself. The big bang did not result in a completely smooth probability field; to the contrary, the cosmic microwave background shows variations in the field. A “lack of smoothness” in the field is the same thing as saying there were some clumps (volumes of higher energy) in the field. Clumps in the field existed literally from the beginning of the universe. These early clumps, these primordial volumes of higher energy (compared to background), were like all such clumps: volumes of the field that acted as gravity (i.e., they were gravity wells). They required no mass to form but nevertheless were the seeds of the first black holes in the universe. Primordial black holes came into existence essentially at the time of the big bang.

Inflation then expanded these clumps of the field, these gravity wells, along with everything else. What started incredibly small expanded dramatically at the rate of inflation. Primordial black holes got a huge boost in growth prior to there even being any nebulae or stars to collapse. After inflation, their growth continued by gorging on the highly energetic radiation all around. Then, after mass became prevalent in the still very dense universe, some of these black holes were able to continue to gorge themselves and collide with other black holes to become supermassive. Their formation virtually at the beginning of time and their existence during the period of inflation made becoming a SMBH in the early universe possible.

Again, based on the principles of the PST, the probability field itself is seen as the etiological source for a cosmological phenomenon. No as-yet-undetected particles or forces are needed for this explanation.

5.5. Filaments have angular momentum: A recent discovery that filaments have angular momentum [23] seems to demonstrate that essentially all massive objects in the universe rotate. There is no accepted theory as to why this is true.

The PST offers the following explanation. As stated in the explanation above concerning the expansion of the universe without “dark energy”, there has been an ongoing inpouring of new probability field into the universe literally since its beginning. That inpouring occurs everywhere, but its rate is affected by mass (and by radiation and high-energy field) in the local environment. The relative interference of huge bodies of mass causes the inpouring to swirl over and around that mass. Since no significant body of mass is perfectly smooth, there is always some degree of unevenness in the pressure the inpouring energy exerts on the mass. That pressure rarely acts exactly toward the center of the mass, so the mass is pressured to rotate.

Likewise, multiple bodies of mass in a shared gravitational field will come to experience that rotational pressure together, at least after billions of years of reacting to that persistent pressure. That mechanism is theorized to hold for a planet with its moon, protoplanets around a star, galaxies in a galaxy cluster, and so on. The filaments that make up the largest known structures in the universe are no different. Each separate set of massive elements of the cosmic web, each filament with its set of galaxies and galaxy clusters, spins as the inpouring of new energy dictates.

In summary, the reason all significant mass in the universe rotates is the same reason that the universe is expanding: the inpouring of new probability field (energy) generates both.

5.6. Magnetism is everywhere: The PST states that magnetism exists everywhere there is spacetime. This view of magnetism is supported by the overlap between Maxwellʼs equations and Einsteinʼs general theory of relativity [25]. Beyond that mathematical determination, the PST describes the mechanism by which the electromagnetic field is derived from all spacetime (i.e., the swirling of probability field energy). This theorized mechanism differs from any hypothesis describing the universeʼs magnetism as solely a remnant of the big bang [11][46] rather than something generated on an ongoing basis.

The electricity from the derived electromagnetic field also plays a significant role in the development of large bodies of mass. As planetary systems form, grains of dust orbiting a star bump into each other and initially stick together. As the clumps start to grow, compact, and harden, however, they tend to bounce off each other. Instead, electric charging between the clumps of matter [47][48] can cause them to adhere to each other. The cited researchers attribute that attraction to static electricity stemming from the collision and rubbing together of the matter, which may well be the case. The PST adds the facilitating effect of the electromagnetic field as a further contributor to the cohesion of the matter. The electromagnetism derived from the probability field thus plays a significant role in building the clumps of matter that eventually turn dust into planetary systems.

5.7. Summary of this section: In the section just completed concerning the explanatory power of the PST, various problems with theoretical constructs and unexplained phenomena were addressed. The PST consistently offered descriptions of those phenomena and their underlying mechanisms using just the theoryʼs five principles. Hence, it was shown to be a useful yet parsimonious theory.

6. Utility of this theory: Predictions for empirical test

The following offers predictions from the PST that seem testable. The purpose of this section is to demonstrate the potential utility of the theory in furthering our knowledge by generating predictions that differentiate the theory from other theoretical formulations.

1. The Hubble constant varies based on the degree to which mass is involved in the measurement. As described above, this formulation was first published in 2020 [15] and has since correctly predicted one major empirical finding [45]. There are other current theoretical explanations for the two persistently different figures for the expansion rate of the universe (i.e., for the “Hubble tension”): (a) weak primordial magnetic fields account for the difference by clumping protons and electrons into hydrogen, so that the light we measure originates closer than previously assessed, with that change in distance resolving the tension [49]; or (b) more accurate measurements will resolve the apparent discrepancy [50]. The PST clearly predicts something different from those explanations: that the expansion rate of the universe varies over a range depending on the proximity and volume of mass, radiation, and clumps of probability field (otherwise described as “dark matter”) involved in the measurement. This prediction is stated in a manner that is clearly testable (a sketch of one way it might be operationalized appears after this list).

2. The PST explains the finding that filaments rotate based on the inpouring of new probability field (energy) from all locations, which then moves over and around the mass, exerting pressure to rotate. Computer simulations could be run to see how filaments would move under the theorized mechanism, with the simulation results compared against the observational findings from the filaments themselves (a minimal toy version of such a simulation is sketched after this list).

3. The PST posits that electromagnetism exists everywhere there is spacetime. Given sufficient technology, no exception should be found. Additionally, the PST would be consistent with any finding that magnetism is a significant factor in forming a cosmological structure, as was indicated by the discovery that magnetism played a significant role in forming filaments [11].

4. “Dark matter” is typically presumed to involve some type of particle. A collection of such particles, and hence any large grouping of “dark matter”, would therefore have edges as distinct from the surrounding spacetime as those of ordinary matter. The PSTʼs view of volumes of high-energy probability field differs: their edges would be diffuse, because such a volume connects to the surrounding, less energetic field only through direct contact between different energy levels within the probability field (spacetime) itself. Studies of the sharpness versus diffuseness of edges where “dark matter” is said to exist would help differentiate the PST from particle dark matter theories (a sketch of one way such diffuseness might be quantified appears after this list).

5. As described above, when two galaxies collide, the “dark matter” halo moves beyond the galaxies in a way that indicates momentum. According to the PST, if mass is exposed directly to the baseline energy of the probability field, it will dissipate back into the field to some degree (analogously to how ice melts when exposed to warm air). This means that any time a galaxyʼs “dark matter halo” travels far enough beyond that galaxyʼs mass to leave it without cover, that exposure will result in a decrease in the galaxyʼs mass. The technology to measure this decrease likely does not yet exist, but the prediction clearly differentiates the PST from any dark matter theory and appears unique in cosmological theory.
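To illustrate how prediction 1 might be operationalized, the following sketch (in Python) fits a simple linear trend between independently obtained expansion-rate estimates and a proxy for the amount of mass involved in each measurement. Everything in it is an assumption made for illustration: the variable names, the choice of a logarithmic mass-scale proxy, and the numbers themselves are invented stand-ins, not data from any survey. Under the PST the fitted slope should differ significantly from zero; under a single universal Hubble constant it should be consistent with zero once systematics are accounted for.

```python
# Hypothetical sketch: test whether H0 estimates trend with a mass-involvement proxy.
# All numbers below are invented for illustration; real input would be a compilation of
# published H0 measurements, each tagged with the characteristic mass scale probed
# (taken here, as an assumption, to be log10 of that mass in solar masses).
import numpy as np
from scipy import stats

# (H0 in km/s/Mpc, log10 of mass scale probed) -- illustrative stand-in values only
h0_estimates = np.array([73.2, 72.8, 69.8, 67.4, 67.7])
log_mass_scale = np.array([1.0, 2.0, 12.0, 15.0, 22.0])

# Ordinary least-squares fit: H0 = a + b * log10(mass scale)
result = stats.linregress(log_mass_scale, h0_estimates)

print(f"slope b = {result.slope:.3f} km/s/Mpc per dex of mass")
print(f"p-value for b != 0: {result.pvalue:.3g}")
# PST expectation: b is significantly negative (more mass involved -> slower measured rate).
# Single-constant expectation: b consistent with zero once systematics are handled.
```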
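For prediction 2, a minimal toy version of the kind of simulation envisioned might look like the sketch below (Python). The "inpouring pressure" law, the lumpy body shape, and every constant are hypothetical choices made only to show the structure of such a computation; the theory itself does not specify them. The sketch discretizes the boundary of an irregular two-dimensional body, applies an inward pressure that varies slightly with direction, and sums the resulting torque about the center of mass; a nonzero total corresponds to the rotational tendency described above, and scaled-up versions of the same bookkeeping could be compared against observed filament spins.

```python
# Toy illustration: net torque on an irregular 2D body from a slightly uneven inward pressure.
# Everything here (shape, pressure law, constants) is a hypothetical stand-in, chosen only
# to show how such a simulation could be structured.
import numpy as np

n = 2000
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

# Irregular ("lumpy") boundary: radius varies with angle.
radius = 1.0 + 0.2 * np.cos(3 * theta) + 0.1 * np.sin(5 * theta)
x, y = radius * np.cos(theta), radius * np.sin(theta)

# Outward boundary normals (approximate, from the derivative of the boundary curve).
dx, dy = np.gradient(x), np.gradient(y)
normals = np.stack([dy, -dx], axis=1)
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

# Uneven inward pressure: baseline plus a small directional modulation (assumed form).
pressure = 1.0 + 0.05 * np.cos(theta - 0.7)

# Force per boundary element points inward (opposite the outward normal).
segment_length = np.hypot(dx, dy)
force = -(pressure * segment_length)[:, None] * normals

# Torque about the center of mass: sum of r x F (z-component in 2D).
cx, cy = x.mean(), y.mean()
torque = np.sum((x - cx) * force[:, 1] - (y - cy) * force[:, 0])
print(f"net torque (arbitrary units): {torque:+.4e}")
# A nonzero value illustrates how asymmetric pressure on a non-smooth body can drive rotation.
```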
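For prediction 4, one possible way to quantify "edge diffuseness" is sketched below (Python, with invented stand-in data). It fits the projected density near the claimed edge of a "dark matter" region with a smooth step whose width parameter controls how sharp the fall-off is; comparing the fitted width against the instrumental resolution, and against the width obtained for ordinary matter in the same system, is the kind of comparison the prediction calls for. The profile form, parameter names, and numbers are all assumptions for illustration only.

```python
# Hypothetical sketch: quantify how sharp the edge of a "dark matter" region is by fitting
# a logistic fall-off whose width parameter w controls edge sharpness.
# Small w -> sharp, particle-like edge; large w -> diffuse edge, as the PST would expect.
import numpy as np
from scipy.optimize import curve_fit

def edge_profile(r, amplitude, r_edge, w):
    """Projected density vs radius: smooth step that falls off around r_edge over width w."""
    return amplitude / (1.0 + np.exp((r - r_edge) / w))

# Invented stand-in data: radius (arbitrary units) and a noisy, gradually declining profile.
rng = np.random.default_rng(0)
r = np.linspace(0.0, 10.0, 60)
density = edge_profile(r, amplitude=1.0, r_edge=5.0, w=1.5) + rng.normal(0.0, 0.02, r.size)

params, _ = curve_fit(edge_profile, r, density, p0=[1.0, 5.0, 0.5])
amplitude_fit, r_edge_fit, w_fit = params
print(f"fitted edge radius: {r_edge_fit:.2f}, fitted edge width: {w_fit:.2f}")
# Comparing the fitted width w against the instrumental resolution (and against the width
# measured for ordinary matter in the same system) is the kind of test described above.
```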

The above predictions do not represent a comprehensive list of those stemming from the PST. They instead exemplify the predictive utility of the theory from a research perspective.

7. Improvements over existing theories and theoretical constructs

This article began with a brief critique of two existing mainstream cosmological theories, delineating their major shortcomings. In this section, now that the PST has been described for the reader, the improvements the PST offers over each existing theory are delineated in greater detail. In both cases, the PST is shown to present a simpler, and hence more parsimonious, picture of the universe while still maintaining empirical support and explaining a greater number of cosmological phenomena.

Similarly, the theoretical constructs of dark matter and dark energy were critiqued earlier in this article. In this section, a direct comparison with the relevant portions of the PST is made, demonstrating the greater explanatory and predictive power of the PST over the “dark” constructs.

7.1. As compared to string or M-brane theory: Both string and M-brane theories require a set of yet-to-be-discovered particles reflecting the concept of supersymmetry (a specific relationship between bosons and fermions). String theories also require a transmission particle for gravity (named the graviton). None of these things has empirical support. Both types of theories posit the existence of many dimensions beyond our usual three plus time (typically 11 dimensions in total), with all those beyond the usual four being “too small” to be detected, and, unsurprisingly, none has yet been detected. These theories have been applied to problems such as black hole physics and early universe cosmology but have mostly been successful as drivers of developments in pure mathematics. Their promise has been great as candidates for a “theory of everything” given their unified description of gravity and particle physics. However, their lack of empirical support and their inability to narrow their scope to single “choices” among possible details make the utility of these theories questionable.

In contrast, the PST involves no yet-to-be-discovered particles. Instead, the PST incorporates the extremely well-supported theory of general relativity (in its teleparallel equivalent form). The “added” gravity from high-energy clusters of spacetime is in keeping with the concept of a cosmological constant and offers an alternative interpretation of all research supportive of the existence of dark matter. In contrast to a postulated 11-dimensional universe, the PST view of our universe involves just the observable four dimensions (including time) we experience every day. And, in contrast to the as-yet empirically unsupported fundamental entities of strings or branes, the probability field (being a type of energy) reflects the well-accepted idea that spacetime contains enough energy to sustain a “foam” of virtual particles. Overall, the PST relies on fewer new constructs and is more congruous with established concepts.

Despite its newness, the PST offers explanations for numerous cosmological phenomena that string and M-brane theories have yet to address, such as: (a) the nature of and mechanism for what has typically been termed “dark matter”, (b) the reason for and mechanism underlying filament angular momentum, (c) the development of and mechanism for universal magnetism, and (d) why the Hubble constant varies. Explanations for these phenomena are all well beyond the current state of string and M-brane theories and yet are addressed by the PST with fewer assumptions and already with some empirical support.

7.2. As compared to quantum loop gravity theory: As stated in the initial critique of quantum loop gravity (QLG) theory above, that theory was not devised or promoted as a complete theory of universal phenomena. It is a theory of quantum gravity whose purpose is to bridge the gap between quantum mechanics and general relativity. The fact that it is based entirely on a hypothesized structure of spacetime makes it the only major cosmological theory that views spacetime as the foundation of all that exists.

Within QLG, spacetime is seen as having a solely quantum (point-based) structure. Both the QLG and the PST have found utility in adopting an added component to the usual assessment of gravity. (The QLG does this directly, by using a cosmological constant, while the PST does this indirectly, using relatively high-energy clumps of spacetime as the added factor.) Mathematical work has extended the QLG theory to explain cosmic inflation [51], to remove gravitational singularities from our understanding of black holes [52] (by treating black holes as quantum bridges, otherwise known as wormholes), and to address related areas, but it remains a limited theory whose relationship to most cosmological phenomena is still unknown despite having first been derived more than 30 years ago [53].

In comparison, the PST also starts with an aspect of spacetime (the metric tensor) as its foundation, but one quite specifically represented by both quantum structure and wave functions. The effect of the wave-function aspect of spacetime is the sharing of energy across probabilities, which results in (a) universal magnetism and (b) volumes of spacetime of higher versus lower density, which in turn result in (c) the development of gauge bosons, mass, and clumps of relatively high-energy spacetime that act in keeping with what has been termed dark matter. The QLG theory does not address those things.

The two theories also differ in how they treat gravitational singularities. The QLG theory eliminates gravitational singularities by instead viewing black holes as pathways to other universes; the QLG says there is no bottom to a black hole, just another opening somewhere other than our universe. The PST denies the existence of singularities far more simply, relying on the first law of thermodynamics: spacetime is energy, and energy cannot be destroyed. The inside of a black hole necessarily includes spacetime, which the gravity of the black hole crushes as far as spacetime can be crushed, but never to the point of zero energy. There must be a “bottom” to every black hole consisting of spacetime (though clearly in an exotic phase).

By hypothesizing that black holes are quantum bridges to other universes, the QLG theory shows a similarity with the PST in positing the existence of at least one other universe beyond ours. The PST found the inpouring of energy from another universe (more accurately stated, from outside our universe) to be explanatory of two very different phenomena: (a) the expansion of the universe, and (b) the angular momentum of filaments (and all other large bodies). Given the shared idea that something outside our universe exists, the PST uses that idea to explain more than the QLG theory does.

Finally, as stated previously, the QLG is not designed (at least yet) to be a comprehensive explanation of cosmological phenomena. Issues such as dark matter, the Hubble tension, how supermassive black holes could develop in the very early universe, and where the magnetism in intergalactic space comes from are all beyond current QLG theory. Those phenomena were all addressed by the PST, showing the greater comprehensiveness of the PST.

7.3. As compared to the theoretical constructs of dark matter and dark energy: The construct of dark matter is used to label the unknown source of the additional gravity needed to account for various astronomical observations. Numerous particles (e.g., WIMPs, axions) and cosmological entities (such as small black holes) have been hypothesized as the source of that required additional gravity, but none has been supported empirically despite years of effort.

The PST explains the “extra” gravity without hypothesizing any new particle. The source is simply higher-energy portions of spacetime. Additionally, the PST offers a description of the mechanism by which “dark matter” halos are formed and generally maintained around large bodies of mass, and even of how momentum within spacetime (i.e., gravityʼs momentum, an idea borrowed from general relativity) explains the astronomical observation that some large clumps of “dark matter” have been found independent of any large body of mass.

The concept of dark energy is a placeholder, suggesting a repellent energy source while offering no explanatory entity or mechanism for the well-established fact of universal expansion. In contrast to such a placeholder, the PST describes a mechanism to explain universal expansion (the inpouring of spacetime from outside the universe) and does so without hypothesizing any new particle or force.

8. Overall summary

Numerous shortcomings of current cosmological theories and theoretical constructs are known to exist and were delineated herein. A new cosmological theory, the probabilistic spacetime theory (PST), was then described that addresses those shortcomings and incorporates recent empirical discoveries.

The PST is a simpler picture of the universe than is suggested by most current cosmological theories. Spacetime, reconceptualized as the probability field, is not a container for the dynamic entities of the universe but is the essence of all things. Multiple fields are derivative of the probability field, as are the various elements of the Standard Model.

The theory completely avoided the nebulous constructs of dark energy, singularities, and gravitons. The consistent research failure to find the elements of dark matter was explained by offering a far different description of the phenomena that dark matter was hypothesized to explain. A resolution to the “Hubble tension” was not only offered, but the underlying mechanism was explicated. The same was true for primordial supermassive black holes and the magnetism surrounding cosmological filaments.

Numerous features within the PST have already found support. That support has been mathematical, observational, and experimental.

Mathematical support was found concerning multiple aspects of the PST:

1. Spacetime as an active field, with the model involving “fragments of energy”, was shown sufficient to account for the precession of Mercury and the bending of light we call gravitational lensing; this being supportive of the PSTʼs fundamental concept of probabilities.

2. Spacetime is inseparable from energy-momentum; this being supportive of the PST concept of spacetime consisting of swirling wave functions.

3. Magnetism exists everywhere spacetime exists, as was shown by the finding that Maxwellʼs electromagnetism equations are related to Einsteinʼs general relativity; this being supportive of the PSTʼs assertion that magnetism is a necessary derivative of spacetime wherever spacetime exists.

4. Spacetime can involve the generation of quantum mass, but only when sufficient energy is available; this being supportive of the PSTʼs concept of phases of spacetime.

Unexpected observations exist that are supportive of four other components of the PST:

1. Consistently different rates of expansion related to the degree to which mass was involved in their measurement, despite this not being predicted by other theories.

2. Probability field halos demonstrate momentum and the ability to remain intact when separated from the mass they originally surrounded.

3. Early universe supermassive black holes exist without being dependent on developmental mechanisms lacking empirical support.

4. What has been termed “dark matter” is interactive with itself.

The PST also incorporated consistent empirical findings not included in other cosmological theories:

1. Charge and spin can exist independently from the particles to which we usually see them attached.

2. Glueballs exist.

The PST offers more than just re-conceptualizations and explanations of various cosmological phenomena. The theory also offers unique predictions about what future research should find to be true. The PST therefore fulfills both requirements of a useful theory: empirically supported explanations of phenomena and testable predictions.

The authors extend one final summation of the essence of the probabilistic spacetime theory. We are not just stardust. More fundamentally, we are all spacetime.

Acknowledgements

The authors declare that they have no competing interests.

References

  1. ATLAS, CERN (2021). ATLAS Supersymmetry Public Results. Retrieved 2021-01-08.   
  2. CMS, CERN (2021). CMS Supersymmetry Public Results. Retrieved 2021-01-08.   
  3. Wolchover, N. The best explanation for everything in the universe. The Atlantic. 2017. Retrieved 2021-01-08.   
  4. Muxin, H. Cosmological constant in loop quantum gravity vertex amplitude. Phys Rev D. 2011; 84(6): 064010.   
  5. Fairbairn, W. J. & Meusburger, C. q-Deformation of Lorentzian spin foam models. General Relativity and Quantum Cosmology. 2011.   
  6. Bucklin, S. M. A history of dark matter. ARS Technica. 2017; Retrieved 30 July 2021.   
  7. Copi, C.J., Schramm, D.N., Turner, M.S. Big-bang nucleosynthesis and the baryon density of the universe. Science. 1995; 267(5195): 192–199. doi: 10.1126/science.7809624   
  8. Di Luzio L., Nardi E., Giannotti M. et al. The landscape of QCD axion models. Physics Reports. 2020; 870: 1–117   
  9. Gibney E. Last chance for WIMPs: Physicists launch all-out hunt for dark-matter candidate. Nature. 2020; 586: 344-345. doi: 10.1038/d41586-020-02741-3   
  10. Bertone G., Hooper D., Silk, J. Particle dark matter: Evidence, candidates and constraints. Physics Reports. 2005; 405(5–6): 279–390. doi: 10.1016/j.physrep.2004.08.031   
  11. Vernstrom T., Heald G., Vazza F., et al. Discovery of magnetic fields along stacked cosmic filaments as revealed by radio and x-ray emission. MNRAS. 2021; 505(3): 4178–4196. doi: 10.1093/mnras/stab1301
  12. Martens N. C. M., Lehmkuhl, D. Dark matter = modified gravity? Scrutinising the spacetime-matter distinction through the modified gravity/dark matter lens. Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics. 2020; 72: 237-250. doi: 10.1016/j.shpsb.2020.08.003   
  13. Berezhiani L., Khoury J. Theory of dark matter superfluidity. Phys Rev D. 2015; 92(10): 101103. doi: 10.1103/PhysRevD.92.103510   
  14. Doren D. M., Harasymiw J. Resolving the Hubble constant discrepancy: Revisiting the effect of local environments. Int J Cosmol Astron Astrophys. 2020; 2(1): 94-96. doi: 10.18689/ijcaa-1000121   
  15. Yang J., Wang, F., Fan X. et al. Poniua’ena: A luminous z = 7.5 quasar hosting a 1.5 billion solar mass black hole. Astrophys J Lett. 2020; 897(1): L14. doi: 10.3847/2041-8213/ab9c26   
  16. Wang F., Yang J., Fan X. et al. A Luminous Quasar at redshift 7.642. Astrophys J Lett. 2021; 907(1). doi: 10.3847/2041-8213/abd8c6   
  17. Feng W. X., Yu H. B., Zhong Y. M. Seeding supermassive black holes with self-interacting dark matter: A unified scenario with baryons. Astrophys J Lett. 2021; 914(2): L26. doi: 10.3847/2041-8213/ac04b0
  18. Agarwal B., Dalla Vecchia C., Johnson J. L. et al. The first billion years project: Birthplaces of direct collapse black holes. Monthly Notices of the Royal Astronomical Society. 2014; 443(1): 648–657. doi: 10.1093/mnras/stu1112
  19. Habouzit M., Volonteri M., Latif M. et al. On the number density of ‘direct collapse’ black hole seeds. Monthly Notices of the Royal Astronomical Society. 2016; 463(1): 529–540. doi: 10.1093/mnras/stw1924
  20. Pacucci F., Ferrara A., Grazian A. et al. First identification of direct collapse black hole candidates in the early universe in CANDELS/GOODS-S. Monthly Notices of the Royal Astronomical Society. 2016; 459(2): 1432–1439. doi: 10.1093/mnras/stw725   
  21. Basu S., Das A. The mass function of supermassive black holes in the direct-collapse scenario. Astrophys J Lett. 2019; 879(1): L3. doi: 10.3847/2041-8213/ab2646   
  22. Wang P., Libeskind N. I., Tempel E., et al. Possible observational evidence for cosmic filament spin. Nat Astron. 2021; 5: 839-845. doi: 10.1038/s41550-021-01380-6
  23. Govoni F., Orrù E., Bonafede A., et al. A radio ridge connecting two galaxy clusters in a filament of the cosmic web. Science. 2019; 364(6444): 981-984. doi: 10.1126/science.aat7500   
  24. Lindgren J., Liukkonen J. Maxwell’s equations from spacetime geometry and the role of Weyl curvature. J Phys: Conference Series. 2021; 1956 012017. doi: 10.1088/1742-6596/1956/1/012017   
  25. Harrison E. H. Magnetic Fields in the Early Universe. Monthly Notices of the Royal Astronomical Society. 1973; 165(2): 185-200. doi: 10.1093/mnras/165.2.185   
  26. Czajka P., Gao T., Hirschberger M. et al. Oscillations of the thermal conductivity in the spin-liquid state of α-RuCl3. Nature Phys. 2021; 17: 915-919.   
  27. Kim B., Koh H., Rotenberg E. et al. Distinct spinon and holon dispersions in photoemission spectral functions from one-dimensional SrCuO2. Nature Phys. 2006; 2: 397–401.
  28. Ruan W., Chen Y., Tang S. et al. Evidence for quantum spin liquid behaviour in a single-layer 1T-TaSe2 from scanning tunnelling microscopy. Nature Phys. 2021; 17: 1154–1161.   
  29. Yoon T. H., Cho M. Quantitative complementarity of wave-particle duality. Science Advances. 2021; 7(34). doi: 10.1126/sciadv.abi9268
  30. Yau, H. Y. Quantum theory from a space-time wave. arxiv: 0706.0190. 2007.   
  31. Wheeler J. A., Ford K. W. Geons, black holes, and quantum foam: A life in physics. W. W. Norton & Company, New York. 1998.   
  32. Silverberg, L.M. & Eischen, J.W. On a new field theory formulation and a space-time adjustment that predict the same precession of Mercury and the same bending of light as general relativity. Physics Essays. 2020; 33(4): 489-512. doi: 10.4006/0836-1398-33.4.489   
  33. Brünner F., Rebhan A. Nonchiral enhancement of scalar glueball decay in the Witten-Sakai-Sugimoto model. Phys Rev Lett. 2015; 115(13). doi: 10.1103/PhysRevLett.115.131601   
  34. Combi L., Romero G.E. Is teleparallel gravity really equivalent to general relativity? Annalen der Physik. 2017; 530(1). doi: 10.1002/andp.201700175
  35. Aygün S., Baysal H., Aktas C. et al. Teleparallel energy-momentum distribution of various black hole and wormhole metrics. Int J Mod Phy A. 2018; 33(30): 1850184. doi: 10.1142/S0217751X18501841   
  36. Lehmkuhl D. Mass-energy-momentum: Only there because of spacetime? Brit J Phil Sci. 2011; 62(3): 453-488.   
  37. Migdal A.B. Superfluidity and the moments of inertia of nuclei. Nucl Phys. 1959; 13(5): 655-674. doi: 10.1016/0029-5582(59)90264-0   
  38. Massey R., Williams L., Smit R. The behaviour of dark matter associated with four bright cluster galaxies in the 10 kpc core of Abell 3827. Monthly Notices of the Royal Astronomical Society. 2015; 449(4): 3393–3406. doi: 10.1093/mnras/stv467   
  39. Hupp E., Roy S., Watzke M. NASA finds direct proof of dark matter. NASA press release. 2006.   
  40. Jenner L., Dunbar B. Dark matter core defies explanation. NASA press release. 2012.   
  41. Harvey D., Massey R., Kitching T. et al. The non-gravitational interactions of dark matter in colliding galaxy clusters. Science. 2015; 347(6229): 1462-1465. doi: 10.1126/science.1261381   
  42. Wikipedia; Observable universe. Retrieved 27 Aug 2021.   
  43. Dawson K., Percival W. No need to Mind the Gap: Astrophysicists fill in 11 billion years of our universe’s expansion history. Sloan Digital Sky Survey Press Release. 2020.   
  44. Schlickeiser R. Cosmic magnetization: From spontaneously emitted aperiodic turbulent to ordered equipartition fields. Phys Rev Lett. 2012; 109(26). doi: 10.1103/PhysRevLett.109.261101   
  45. Steinpilz T., Joeris K., Jungmann F. et al. Electrical charging overcomes the bouncing barrier in planet formation. Nat Phys. 2020; 16: 225–229.   
  46. Steinpilz T., Musiolik G., Kruss M. et al. ARISE: A granular matter experiment on the International Space Station. Rev Sci Instrum. 2019; 90, 104503. doi: 10.1063/1.5095213   
  47. Jedamzik K., Pogosian L. Relieving the Hubble tension with primordial magnetic fields. Phys Rev Lett. 2020; 125, 181302. doi: 10.1103/PhysRevLett.125.181302   
  48. Freedman W. Measurements of the Hubble constant: Tensions in perspective. Astrophys J. 2021; 919(1). doi: 10.3847/1538-4357/ac0e95   
  49. Bhardwaj A., Copeland E. J., Louko J. Inflation in loop quantum cosmology. Phys Rev D. 2019; 99(10): 1103.   
  50. Ashtekar A. Singularity resolution in loop quantum cosmology: A brief overview. J Phys Conference Series. 2009; 189, 012003.   
  51. Rovelli C., Smolin L. Loop space representation of quantum general relativity. Nucl Phys. B. 1990; 331(1): 80-152. doi: 10.1016/0550-3213(90)90019-A