I had written most of the post below the line before an exchange with a senior colleague who accused me of asking us to abandon General Relativity (GR). Anyone who read the last post knows that this is the opposite of true. So how does this happen?
Much of the field is mired in bad ideas that seemed like good ideas in the 1980s. There has been some progress, but I recognize the idea that MOND is an abandonment of GR as a misconception from that time. It arose because the initial MOND hypothesis suggested modifying the law of inertia without showing a clear path to how this might be consistent with GR. GR was built on the Equivalence Principle (EP), the equivalence¹ of gravitational charge with inertial mass. The original MOND hypothesis directly contradicted that, so it was a fair concern in 1983. It was no longer a fair concern by 1984². I was still an undergraduate then, so I don’t know the sociology, but I get the impression that most of the community wrote MOND off at this point and never gave it further thought.
I guess this is why I still encounter people with this attitude, that someone is trying to rob them of GR. It feels like we’re always starting at square one, as if there has been zero progress in forty years. I hope it isn’t that bad, but I admit my patience is wearing thin.

What MOND does ask us to abandon is the Strong Equivalence Principle. Not the Weak EP, nor even the Einstein EP. Just the Strong EP. That’s a much more limited ask than abandoning all of GR. Indeed, all flavors of the EP are subject to experimental test. The Weak EP has been repeatedly validated, and there is nothing about MOND that implies platinum would fall differently from titanium. Experimental tests of the Strong EP are less favorable.
I understand that MOND seems impossible. It also keeps having its predictions come true. This combination is what makes it important. The history of science is chock full of ideas that were initially rejected as impossible or absurd, going all the way back to heliocentrism. The greater the cognitive dissonance, the more important the result.
Continuing the previous discussion of UT – the underlying theory that presumably contains both GR and MOND in the appropriate limits – where do we go from here? If we accept that maybe we have all these problems in cosmology because we’re piling on auxiliary hypotheses in order to keep approximating UT with FLRW, what now?
I don’t know.
It’s hard to accept that we don’t understand something we thought we understood. Scientists hate revisiting issues that seem settled. Feels like a waste of time. It also feels like a waste of time to keep adding epicycles to a zombie theory, be it LCDM or MOND or the phoenix universe or tired light or whatever fantasy reality you favor. So, painful as it may be, one has to find a little humility to step back and take account of what we know empirically, independent of the interpretive veneer of theory.
As I’ve said before, I think we do know that the universe is expanding and passed through an early hot phase that bequeathed us the primordial abundances of the light elements (BBN) and the relic radiation field that we observe as the cosmic microwave background (CMB). There’s a lot more to it than that, and I’m not going to attempt to recite it all here.
Still, to give one pertinent example, BBN only works if the expansion rate is as expected during the epoch of radiation domination. So whatever is going on has to converge to that early on. This is hardly surprising for UT since it was stipulated to contain GR in the relevant limit, but we don’t actually know how it does so until we work out what UT is – a tall order that we can’t expect to accomplish overnight, or even over the course of many decades without a critical mass of scientists thinking about it (and not being vilified by other scientists for doing so).
Another example is that the cosmological principle – that the universe is homogeneous and isotropic – is observed to be true in the CMB. The temperature is the same all over the sky to one part in 100,000. That’s isotropy. The temperature is tightly coupled to the density, so if the temperature is the same everywhere, so is the density. That’s homogeneity. So both of the assumptions made by the cosmological principle are corroborated by observations of the CMB.
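To put a number on “one part in 100,000”: against the mean CMB temperature of 2.725 K (the standard FIRAS value), that level of anisotropy corresponds to fluctuations of a few tens of microkelvin – which is indeed the amplitude measured. A quick sanity check of the arithmetic:

```python
# Scale of CMB temperature fluctuations implied by "one part in 100,000".
T_cmb = 2.725        # mean CMB temperature in kelvin (FIRAS value)
delta_over_T = 1e-5  # quoted anisotropy level
delta_T = T_cmb * delta_over_T
print(f"Typical fluctuation: {delta_T * 1e6:.0f} microkelvin")  # ~27 microkelvin
```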
The cosmological principle is extremely useful for solving the equations of GR as applied to the whole universe. If the universe has a uniform density on average, then the solution is straightforward (though it is rather tedious to work through to the Friedmann equation). If the universe is not homogeneous and isotropic, then it becomes a nightmare to solve the equations. One needs to know where everything was for all of time.
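For reference, the straightforward solution alluded to here is the first Friedmann equation, which follows from GR once homogeneity and isotropy are assumed (standard notation: a is the scale factor, ρ the mean density, k the curvature constant, Λ the cosmological constant):

```latex
H^2 \equiv \left(\frac{\dot{a}}{a}\right)^2
  = \frac{8\pi G}{3}\,\rho \;-\; \frac{kc^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3}
```

The entire dynamics of the universe collapses to one ordinary differential equation for a(t) – that is the payoff of the cosmological principle.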
Starting from the uniform condition of the CMB, it is straightforward to show that the assumption of homogeneity and isotropy should persist on large scales up to the present day. “Small” things like galaxies go nonlinear and collapse, but huge volumes containing billions of galaxies should remain in the linear regime, and these small-scale variations average out. One cubic gigaparsec will have the same average density as the next, and the next, so the cosmological principle continues to hold today.
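The averaging-out is just central-limit behavior. A toy sketch of the idea, with statistically independent lognormal cells standing in for small-scale structure (real density fields are correlated, but the same logic applies on scales beyond the correlation length):

```python
import numpy as np

# Toy model: large individual scatter among small "cells" (galaxy-scale lumps)
# nearly vanishes when averaged over a huge volume ("Gpc-scale box").
rng = np.random.default_rng(42)
cells = rng.lognormal(mean=0.0, sigma=1.0, size=(100, 100_000))  # 100 boxes x 1e5 cells
box_means = cells.mean(axis=1)  # average density of each big box

print(f"cell-to-cell scatter: {cells.std():.3f}")      # order unity
print(f"box-to-box scatter:   {box_means.std():.5f}")  # hundreds of times smaller
```

The box-to-box scatter shrinks as the square root of the number of cells per box, which is why gigaparsec volumes look alike even though individual galaxies are wildly nonlinear.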
Anyone spot the rub? I said homogeneity and isotropy should persist. This statement assumes GR. Perhaps it doesn’t hold in UT?
This aspect of cosmology is so deeply embedded in everything that we do in the field that it was only recently that I realized it might not hold absolutely – and I’ve been actively contemplating such a possibility for a long time. Shouldn’t have taken me so long. Felten (1984) realized right away that a MONDian universe would depart from isotropy by late times. I read that paper long ago but didn’t grasp the significance of that statement. I did absorb that in the absence of a cosmological constant (which no one believed in at the time), the universe would inevitably recollapse, regardless of what the density was. This seems like an elegant solution to the flatness/coincidence problem that obsessed cosmologists at the time. There is no special value of the mass density that provides an over/under line demarcating eternal expansion from eventual recollapse, so there is no coincidence problem. All naive MOND cosmologies share the same ultimate fate, so it doesn’t matter what we observe for the mass density.
MOND departs from isotropy for the same reason it forms structure fast: it is inherently non-linear. As well as predicting that big galaxies would form by z=10, Sanders (1998) correctly anticipated the size of the largest structures collapsing today (things like the local supercluster Laniakea) and the scale of homogeneity (a few hundred Mpc if there is a cosmological constant). Pretty much everyone who looked into it came to similar conclusions.
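The nonlinearity has a simple source. In the deep-MOND regime, the effective acceleration g is tied to the Newtonian acceleration g_N by Milgrom's relation (a₀ ≈ 1.2 × 10⁻¹⁰ m s⁻² is Milgrom's acceleration constant):

```latex
g = \sqrt{g_N\, a_0} \qquad (g \ll a_0)
```

Because g grows as the square root of the source mass rather than linearly, overdensities do not superpose, and perturbation growth departs from the familiar linear Newtonian behavior.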
But MOND and cosmology, as we know it in the absence of UT, are incompatible. Where LCDM encompasses both cosmology and the dynamics of bound systems (dark matter halos³), MOND addresses the dynamics of low acceleration systems (the most common examples being individual galaxies) but says nothing about cosmology. So how do we proceed?
For starters, we have to admit our ignorance. From there, one has to assume some expanding background – that much is well established – and ask what happens to particles responding to a MONDian force-law in this background, starting from the very nearly uniform initial condition indicated by the CMB. From that simple starting point, it turns out one can get a long way without knowing the details of the cosmic expansion history or the metric that so obsess cosmologists. These are interesting things, to be sure, but they are aspects of UT we don’t know and can manage without to some finite extent.
For one, the thermal history of the universe is pretty much the same with or without dark matter, with or without a cosmological constant. Without dark matter, structure can’t get going until after thermal decoupling (when the matter is free to diverge thermally from the temperature of the background radiation). After that happens, around z = 200, the baryons suddenly find themselves in the low acceleration regime, newly free to respond to the nonlinear force of MOND, and structure starts forming fast, with the consequences previously elaborated.
But what about the expansion history? The geometry? The big questions of cosmology?
Again, I don’t know. MOND is a dynamical theory that extends Newton. It doesn’t address these questions. Hence the need for UT.
I’ve encountered people who refuse to acknowledge⁴ that MOND gets predictions like z=10 galaxies right without a proper theory for cosmology. That attitude puts the cart before the horse. One doesn’t look for UT unless well motivated. That one was able to correctly predict, 25 years in advance, something that comes as a huge surprise to cosmologists today is the motivation. Indeed, the degree of surprise and the longevity of the prediction amplify the motivation: if this doesn’t get your attention, what possibly could?
There is no guarantee that our first attempt at UT (or our second or third or fourth) will work out. It is possible that in the search for UT, one comes up with a theory that fails to do what was successfully predicted by the more primitive theory. That just lets you know you’ve taken a wrong turn. It does not mean that a correct UT doesn’t exist, or that the initial prediction was some impossible fluke.
One candidate theory for UT is bimetric MOND. This appears to justify the assumptions made by Sanders’s early work, and provide a basis for a relativistic theory that leads to rapid structure formation. Whether it can also fit the acoustic power spectrum of the CMB as well as LCDM and AeST has yet to be seen. These things take time and effort. What they really need is a critical mass of people working on the problem – a community that enjoys the support of other scientists and funding institutions like NSF. Until we have that⁵, progress will remain grudgingly slow.
¹The equivalence of gravitational charge and inertial mass means that the m in F = GMm/d² is identically the same as the m in F = ma. Modified gravity changes the former; modified inertia the latter.
²Bekenstein & Milgrom (1984) showed how a modification of Newtonian gravity could avoid the non-conservation issues suffered by the original hypothesis of modified inertia. They also outlined a path towards a generally covariant theory that Bekenstein pursued for the rest of his life. That he never managed to obtain a completely satisfactory version is often cited as evidence that it can’t be done, since he was widely acknowledged as one of the smartest people in the field. One wonders why he persisted if, as these detractors would have us believe, the smart thing to do was not even try.
³The data for galaxies do not look like the dark matter halos predicted by LCDM.
⁴I have entirely lost patience with this attitude. If a phenomenon is correctly predicted in advance in the literature, we are obliged as scientists to take it seriously⁺. Pretending that it is not meaningful in the absence of UT is just an avoidance strategy: an excuse to ignore inconvenient facts.
⁺I’ve heard eminent scientists describe MOND’s predictive ability as “magic.” This also seems like an avoidance strategy. I, for one, do not believe in magic. That it works as well as it does – that it works at all – must be telling us something about the natural world, not the supernatural.
⁵There does exist a large and active community of astroparticle physicists trying to come up with theories for what the dark matter could be. That’s good: that’s what needs to happen, and we should exhaust all possibilities. We should do the same for new dynamical theories.