I was on vacation last week. As soon as I got back, the first thing I did was fall off my bike onto a tree stump, breaking my wrist. I’ll be okay, but I won’t be typing a lot. This post is being dictated to software; I hope I don’t have to do too much editing. I let the software generate the image above based on the prompt “dark matter properties illustrated” and I don’t think we should hold our breath for AI to help us out with this.
There were some good questions on the last post that I didn’t get to address. I went back and tried to answer some of them. Siriusactuary asked about the properties required of dark matter for galaxies vs. large scale structure. That’s a very deep question that requires a long answer with some historical perspective. Please bear with me as I attempt a quasi-coherent, off-the-cuff narrative that doesn’t require a lot of editing, which it surely will.
I thought about this long and hard when I first encountered the problem. Which was almost thirty years ago now. So it is probably worth a short refresher.
We have been assuming all along, I think reasonably, that cosmological dark matter and galaxy dark matter are the same stuff, just different manifestations of the same problem. Perhaps they’re not, but there is a huge range of systems that show acceleration discrepancies, and it isn’t always trivial to split them into one camp or another. It seems common to talk about large and small scale problems, but I don’t think size is the right way to think about it. It’s more a difference between gravitationally bound systems that are in equilibrium and the dynamics of the expanding universe as an evolving entity that contains structures that develop within it.
The problem in bound systems is not just galaxy dynamics. It’s also clusters of galaxies. It’s also star clusters, which don’t show a discrepancy. The problem extends over a dynamic range of at least a billion in baryonic mass. It involves all sorts of dynamical questions where we do sometimes need to invoke dark matter or MOND or whatever. The evidence in bound systems is inevitably that when we apply the law of gravity as we know it to the stuff we can see, the visible baryons, then the dynamical mass doesn’t add up. We need something extra to explain the data.
The simple answer early on was that there was simply more mass there, i.e., dark matter. But that much is ambiguous. It could be that we infer the need for dark matter because the equations are inadequate and need to be generalized, i.e., something like MOND. But to start, at the beginning of the dark matter paradigm, there was no particular restriction on what the dark matter needed to be or what its properties needed to be. It could be baryonic, it could be non-baryonic. It could be black holes, brown dwarfs, all manner of things.
From a cosmological perspective, it became apparent in the early 1980s that we needed something extra – not just dark, but non-baryonic. By this time it was easy to believe because people like Vera Rubin and Albert Bosma had already established that we needed more than meets the eye in galaxies. So dark matter was no longer a radical hypothesis, which it had been in 1970. The paradigm kinda snowballed – it had been around as a possibility since the 1930s, but it was only in the 1970s that it became firmly established dynamically. Even then the discrepancy was only a factor of two or so, and could conceivably be normal, if hard to see, baryons like brown dwarfs. By the early 1980s it was clear we needed more like a factor of ten, and it had to be something new: the cosmological constraint was that the gravitating mass density is greater than the baryon density allowed by big bang nucleosynthesis. That means that there is a requirement on the nature of dark matter beyond there just being more mass.
The cosmic dark matter has to be something non-baryonic. That is to say, it has to be some new kind of beast, presumably some kind of particle that is not already in the standard model of particle physics. This was received with eagerness by particle physicists, who felt that their standard model was complete and yet unsatisfactory, and that there should be something deeper and more to it. This was an indication in that direction. From a cosmological perspective, the key fact was that there was something more out there than met the eye. Gravitation gave a mass density that was higher than allowed for normal matter. Not only did you need dark matter, but you needed some kind of novel particle that’s not in the standard model of particle physics to be that dark matter.
The other cosmological imperative was to grow large scale structure. The initial condition that we see in the early universe is very smooth. That is the microwave background on the sky, with its very small temperature fluctuations, only one part in a hundred thousand. By redshift zero, those fluctuations have grown into collapsed structures: growth by a factor of a hundred thousand. Normal gravity will grow structure at a rate that is proportional to the rate at which the universe expands, which is basically a factor of a thousand since the microwave background was imprinted.
So we have another big discrepancy. We can only grow structure by a factor of a thousand, but we observe that it has grown by a factor of a hundred thousand. So we need something to goose the process. That something can be dark matter, provided that it does not interact with photons directly. It can be a form of particle that does not interact via the electromagnetic force. It can interact through gravity and perhaps through the weak nuclear force, but not through the electromagnetic force.
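The arithmetic of this growth discrepancy is simple enough to sketch in a few lines (a back-of-the-envelope with illustrative round numbers, not a calculation from the post):

```python
# Back-of-the-envelope structure-growth arithmetic (illustrative numbers).
z_cmb = 1100                # redshift at which the microwave background was imprinted
linear_growth = 1 + z_cmb   # baryons alone: growth tracks the expansion factor

delta_initial = 1e-5        # CMB temperature fluctuations: one part in a hundred thousand
delta_today = 1.0           # order unity today: structure has collapsed
needed_growth = delta_today / delta_initial

shortfall = needed_growth / linear_growth
print(f"growth available without dark matter: ~{linear_growth:.0f}x")
print(f"growth needed by redshift zero:       ~{needed_growth:.0f}x")
print(f"shortfall:                            ~{shortfall:.0f}x")
```

Cold dark matter closes this roughly factor-of-a-hundred gap because its perturbations can start growing before recombination, while the baryons are still pinned to the photons.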
Those are properties that are required of dark matter by cosmology. It has to be non-baryonic and not interact through electromagnetism. These properties are not necessary for galaxies. And that’s basically the picture that persists today. One additional constraint that we need from a cosmological perspective is that the dark matter needs to be slow-moving – dynamically cold so that structure can form. If you make it dynamically hot, like neutrinos that are born moving at very nearly the speed of light, those are not going to clump up and form structure even if they have a little mass.
So that was the origin of the cold dark matter paradigm. We needed some form of completely novel particle that had the right relic density – this is where the wimp miracle comes in. That worked fine for galaxies at the time. All you needed for galaxies early on was extra mass. It was cosmology that gave us these extra indications of what the dark matter needs to be.
We’ve learned a lot more about galaxies since then. I remember in the early nineties when I was still a staunch proponent of cold dark matter being approached at conferences by eminent dynamicists who confided in hushed tones so that the cosmologists wouldn’t hear that they thought the dark matter had to be baryonic, not non-baryonic.
I had come to this from the cosmological perspective that I had just described above. The total mass density had to be a lot bigger than the baryonic mass density. Therefore the dark matter had to be non-baryonic. To say otherwise was crazy talk, which is why they were speaking about it in hushed tones. But here were these very eminent people who were very quietly suggesting to me that their work on galaxies suggested that the dark matter had to be made of baryons, not something non-baryonic. I asked why, and basically it boiled down to the fact that they could see clear connections between the dynamics and the baryons. It didn’t suffice just to have extra mass; the dark and luminous components seemed to know about each other*.
The data for galaxies showed that the stuff we could see, the distribution of stars and gas, was clearly and intimately related to the total distribution of mass, including the dark matter. This led to a number of ideas that do not sit well with the cold dark matter paradigm. One was HI scaling: basically, if you took the distribution of atomic gas and scaled it up by a factor of roughly 10, then that was a decent predictor of what the dark matter was doing. Given that, one could imagine that maybe the dark matter was some form of unseen baryons that follow the same distribution as the atomic gas. There was even an elaborate paradigm built up around very cold molecular gas to do this. That seemed problematic to me, because if you have cold molecular gas, it should clump up and form stars, and then you’d see it. Even if you didn’t see it in its cold form, you’d need a lot of it. Interestingly, you do not violate the BBN baryon density if the extra baryons reside only in galaxies. But you would on a cosmic scale, if that were the only form of dark matter. So then we need multiple forms of dark matter, which violates parsimony.
Another important and frequent point is the concept of maximum disk. This came up last time in the case of NGC 1277, where the dynamics of the inner regions of that galaxy are completely explained by the stars that you see. This is a very common occurrence in high surface brightness galaxies. In regions where the stars are dense, that’s all the mass that you need. It’s only when you get out to a much larger radius, where the accelerations become low, that you need something extra: the dark matter effect.
It was pretty clear and widely accepted that the inner regions of many bright galaxies were star dominated. You did not need much dark matter in the center, only at the edges. So you had this picture of a pseudoisothermal halo with a low density central core. But by the mid-nineties, a lot of simulations showed that cold dark matter halos should have cusps: they predicted there to be a lot of dark matter near the centers of galaxies.
This contradicted the picture that had been established. And so people got into big arguments as to whether or not high-surface brightness galaxies were indeed maximal. The people who actually worked on galaxies said Yes, we have established that they are maximal – we only need stars in the central regions; the dark matter only becomes necessary farther out. People who were coming at it from the cosmological perspective without having worked on individual galaxies saw the results of the simulations, saw that there’s always a little room to trade off between the stellar mass and the dark mass by adjusting the mass to light ratio of the stars, and said galaxies cannot be maximal.
I was perplexed by this contradiction. You had a strong line of evidence that galaxies were maximal in their centers. You had a completely different line of evidence, a top-down cosmological view of galaxies, that said galaxies should not and could not be maximal in their centers. Which of those interpretations you believed seemed to depend on which camp you came out of.
I came out of both camps. I was working on low surface brightness galaxies at the time and was hopeful that they would help to resolve the issue. Instead they made it worse, sticking us with a fine-tuning problem. I could not solve this fine-tuning problem. It caused me many headaches. It was only after I had suffered those headaches that I began to worry about the dark matter paradigm. And then by chance, I heard a talk by this guy Milgrom who, in a few lines on the board, derived as a prediction all of the things that I was finding problematic to interpret in terms of dark matter. Basically, a model with dark matter has to look like MOND to satisfy the data.
That’s just silly, isn’t it?
MOND made predictions. Those predictions came true. What am I supposed to report? That its predictions came true – therefore it’s wrong?
I had made my own prediction based on dark matter. It failed. Other people had different predictions based on dark matter. Those also did not come true. Milgrom was the only one to correctly predict ahead of time what low surface brightness galaxies would do.
If we insist on dark matter, what this means is that we need, for each and every galaxy, precisely the distribution of dark matter that makes it look like MOND. I wrote the equation for the required effects of dark matter in all generality in McGaugh (2004). The improvements in the data over the subsequent decade enabled this to be abbreviated to
gDM = gbar / (e^√(gbar/a0) − 1).
This is in McGaugh et al. (2016), which is a well known paper (being in the top percentile of citation rates). So this should be well known, but the implication seems not to be, so let’s talk it through. gDM is the force per unit mass provided by the dark matter halo of a galaxy. This is related to the mass distribution of the dark matter – its radial density profile – through the Poisson equation. The dark matter distribution is entirely stipulated by the mass distribution of the baryons, represented here by gbar. That’s the only variable on the right hand side, a0 being Milgrom’s acceleration constant. So the distribution of what you see specifies the distribution of what you can’t.
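To make this concrete, here is a minimal numerical sketch of the equation above (my own illustrative code, not from the paper; a0 ≈ 1.2e-10 m/s² is the usual value of Milgrom’s constant):

```python
import math

a0 = 1.2e-10  # Milgrom's acceleration constant in m/s^2

def g_dm(g_bar):
    """Dark matter contribution required by the data:
    gDM = gbar / (exp(sqrt(gbar/a0)) - 1)."""
    x = math.sqrt(g_bar / a0)
    return g_bar / math.expm1(x)  # expm1 avoids round-off when x is small

def g_obs(g_bar):
    """Total acceleration: baryons plus the implied dark matter."""
    return g_bar + g_dm(g_bar)

# High accelerations (inner high surface brightness galaxy): stars suffice.
print(g_dm(1e-8) / 1e-8)                      # tiny: essentially no dark matter needed
# Low accelerations (outer disk, low surface brightness galaxy):
print(g_obs(1e-13) / math.sqrt(1e-13 * a0))   # close to 1: the deep-MOND limit
```

The two limits are the point: where gbar >> a0 the implied dark matter vanishes (maximum disk), and where gbar << a0 the total acceleration tends to √(gbar·a0), which is exactly MONDian behavior.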
This is not what we expect for dark matter. It’s not what naturally happens in any reasonable model; what naturally happens is an NFW halo. That comes from dark matter-only simulations; it has literally nothing to do with gbar. So there is a big chasm to bridge right from the start: theory and observation are speaking different languages. Many dark matter models don’t specify gbar, let alone satisfy this constraint. Those that do only do so crudely – the baryons are hard to model. Still, dark matter is flexible; we have the freedom to make it work out to whatever distribution we need. But in the end, the best a dark matter model can hope to do is crudely mimic what MOND predicted in advance. If it doesn’t do that, it can be excluded. Even if it does do that, should we be impressed by a theory that only survives by mimicking its competitor?
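For contrast, here is a minimal sketch of the acceleration an NFW halo supplies (the halo parameters below are made up for illustration): the only inputs are the halo’s characteristic density and scale radius; gbar appears nowhere.

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
kpc = 3.086e19  # meters per kiloparsec

def g_nfw(r, rho_s, r_s):
    """Acceleration from an NFW halo: g = G*M(<r)/r^2, where
    M(<r) = 4*pi*rho_s*r_s^3 * [ln(1+x) - x/(1+x)] with x = r/r_s.
    Only halo parameters appear; the baryon distribution does not."""
    x = r / r_s
    m_enc = 4 * math.pi * rho_s * r_s**3 * (math.log1p(x) - x / (1 + x))
    return G * m_enc / r**2

# Made-up but galaxy-scale parameters:
rho_s = 1e-21   # characteristic density, kg/m^3
r_s = 20 * kpc  # scale radius
for r in (5 * kpc, 20 * kpc, 80 * kpc):
    print(f"r = {r/kpc:4.0f} kpc  g = {g_nfw(r, rho_s, r_s):.2e} m/s^2")
```

The acceleration keeps rising toward the center (the cusp), and nothing in the formula knows about the stars and gas; any connection to gbar has to be inserted by hand.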
The observed MONDian behavior makes no sense whatsoever in terms of the cosmological constraints in which the dark matter has to be non-baryonic and not interact directly with the baryons. The equation above implies that any dark matter must interact very closely with the baryons – a fact that is very much in the spirit of what the earlier dynamicists had found, that the baryons and the dynamics are intimately connected. If you know the distribution of the baryons that you can see, you can predict what the distribution of the unseen stuff has to be.
And so that’s the property that galaxies require that is pretty much orthogonal to the cosmic requirements. There needs to be something about the nature of dark matter that always gives you MONDian behavior in galaxies. Being cold and non-interacting doesn’t do that. Instead, galaxy phenomenology suggests that there is a direct connection – some sort of direct interaction – between dark matter and baryons. That direct interaction is anathema to most ideas about dark matter, because if there’s a direct interaction between dark matter and baryons, it should be really easy to detect dark matter. They’re out there interacting all the time.
There have been a lot of half solutions. These include things like warm dark matter and self interacting dark matter and fuzzy dark matter. These are ideas that have been motivated by galaxy properties. But to my mind, they are the wrong properties. They are trying to create a central density core in the dark matter halo. That is at best a partial solution that ignores the detailed distribution that is written above. The inference of a core instead of a cusp in the dark matter profile is just a symptom. The underlying disease is that the data look like MOND.
MONDian phenomenology is a much higher standard for a dark matter model to match than a simple cored halo profile. We should be honest with ourselves that mimicking MOND is what we’re trying to achieve. Most workers do not acknowledge that, nor even seem aware that this is the underlying issue.

There are some ideas that try to build in the required MONDian behavior while also satisfying the desires of cosmology. One is Blanchet’s dipolar dark matter. He imagined a polarizable dark medium that reacts to the distribution of baryons so as to give the distribution of dark matter that produces MOND-like dynamics. Similarly, Khoury’s idea of superfluid dark matter does something related. It has a superfluid core in which you get MOND-like behavior. At larger scales it transitions to a non-superfluid mode, where it is just particle dark matter that reproduces the required behavior on cosmic scales.
I don’t find any of these models completely satisfactory. It’s clearly a hard thing to do. You’re trying to mash up two very different sets of requirements. With these exceptions, the galaxy-motivated requirement that there is some physical aspect of dark matter that somehow knows about the distribution of baryons and organizes itself appropriately is not being used to inform the construction of dark matter models. The people who do that work seem to be very knowledgeable about cosmological constraints, but their knowledge of galaxy dynamics seems to begin and end with the statement that rotation curves are flat and therefore we need dark matter. That sufficed 40 years ago, but we’ve learned a lot since then. It’s not good enough just to have extra mass. That doesn’t cut it.
So in summary, we have two very different requirements on the dark matter. From a cosmological perspective, we need it to be dynamically cold: something non-baryonic that does not interact with photons or easily with baryons.
From a galactic perspective, we need something that knows intimately about what the baryons are doing. And when one does one thing, the other does a corresponding thing that always adds up to looking like MOND. If it doesn’t add up to looking like MOND, then it’s wrong.
So that’s where we’re at right now. These two requirements are both imperative – and contradictory.
* There is a knee-jerk response to say “mass tells light where to go” that sounds wise but is actually stupid. This is a form of misdirection that gives the illusion of deep thought without the bother of actually engaging in it.
Your dictation software is pretty good. Apart from “Bosma” being converted into “Bob smile” I didn’t notice any typos.
my editing is very thorough
but yes – the software has gotten pretty good. there is still room for improvement, but this didn’t work at all five years ago. as you might expect, it struggles most with obscure technical terms.
Even though everything has actually already been said, I always enjoy reading your blog posts. Most interesting are the comments of the galaxy dynamicists: dark matter must be baryonic… in a whisper.
This reminds me of a lecture by Klaus von Klitzing in Göttingen. At one point, he said words to this effect: the measured values of all groups converge after a certain time to a certain value… and certain error bars.
Also with the quantum Hall effect there is a human factor…
Get well soon
Stefan
Fascinating post!! I think your summary has the earmarks of a very happy thought! 😉
It seems like we have to take steps when describing galactic dark matter in order to not localize it in time. While for the cosmic description, we have to take steps to not localize it in space.
Quite a mystery!
Hey there! I have been following your posts for a long time. How robust and tested is the g(dm)-g(radial) relationship? I mean, imagine a DM adept tries to falsify your claim. Is that relationship confirmed in every data set of galaxies?
The RAR is very robust (see https://tritonstation.com/2022/02/18/a-brief-history-of-the-radial-acceleration-relation/) and has been confirmed many times (e.g., https://ui.adsabs.harvard.edu/abs/2023arXiv230311314D/abstract).
It is impossible to confirm it for every galaxy in the sky -there are too many of them- so it is always conceivable that we’ll find some that don’t. So far that hasn’t happened. Indeed, quite the contrary – I’ve spent the last 25 years looking for exceptions, specifically pushing the boundaries of knowledge in terms of galaxy mass, surface brightness, gas fraction, etc. Every time I find an object I expect should deviate, it declines to do so. This is not a new fad that can vanish overnight.
There are always a few outlying trees in astronomy, but these have so far inevitably been attributable to poor data quality (https://arxiv.org/abs/1909.02011). The RAR is the forest. I have, in good faith, debunked many dozens, perhaps at this point hundreds, of claims of deviants; at this point I’m more interested in understanding the forest than chasing the latest claim that “oooo, THIS weird galaxy is a little off!”
That is how many claims of falsification go: this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, this galaxy conforms, THIS ONE IS A LITTLE BIT OFF, MOND IS FALSIFIED!!!!!!
Have you thought of using ML and AI to search for deviations in known galaxy data? It is not likely a job for a human, but maybe with the new forthcoming era in Big Data Astronomy, we could use machines to search for RAR deviations (if any!). I don’t know if it can be done right now… Perhaps the data for galaxies are too dispersed to search for deviations one by one. I believe, sadly, machines and scientific AI will be better at searching for those types of patterns than we are in the future.
the paper by desmond i cite above essentially does that.
As a teenager, I heard that solving a 5th-degree equation was impossible.
I found that interesting. What happens when moving from 4 to 5?
Why only at 5? Why not at 17, or at 385, which is also a nice number?
The proof is done with Galois theory and is one of the nicest I know.
The central part is the group theory. Sets that have objects and an operation.
There is a neutral object and an inverse object to each.
I wondered what happens when you add a second neutral element to the set.
How do you have to adjust the rules to avoid contradictions?
It works but doesn’t bring anything new, really exciting. It was worth a try.
Many enemies, much honor.
Stefan
A propos NGC 1277: Anton Petrov, the YouTuber who based yet another death sentence for MOND on the recent paper by Comeron et al. on that galaxy, although the authors themselves explicitly stated that their data could not be used for any kind of verdict on MOND (https://www.youtube.com/watch?v=OJltKxavCk8), has now released yet another video entitled “Is Newtonian Gravity Broken? Evidence for MOND? Or More Bad Science?” (https://www.youtube.com/watch?v=XHcTtTB2Iho). Here, he juxtaposes an image of the recent article by Chae (the Korean scientist who has published data indicating MOND-like behaviour in wide binaries) with one of the widely debunked report about the LK-99 purported superconductor, whose authors are also Korean – no other connection whatsoever, just a naked attempt to smear a scientist for not coming to the conclusions Petrov likes.
I know this is not coming out of the academic sphere, but a lot of opinions get informed by content creators on YouTube, and here there seems to be outright propaganda going on, and any dirty trick goes. Maybe it is connected to the seemingly increasing desperation in the DM camp – dark stars, doubling the age of the universe, anything as long as DM is spared…
Recently, Matt O’Dowd on PBS SpaceTime, who generally has an excellent reputation, has released a video on dark stars (https://www.youtube.com/watch?v=zUhOL38346Y) which takes DM-saving assumption-stacking and parameter fine-tuning to the next level, only to flippantly dismiss modified gravity in the comments section…
Yes, there is active disinformation. This isn’t new; influencers like Ethan Siegel and Sean Carroll have been talking shit in ignorance for years stretching into decades. There is no more point in engaging with this than with any other troll on social media. Dark matter won the election; inconvenient facts don’t matter.
Dark Matter will remain a problem only as long as the top-level science community keeps projecting it as a problem.
I’ve seen a comment on a science article about Chae’s article proposing that objects from other universes (assuming the multiverse) affect the dynamics of the stars we see.
I think we’re almost over the top of people trying to do their utmost best to avoid accepting MOND. It’s not hard to see the numerous even greater problems that loom over the future of the aforementioned approach.
Agree to that to an extent, but also feel that people tend to let their imaginations run a bit too wild when thinking about concepts that are poorly defined.
There might be some highly restrictive description of a multiverse that I could fit in the old noodle, but it would have to be something like a virtual multiverse for describing observer effects, or for something much more mundane than what most people imagine.
Although the MOND/DM issue is clearly the most important problem in physics today, the astrophysics community does not seem to be devoting the appropriate effort to definitively answering this question experimentally. Although a bunch of experimental tests have been proposed (e.g., arXiv:2306.15939), I assume most are just fantasies. Your responses to my previous queries about experimental tests have been primarily negative. But I would hope that, with enough money, there is a reasonable measurement that, if accurate and long-term enough, could provide a definitive answer. With CERN proposing to spend something like $20 billion on a new accelerator with no idea if there is even anything new that it will find, devoting a fraction of this to the MOND/DM question seems like a no brainer. If, say, NASA were to announce $2 billion for such a project, couldn’t the DM and MOND astrophysics community agree on a promising proposal?
The only plausible experiment (vs observation) that I’m aware of is to measure the dynamics of test masses at a saddle point near Neptune (see https://www.mdpi.com/2073-8994/14/7/1331, section 11.5.1). Only Voyager 2 has visited that planet, and if we ever send something that far, it seems there is a lot of predictably interesting science to do. You’re going to compete with many other teams for a spot on the rocket. So, for this to happen you need enough “critical motivation mass”. We’re probably still pretty far from that.
CERN will ask for $20B to do what they do with a straight face, but those same people will go on an unhinged rant about wasting taxpayer money if you suggest spending even one dime on MOND. I’ve seen it happen, even within a collaboration I co-founded on hybrid DM models.
So, no, the DM community will never agree on such a proposal. Fortunately, the experiments you’d like to do to test MOND are not relevant to them. DM folks want bigger underground xenon tanks and bigger colliders. To test MOND, you want to send some ballistic test particles far from the sun, well beyond Pluto. That would be an obvious (and relatively cheap) add-on package to deep space probes the solar system people would like to launch.
Great patience would be required.
@Stacy
You don’t believe in MOND effects showing in a saddle point? Or is it just too hard to detect vs going beyond Pluto and seeing the probe deviate?
Saddle point effects were an intriguing prediction of TeVeS, which is now falsified. Perhaps it was on the right track anyway, so would be good to check. Tried to make predictions for what LISA would see if properly directed through one, but the effects were disappointingly small for viable interpolation functions. So, could be, but doesn’t seem promising.
“From a cosmological perspective, it became apparent in the early 1980s that we needed something extra – not just dark, but non-baryonic. By this time it was easy to believe because people like Vera Rubin and Albert Bosma had already established that we needed more than meets the eye in galaxies. So dark matter was no longer a radical hypothesis, which it had been in 1970. The paradigm kinda snowballed – it had been around as a possibility since the 1930s, but it was only in the 1970s that it became firmly established dynamically.”
I suspect that the current domination of the dark matter paradigm over MOND in cosmology is in part a historical coincidence, in which MOND was developed after cosmologists had incorporated dark matter into their standard concordance model. If Milgrom or another physicist had developed MOND a decade earlier, around 1973, before the cosmologists got to dark matter, then there would have been two equally radical (and equally valid for the data available at the time) proposals for cosmologists to explain the cosmological and galactic deviations from vanilla general relativity + the FLRW metric. In such a case, it is quite possible that the dark matter vs MOND debate would have been more evenly balanced in cosmology and there wouldn’t be a “standard model of cosmology” – a situation similar to the Steady State vs Big Bang debate back in the 1950s and 1960s, where there also wasn’t a standard model of cosmology but rather two competing models in mainstream cosmology.
Yes, there is a distinct hysteresis to the issue.
I’m not so sure. MOND is not compatible with General Relativity the way Dark Matter is. I suspect it would be disliked in the heart, even if favored in the data.
The thing is that cosmologists have already modified gravity in the 1990s by adding the cosmological constant to general relativity. There are potentially measurable deviations from general relativity at the level of the Local Group (https://iopscience.iop.org/article/10.3847/2041-8213/ace90b). The cosmological constant was disliked at first by many people even though it was favoured in the data, but after a decade people got used to it.
So I think that something similar would have happened with MOND: initially it would be disliked by many even though it is favoured by the data, but then people would get used to it and it would just be treated as natural.
yes and no. the cosmological constant was indeed widely despised, and it took a lot for it to gain acceptance: https://tritonstation.com/2019/01/28/a-personal-recollection-of-how-we-learned-to-stop-worrying-and-love-the-lambda/ However, that transition happened simultaneously with some of MOND’s primary predictions coming true. Yet one was accepted and the other not. Why? Lots of reasons, one of which is that Lambda had been posited by the great Einstein himself (albeit for all the wrong reasons). That gave room for people to believe they weren’t breaking GR, just admitting a possibility that had seemed remote. Imagine how it might have gone had Einstein never made his greatest blunder.
https://arxiv.org/abs/2007.00082 There is recent progress on the theory side. Perhaps still a matter of opinion whether MOND is compatible with GR.
Do you think MOND would gain more acceptance, if there was a theoretical explanation for it?
would dark matter ?
that would help, but apparently it needs to also be compelling and simple to understand because the people who most demand such a development actively deride each such step as it is taken.
More generally, I think most people simply don’t want to face it, and won’t unless dark matter is falsified. somehow. until faith in that is shaken, people apparently won’t go there.
“The problem in bound systems is not just galaxy dynamics. It’s also clusters of galaxies. It’s also star clusters, which don’t show a discrepancy. The problem extends over a dynamic range of at least a billion in baryonic mass.”
I wasn’t aware of the star clusters not showing a discrepancy; I am assuming that there is an implication that they should show a discrepancy.
My admittedly naive view would be that next steps might be to further analyze clusters of galaxies and star clusters in an effort to determine the conditions under which MOND effects are seen, whether partially or entirely – or not seen at all. Easier said than done of course; it seems that the bullet cluster has been deeply analyzed without a definitive resolution (but showing tension with both DM and MOND). I don’t know what data exists for clusters, whether clusters are more or less heterogeneous than galaxies, or even how large the population of clusters is compared to the population of galaxies.
Generally speaking, a discrepancy is seen where MOND predicts it should be, and not where it shouldn’t. Dark matter makes no comparable prediction.
Rich clusters of galaxies are the persistent exception; they evince more of a discrepancy than MOND predicts. This problem is real, but it is restricted to the most massive clusters, from 1E13 to 1E14 Msun in baryonic mass. MOND works in galaxies from 1E5 to 1E12 Msun, so that is a larger dynamic range. LCDM gets the baryon fraction of the most massive (1E14) clusters right, but is already well off at 1E13 – apparently because clusters follow the mass–temperature prediction of MOND rather than that of LCDM. So one gets the intercept of the line right while the other gets its slope right.
It’s a mess.
Hello,
A question perhaps a little removed from this post, but is it certain that the data reproduced in this simulation https://arxiv.org/abs/2307.09507 are not predicted by (a version of) MOND (i.e. “challenges for”)?
Many thanks for your blog and best wishes for a speedy recovery,
Norbert
Well, their simulation doesn’t produce a simulated reality that matches the observed one as MOND empirically captures it. Where should the blame go? Hmm…
Standard Model is really a misnomer, at least until it can explain what the mysterious raisins are, and where they came from. It’s only half the story.
They say this, but look at the data. MOND comes a lot closer to it (within a factor of 2) than these sims do (off by a factor of ten). https://tritonstation.com/2021/06/28/the-rar-extended-by-weak-lensing/
I was struck by a seeming parallel between Cosmology and the field of superconductivity after reading J. E. Hirsch’s “BCS theory of superconductivity: the world’s largest Madoff scheme?”. As with Cosmology, there is a reigning theory to explain superconductivity at a microscopic level, known as BCS theory, developed in 1957 by three researchers – John Bardeen, Leon Cooper, and Robert Schrieffer – who won the Nobel Prize for their work in 1972. To be sure, Hirsch doesn’t question key features of BCS theory such as the formation of Cooper pairs, the macroscopic phase coherence of the Cooper pairs, and the formation of an energy gap when the Cooper pairs condense into a Bose fluid. Instead he’s critical of the flexing-lattice aspect (the “electron-phonon interaction”) of BCS theory. As with Cosmology, with the passage of time more and more deviations from the ‘concordance’ model of superconductivity appeared, with superconducting materials that could not be explained by BCS theory. This was especially true of the high-temperature superconductors first discovered in 1986. And just as in the dark matter paradigm, anomalies were post-dicted rather than predicted, with multiple mechanisms invoked to explain them in the BCS picture.
But on the flip side, unlike with MOND, I haven’t seen a clear-cut mathematical formulation of Hirsch’s alternative concept, the “Theory of hole superconductivity”, that captures multiple features of superconductivity the way MOND does for Cosmology. This appreciation could change if I could find a more formal paper on Hirsch’s ideas.
We (all physicists) imagine a solid body as composed of Newtonian spheres (atoms). Thermal motion is then imagined as a kind of trembling about the equilibrium position.
Although we (all physicists) know that atoms are not little spheres, this model works quite well for many things (phonons, heat transport, band structure in semiconductors, …).
But maybe superconductivity is an example where this model fails…
Get well soon! Accidents and broken bones suck!
“Get well soon” doesn’t seem quite right. Broken bones typically take about six weeks, right? (Never had one, fortunately.) And wrists are complicated.
I hope everything heals well. Be sure to take physical therapy seriously if recommended.
Thank you for your posts and the perspectives you bring to the table. Should I run across a relevant situation, portraying MOND as a constraint that must be satisfied is far more effective than being dragged into rhetoric about what to believe.
Thanks. Unfortunately, I’ve done it before, and six weeks is about right. I had to have surgery just a couple of days ago to stabilize it, so start counting from then, I guess.
Is there any combination of MOND + dark matter that succeeds? I.e., MOND for galaxies + 80% hot dark matter + 20% cold dark matter for galaxy clusters and the CMB?
Not really. I think MOND does imply a more general theory than GR; we can’t just graft these pieces together and expect it to come to life like Frankenstein’s monster. https://tritonstation.com/2023/03/06/ask-and-receive/
Would MOND + some ratio of hot dark matter + warm dark matter explain galaxy clusters, where MOND is not enough?
My brother’s wife fractured the cuboid bone in her ankle while running down a rocky trail with her daughter four years ago. I imagine the ankle has a similar complexity to the wrist. It took about six weeks for it to heal, if I remember correctly. Afterwards she was just fine, able to run, walk, hike, etc. In fact, while dragging kayaks on 2-wheel dollies a half mile from their home on Cape Cod to the beach last summer, she outpaced both my brother and me. I hope that your injury heals up as well as hers did.
Thanks. I hope so! I just don’t find myself able to do much at present.
Dear Stacy, can you elaborate a tiny bit on how dark matter solves the cosmological problem of going from perturbations of size 1/100,000 to size 1 over a period when the universe expands by a factor of only 1000?
Is it that the dark matter perturbations have actually been growing for much longer than since z~1000, so they have had time to reach a size of delta=1? I.e., dark matter perturbations have been growing since z~100,000, but the baryons were prevented from collapsing by radiation until the CMB formed at z~1000?
Thanks!
The application of general relativity requires precisely defined energy and momentum at every point in space, which is what quantum mechanics specifically forbids.
The problem of an incompatibility of quantum mechanics and general relativity, one might anticipate, is possibly connected to what Stacy has described as the orthogonal nature of the cosmic and galactic dark matter.
That dark matter may require orthogonal descriptions could be a big clue to further understanding the effect that GR-QM incompatibility has on cosmological models.
Right. So the idea is that the dark matter perturbations are already a factor of 100 bigger at z=1000, so they need only grow by a factor of 1000 since then. It looks like more in the temperature fluctuations because the baryons are tied to the photons. Dark matter doesn’t have that restriction, so it can be granted a head start.
That was critical to establishing CDM. It does leave a subtle mark on the CMB, indirectly through gravity if not directly through E&M. This effect is detected in the power spectrum as the third peak being comparable in amplitude to the second.
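As a rough back-of-the-envelope sketch of the arithmetic above (assuming linear perturbations grow in proportion to the scale factor, delta ∝ a, during matter domination, and ignoring transfer-function details – so treat the numbers as illustrative only):

```python
# Linear growth sketch: delta grows roughly as the scale factor a
# during matter domination, so delta_today ~ delta_cmb * (1 + z_cmb).

z_cmb = 1000            # redshift of last scattering
growth = 1 + z_cmb      # expansion factor since then, ~1000

delta_baryon = 1e-5     # baryon fluctuations seen in the CMB temperature
delta_dm = 1e-3         # dark matter, granted a ~100x head start by z=1000

print(delta_baryon * growth)  # ~0.01: baryons alone fall well short of delta ~ 1
print(delta_dm * growth)      # ~1: dark matter reaches nonlinearity by today
```

The point of the toy numbers: without the head start, the factor-of-1000 expansion only takes 10⁻⁵ fluctuations to 10⁻², a hundred times too small to form the structure we see.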
How do we know that the observed CMB power spectrum didn’t evolve long after the CMB was first emitted? In other words, when did the peaks get imprinted on the CMB, and how do we know that?
Last month the James Webb telescope spotted three early galaxies that some astronomers believe could actually be “Dark Stars”. A trio of astronomers wrote a paper on the possible existence of Dark Stars – stars powered by dark matter particles annihilating with one another. But that paper was written in 2007 before the parameter space of dark matter particles shrank to almost nothing. My initial reaction on hearing about this in the last few days was skepticism. But I haven’t read the paper yet, or even completely read articles on it like at space-dot-com. It’s possible there might be a more recent paper on such Dark Stars, which I haven’t come across yet. Here’s an article about this at space-dot-com: https://www.space.com/nasa-james-webb-space-telescope-stars-dark-matter
Oops, I goofed. Had I completely read the article I linked in the previous comment I would have seen that a new paper by the same three authors of the 2007 paper on Dark Stars was written just last month. I’ll be very interested to see what this paper says.
People other than the authors are very skeptical of the dark stars claim.