Something that Sabine Hossenfelder noted recently on Twitter resonated with me:

This is a very real problem in academia, and I don’t doubt that it is a common feature of many human endeavors. Part of it is just that people don’t know enough to know what they don’t know. That is to say, so much has been written that it can be hard to find the right reference to put any given fever dream promptly to never-ending sleep. However, that’s not the real problem.

The problem is exactly what Sabine says it is. People keep pushing ideas that have been debunked. Why let facts get in the way of a fancy idea?

There are lots of examples of this in my own experience. Indeed, I’ve encountered it so often that I’ve concluded that there is no result so obvious that some bozo won’t conclude exactly the opposite.

I spent a lot of my early career working in the context of non-baryonic dark matter. For a long time, I was enthusiastic about it, but I’ve become skeptical. I continue to work on it, just in case. But I soured on it for good reasons, reasons I have explained repeatedly in exhaustive detail. Some people appreciate this level of detail, but most do not. This is the sort of thing Sabine is talking about. People don’t engage seriously with these problems.

Maybe I’m wrong to be skeptical of dark matter? I could accept that – one cannot investigate this wide universe in which we find ourselves without sometimes coming to the wrong conclusions. Has it been demonstrated that the concerns I raised were wrong? No. Rather than grapple with the problems raised, people have simply ignored them – or worse, assert that they aren’t problems at all without demonstrating anything of the sort. Heck, I’ve even seen people take lists of problems and spin them as virtues.

To give one very quick example, consider the physical interpretation of the Tully-Fisher relation. This has varied over time, and there are many flavors. But usually it is supposed that the luminosity is set by the stellar mass, and the rotation speed by the dark matter mass. If we (reasonably) presume that the stellar mass is proportional to the dark mass, voilà – Tully-Fisher. This all sounds perfectly plausible, so most people don’t think any harder about it. No problem at all.

Well, one small problem: this explanation does not work. The velocity is not uniquely set by the dark matter halo. In the range of radii accessible to measurement, the contribution of the baryonic mass is non-negligible in high surface brightness galaxies. If that sounds a little technical, it is. One has to cope at this level to play in the sandbox.

Once we appreciate that we cannot just ignore the baryons, explaining Tully-Fisher becomes a lot harder – in particular, the absence of surface brightness residuals. Higher surface brightness galaxies should rotate faster at a given mass, but they don’t. The easy way to fix this is to suppose that the baryonic mass is indeed negligible, but this leads straight to a contradiction with the diversity of rotation curves following from the central density relation. The kinematics know about the shape of the baryonic mass distribution, not just its total. Solving all these problems simultaneously becomes a game of cosmic whack-a-mole: fixing one aspect of the problem makes another worse. All too often, people are so focused on one aspect of a problem that they don’t realize that their fix comes at the expense of something else. It’s like knocking a hole in one side of a boat to obtain material to patch a hole in the other side of the same boat.
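To make the contrast concrete, here is a minimal numerical sketch. The masses and radii are round illustrative values I picked, not fits to any data: a pure-Newtonian estimate from the baryons predicts that a more compact disk of the same mass rotates faster, while the observed baryonic Tully-Fisher relation assigns one velocity per mass, with no size dependence.

```python
import math

# Illustrative constants (SI units)
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
a0 = 1.2e-10           # MOND/BTFR acceleration scale, m s^-2
M_sun = 1.989e30       # kg
kpc = 3.086e19         # m

# Two hypothetical disks: same baryonic mass, different sizes,
# i.e. different surface brightness. Numbers are made up for illustration.
M_b = 1e10 * M_sun
R_compact, R_diffuse = 2 * kpc, 8 * kpc

def v_newton(R):
    """Newtonian circular speed from the enclosed baryonic mass alone."""
    return math.sqrt(G * M_b / R)

# Baryonic Tully-Fisher: v_flat^4 = G * M_b * a0, independent of radius.
v_btfr = (G * M_b * a0) ** 0.25

print(v_newton(R_compact) / 1e3)  # ~147 km/s for the compact disk
print(v_newton(R_diffuse) / 1e3)  # ~73 km/s for the diffuse disk
print(v_btfr / 1e3)               # ~112 km/s: one value per mass
```

The Newtonian speeds differ by a factor of two between the two disks; the data show no such surface-brightness residual, which is exactly the puzzle described above.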

Except they are sure. Problem solved! is what people want to hear, so that’s what they hear. Nobody bothers to double check whether the “right” answer is indeed right when it agrees with their preconceptions. And there is always someone willing to make that assertion.

43 thoughts on “Let’s just ignore it”

  1. Devolving into cultist mindsets seems to be intrinsic human nature, independent of academic credentials.

    Rejecting the reality of the luminiferous ether was not as painful as rejecting the reality of dark matter seems to be, but the economic interests behind the dark matter myth are astronomical compared to all economic interests that were behind the luminiferous ether at its peak.

    More is Different, it seems.


  2. “The kinematics know about the shape of the baryonic mass distribution, not just its total.”

    That inarguable fact and the efficacy of MOND in galactic systems make it clear that our solar-system-derived gravitational models are in fact not universal laws. The inability of the scientific academy to come to grips with the limits of our knowledge is as scientifically inexcusable as it is pervasive across theoretical physics.

    As a consequence, the two standard models of cosmology and particle physics are baroque, illogical, and physically absurd, being heavily populated with invisible entities and events, the only purpose of which is to sustain the hubristic notion that our febrile human imaginations have discerned universal laws by chicken-scratching in a minute corner of the Cosmos. We then insist that the vast Cosmos we have subsequently observed bow down to our superior wisdom. What a mess.


    1. “The kinematics know about the shape of the baryonic mass distribution, not just its total.”

      Could you elaborate on how MOND tells us ABOUT our physical reality (as opposed to just giving us a total/sum like the standard models of cosmology do)? In other words, can you expand on the ‘efficacy’ of MOND compared to the current standard as applied to cosmic systems? You have a good angle in comparing the paradigms via procedural flaws in logic versus substantive sums.


      1. MOND describes a gravitational effect, the flat rotation curves of galaxies, by amending Newtonian dynamics with a mathematical formalism related to the acceleration scale of disk galaxies. MOND does not describe the cause of the observed gravitational effect but in this it is no different than Newtonian dynamics or General Relativity, neither of which describes the cause of observed gravitational effects.

        MOND also makes predictions regarding the behavior of galactic systems that Dark Matter models have only post-dicted. Dr. McGaugh has extensively documented those predictions on this blog and also in this paper, which was linked in the above post.

        Coupled with the empirical Baryonic Tully-Fisher and Radial Acceleration Relations, MOND provides strong evidence that the failure of our gravitational models to accurately predict the rotation curves lies with the models themselves, not with the existence of some invisible substance that can be deployed as needed to make the models work properly.
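        To put one number on that: the radial acceleration relation fits the data with a single function of the acceleration predicted by the baryons alone. A quick sketch of its two limits, using the fitting function published by McGaugh, Lelli & Schombert (2016); the numerical check is mine:

```python
import math

a0 = 1.2e-10  # m s^-2, the fitted acceleration scale

def g_obs(g_newton):
    """Radial acceleration relation fit: observed acceleration as a
    function of the Newtonian acceleration from the baryons alone."""
    return g_newton / (1.0 - math.exp(-math.sqrt(g_newton / a0)))

# High accelerations: the Newtonian result is recovered.
print(g_obs(100 * a0) / (100 * a0))      # ≈ 1.00

# Low accelerations: g_obs → sqrt(g_newton * a0), the deep-MOND limit
# that yields flat rotation curves and Tully-Fisher.
gN = 0.01 * a0
print(g_obs(gN) / math.sqrt(gN * a0))    # ≈ 1.05, approaching 1 as gN → 0
```

        One smooth function, one scale, no per-galaxy freedom: that is what any dark matter model has to reproduce after the fact.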


  3. I had breakfast with a friend this morning. In a café with a view of the Alps. Everything he touches turns to gold after a while. He has a PhD in chemistry and excellent business sense.
    Regarding dark matter and MOND, one can say: there are two groups, one is big and powerful, the other small and weak.
    – one is big and mighty, the other small and slight –
    But the small group has the better arguments.
    The next big thing will be a physical theory of MOND. You will find me there.


    1. “The next big thing will be a physical theory of MOND. You will find me there.”

      I’ve heard statements like this before, and I’m never quite sure what is meant. What is a “physical theory”, and how does it differ from a “theory”?

      My next question would be: What do you prefer: a “physical” theory that you know is wrong, or an “unphysical” theory that you know is on the right track?


      1. What distinguishes a “physical theory” from a “theory”?
        Difficult question.
        I distinguish a “physical theory” from an “engineering theory.”

        I see it like this:
        In a “physical theory,” we have objects with properties that then create our world.
        To satisfy Occam’s razor, one should/must be sparing with the properties.
        The objects should/may therefore have only a few properties.
        The objects should not be able to do magic.

        In an “engineering theory,” one only has mathematics.
        The goal is to achieve an exact result.
        Whether there is a physical counterpart to the mathematical means is unimportant.
        The more mathematics one uses, the less physical everything becomes.
        One can also replace physical here with real or existent.

        Examples of physical models are then:
        1. Understanding planets as mass points is a physical theory and is suitable for the planetary system.
        2. New particles (WIMPs) are a physical theory suitable for “dark matter.”
        3. Thinking of atoms as mass points is a physical theory suitable for the kinetic theory of gases.

        Examples of engineering theories are then:
        1. epicycles in the planetary system.
        It is only essential that the position of the planets in the sky is well predicted.
        How the planets achieve this is unimportant.
        If the planets orbit the sun, this is easier for the planets.
        They need less magic than when they orbit the Earth and need epicycles.
        2. As is well known, QFT generates very accurate results.
        How the electrons achieve this is unimportant.

        I don’t know of any publication that classifies theories in this way.
        But I have not looked for it either.

        From a certain point on, all physical models certainly require
        an engineering continuation to deliver a result,
        a number with a physical unit.
        Otherwise, everything remains vague and is not even wrong.

        I immediately discard a “physical theory” that is wrong.
        … and therefore have an overflowing wastepaper basket.

        An “unphysical/engineering theory” is always on the right track, in my opinion,
        if it gives more and more accurate results.
        But an “unphysical/engineering theory” is, in my opinion, also always on the wrong path,
        because it moves further and further away from reality.
        In their idealization, mathematical objects are non-existent but very useful.

        Regarding your second question, I prefer a physical theory with little engineering.

        As can be seen above, there is only one physical model.
        Examples 1 to 3 are always the same (point masses).
        Physics has lived very well on this for the last four centuries.
        Meanwhile, it is time for something new.

        Stacy has said everything about MOND.
        And I found what was said very convincing.
        So I don’t waste a thought on dark matter. (Dark matter particles need too much magic).
        For MOND (and Newton), a physical theory might be the next step.

        My preferred candidate for a next physical model
        is a network with many nodes and edges.



        1. What if there is no base state, just feedback loops?
          The nodes as much a function of the networks, as the networks of the nodes.
          The microcosmic as much a function of the macrocosmic, as the macro is of the micro?
          Physical theory likes to dismiss space as just a function of measuring position, but the assumption the micro is most elemental is based on our experience of space. That things are little before they are big.
          It does seem every effort to pin things down has us running in circles. Maybe it’s trying to tell us something.
          We assume things start simple and become more complex, but there seems to be a cycle there, where complexity is constantly breaking down, to simpler forms, either because the forces driving it run down, or contradictions and conflicts build up. So is simplicity a base state, or just part of the cycle, of building up and breaking down?
          What is an edge, but part of a pattern? Wouldn’t there have to be an even smaller level of detail, than Planck length, or the error bars would be the same size? Maybe edges are emergent, from interactions/process?
          Is quantization emergent and we like it, because it can be measured?


          1. yes, possible. Sometimes I have thoughts like that too.
            But at the moment I’m satisfied if I can use a network to model
            three-dimensional space.


            1. Mapping and modeling reality is essential. It is what our brains are designed to do. Signals from the noise.
              The problem is when people think it’s the voice of god talking in their ear.
              Idealizations are not absolutes.
              What I find useful is to be able to understand and use multiple maps as frames of reference.
              The really crazy people are not the ones arguing with themselves, but the ones who never argue with themselves.
              Consider religious conflicts, the Israeli-Palestinian one for example: they use different timelines/narratives to map the same space.
              Three dimensions are a mapping device, like longitude, latitude and altitude. What if you have a space, described by two different maps, using slightly different vectors. Does that make it six dimensional?
              Each of us is the center point of our entire view and experience of the entire universe. Given sufficient computing power, that could presumably be modeled, since it is our personal reality. Like epicycles put the earth at the center. Yet each of us has a slightly different framework, both the three dimensions of space and the one of time.


  4. As I commented in the previous thread, I got into reading about physics as a way to understand society and the big reason I think synchronization is worth considering, as an explanation for gravity, is that it’s the dynamic working in society, where it’s more important to be on the biggest wavelength than anything else. Just look at organized religion. If the central story just hits a few points, it doesn’t matter how far fetched all the details might be. In fact, the more unbelievable the better, as it sorts the true believers from the merely social.
    It becomes that grain of sand at the center of the pearl. The black hole/rock at the center, around which everyone spins.
    It certainly is occurring in the current political situation, where the two poles of liberal and conservative grow ever more fanatic and anyone trying to avoid being drawn in has no shot of being heard.
    People want answers, not truths. That’s why there are so many priests and politicians, while the philosophers are intellectually neutered and confined to the back alleys of academia.


    1. As Max Planck said: “Science advances one funeral at a time”. Unlike you I don’t have any faith in the philosophers either (certainly not those in History and Philosophy of Science where I know enough to see through their hand-waving).


      1. I don’t know that I have much faith in philosophers, so much as observing the premise of philosophy is to get at the facts, whatever they may be. Obviously not very successfully. Stoicism seems to be the current default.
        For one thing, much of Western culture is based on the principle of ideals as absolute, from monotheism to platonic math. Ideals are NOT absolutes. How difficult is that to figure out?
        As for the one funeral at a time, it seems that by the time the next generation has moved to the forefront, they are pretty well indoctrinated.


    2. The need for truth in society is a good point. However, if the truth is what you want to know there’s no need to worry about the way society is going – for the truth is absolute and will remain the same always. And in the end, the truth will always turn up and be revealed – it’s not even depending on what we do. Regardless, for everyone it’s way better to acknowledge the truth earlier, once alerted to it. Because the other road (ignoring it), when unconditionally followed, turns into madness. Thus arguing for the truth can save people from this madness, even if they only give in slightly.


      1. The truth is that what is good for the fox, is bad for the chicken and you are never going to get them both to accept the same ideals.
        To culture, good and bad are some cosmic conflict between the forces of righteousness and evil, while in nature, it’s the basic biological binary of beneficial and detrimental. The 1/0 of sentience. That’s because it is the function of culture to synchronize society as one larger organism, using the same languages, rules and measures, but the totem at the center of one social entity is not the same as the totem at the center of other social entities. It’s like galaxies all have their own black holes at the center.
        Nodes and networks, organisms and ecosystems. Synchronization and harmonization. Focus and equilibrium.


  5. Stacy, are you going to write anything about the new paper by Sargent, Deur, and Terzic, “Hubble Tension and Gravitational Self-Interaction” arXiv:2301.10861 (January 25, 2023)?


    1. Yes, I wonder also about Stacy’s take on GR SI. How well do the generic statements on several galaxy types hold up? That (almost) spherically symmetric galaxies appear to have less additional gravity, and so on?


      1. I will have more to say about the Hubble tension soon. I do think it possible, even plausible at this point, that it represents some divergence from pure FLRW. Whether this is the right way to do it is beyond my capacity to comment at present. There are many things to take care about. For example, spherical galaxies needing “less additional gravity” is true, from a certain point of view: the centers of elliptical galaxies are usually in the Newtonian regime, so don’t clearly show the excess traced by the more extended rotation curves of spiral galaxies. Whether they would do so if traced far enough out, well, yes, mostly they do, within the large uncertainties.


  6. “Why let facts get in the way of a fancy idea?” Because it was ever thus: people cling to their ideas until and unless what they consider to be a more compelling idea arrives. But at least in this instance scientific folly is killing no one, in stark contrast to how science misanalysed scurvy.

    Scurvy has killed untold millions, and the knowledge that it is a dietary problem has been discovered and lost many times. So in the 18th century, when the Royal Navy was losing some 50% of its crews on long sea voyages to death by scurvy, the news that it could be prevented by eating lemons spread like wildfire through the fleet. By the time of Nelson, the practice of eating lemons was firmly established, and scurvy was eliminated from the Royal Navy, enabling stunning navy victories. But medical theory moves on, and it proceeded to decide that scurvy was caused by disease. So the practice of eating lemons lapsed, and scurvy returned to the Royal Navy, the causes for which were debated by learned physicians late into the night. Never, to turn a phrase, were so many killed by so few.

    Lacking a “vitamin c” moment, don’t expect the debate around “dark matter” to become any saner.


      1. Yes, that’s the takeaway from the scurvy story. Doctors were so captivated by the theory that scurvy was a disease that they dismissed the experimental evidence that it was a dietary problem. Fast forward to MOND vs. LCDM, and history seems to rhyme!


  7. I noticed the comments “Stacy, are you going to write anything about the new paper by Sargent, Deur, and Terzic, “Hubble Tension and Gravitational Self-Interaction” arXiv:2301.10861 (January 25, 2023)?” and “Yes, I wonder also about Stacy’s take on GR SI. How well do the generic statements on several galaxy types hold up?”

    I tried thinking about that idea, and read the papers. I’m not an expert on GR, but I know enough about low-energy particle (graviton-based) theory to say it’s an extremely tricky subject that clearly requires somebody extremely versed in both classical GR and particle-based theory to get it right. Deur’s articles leave me with the impression that the logic and the math need lots of checking, peer-reviewed or not … with reviewing by unqualified “peers” being rather likely. Or is it hand-waving?

    Stacy … your take?


  8. Old, discredited theories (phlogiston, epicycles, aether, etc.) are taught in school as if that sort of thinking was done by foolish people, and today’s scientists are sooo much smarter. It’s not stated overtly, but that is the underlying message, in my opinion.
    But it just aint so. Scientists (amateur and professional) are, always have been, and always will be (unless evolution increases our intelligence significantly) just as prone to mistaken views, and stubborn adherence to them in spite of the evidence, as were the ancients.
    I’m not saying Sabine is wrong, I’m just saying that it is NOT new; it has always been this way. However, given enough time, real science does self-correct. It just can take a lot of funerals to do it. It may not seem so to those in the middle of such things, as Sabine and Stacy are, but things will sort out eventually.
    As long as there is no government or religious body dictating what is “correct” science then all that is really needed is persistence and patience, neither of which are actually easy.
    I also would like to see things improve faster but there isn’t likely to be a way to do that.


    1. I agree, the way it is taught at school is usually presented too simply. Smart or stupid, or whatever the contrast presented is, for those few who want to do real science these stories are way too superficial.


    2. Trial and error.
      It seems the problem isn’t so much the religion, or government, as it is the bureaucracy of large institutions, that can’t steer nimbly and therefore cannot easily admit error, but can easily run over anyone who disagrees.
      They start off under the innovators giving the initial framework, then the managers take over from there. Think Steve Jobs, then Tim Cook. Then after going through all potential, it’s just those who cannot imagine any alternative, so have to police the edges. Meanwhile the directional control, leadership, devolves into some sort of parody or pantomime of what worked in the past.


    1. I haven’t read the complete paper, or other papers related to it, but this approach to solving the dark sector mysteries is based on “Refracted Gravity”, which I had never heard about before. In this hypothesis gravitons, which mediate the gravitational force, are slowed by a gravitational potential below the speed of light, so can undergo refraction like photons passing between different media, if I understand it correctly. Apparently it is consistent with GR’s curved space-time paradigm, but unlike GR can explain dark sector puzzles. It sounds very interesting, but it’s something experts need to weigh in on.


      1. In my previous comment I jumped to too simple an assessment of how the Refracted Gravity hypothesis works, where I thought it had something to do with a variable light speed. In the paper titled “Dynamics of galaxies and clusters in refracted gravity”, by Titos Matsakos and Antonaldo Diaferio, the mechanism involves a proposed “gravitational permittivity”. To quote a part of the abstract of the paper: “Inspired by the behavior of electric fields in matter, refracted gravity introduces a gravitational permittivity that depends on the local mass density and modifies the standard Poisson equation”.


        1. Yikes, I goofed in the previous comment. I meant to say variable speed of gravity not “variable light speed”. It’s way past my bedtime, I’m half asleep.


    2. I had not. Any theory has to reproduce the successes of MOND, which is closely tied to the acceleration scale. This is most closely reproduced by a surface density (not a volume density). Even so, one has to go through the Poisson equation to relate them, and acceleration appears to be more fundamental: the empirical correlations revolve more around it than the corresponding surface density scale. I would have to read and digest the specific paper to say more.
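    The correspondence between the two scales is just dimensional analysis; a quick back-of-the-envelope sketch (round numbers, order-of-magnitude only, not a derivation):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
a0 = 1.2e-10      # acceleration scale, m s^-2
M_sun = 1.989e30  # kg
pc = 3.086e16     # m

# An acceleration scale divided by G has the dimensions of a surface density.
sigma = a0 / (2 * math.pi * G)          # kg m^-2
sigma_astro = sigma / (M_sun / pc**2)   # convert to M_sun per pc^2
print(sigma_astro)  # ~137 M_sun/pc^2, comparable to disk central surface densities
```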


      1. What specifically do you mean by surface density in the current context, the surface density of the sphere near the galactic core or the surface density of the circumference at the far reaches of the disk? Physically, it would seem that close to the core the spherical surface density relation would dominate but as the orbital radius under consideration moved outward in the disk plane the spherical 1/r^2 relation would gradually give way to the 1/r circumferential relation of the disk geometry.

        It seems to me that MOND manages that transition mathematically without treating the geometry of the system, the onset of the low acceleration regime being just the point at which the disk geometry becomes predominant. That may be more convenient mathematically but I don’t see how it makes MOND more fundamental in a physically meaningful way.

        As far as refracted gravity goes it would seem they are working hard to reinvent the wheel by attributing to the unobserved gravitational field characteristics that the electromagnetic field already possesses:

        The Radiation Density Gradient II


      2. I had not. Any theory has to reproduce the successes of MOND, which is closely tied to the acceleration scale. This is most closely reproduced by a surface density (not a volume density).

        Could you blog on the surface density, since creating a gravity theory from a surface density may be easier than from an acceleration scale?


  9. professor McGaugh,

    have you heard of refracted gravity?

    instead of the acceleration-based MOND a0, refracted gravity theorists propose a density-based modification, and by proposing density rather than acceleration they suggest it is easier to generalize

    with Refracted Gravity, a novel classical theory of gravity introduced in 2016, where the modification of the law of gravity is instead regulated by a density scale.

    Dark Coincidences: Small-Scale Solutions with Refracted Gravity and MOND
    Valentina Cesare

    General relativity and its Newtonian weak field limit are not sufficient to explain the observed phenomenology in the Universe, from the formation of large-scale structures to the dynamics of galaxies, with the only presence of baryonic matter. The most investigated cosmological model, the ΛCDM, accounts for the majority of observations by introducing two dark components, dark energy and dark matter, which represent ∼95% of the mass-energy budget of the Universe. Nevertheless, the ΛCDM model faces important challenges on the scale of galaxies. For example, some very tight relations between the properties of dark and baryonic matters in disk galaxies, such as the baryonic Tully-Fisher relation (BTFR), the mass discrepancy-acceleration relation (MDAR), and the radial acceleration relation (RAR), which see the emergence of the acceleration scale a0≃1.2×10−10 m s−2, cannot be intuitively explained by the CDM paradigm, where cosmic structures form through a stochastic merging process. An even more outstanding coincidence is due to the fact that the acceleration scale a0, emerging from galaxy dynamics, also seems to be related to the cosmological constant Λ. Another challenge is provided by dwarf galaxies, which are darker than what is expected in their innermost regions. These pieces of evidence can be more naturally explained, or sometimes even predicted, by modified theories of gravity, that do not introduce any dark fluid. I illustrate possible solutions to these problems with the modified theory of gravity MOND, which departs from Newtonian gravity for accelerations smaller than a0, and with Refracted Gravity, a novel classical theory of gravity introduced in 2016, where the modification of the law of gravity is instead regulated by a density scale.

    Comments: 34 pages, 7 figures, published on 16th January 2023 in Universe 2023, 9(1), 56, in the Special Issue “Modified Gravity and Dark Matter at the Scale of Galaxies”; accepted for publication on 12th January 2023
    Subjects: Astrophysics of Galaxies (astro-ph.GA)
    Cite as: arXiv:2301.07115 [astro-ph.GA]


    1. One of the things I enjoy about it is that it does explain why spherical systems appear to have less gravitational discrepancy.


      1. [Submitted on 23 Sep 2021 (v1), last revised 25 Sep 2021 (this version, v2)]
        Covariant Formulation of refracted gravity
        Andrea Pierfrancesco Sanna, Titos Matsakos, Antonaldo Diaferio

        We propose a covariant formulation of refracted gravity (RG), a classical theory of gravity based on the introduction of the gravitational permittivity — a monotonic function of the local mass density — in the standard Poisson equation. The gravitational permittivity mimics the dark matter phenomenology.

        Since Ξ plays a role roughly similar to the cosmological constant Λ in the standard model and has a comparable value, CRG suggests a natural explanation of the known relation a0∼Λ1/2. CRG thus appears to describe both the dynamics of cosmic structure and the expanding Universe with a single scalar field, and falls within the family of models that unify the two dark sectors, highlighting a possible deep connection between phenomena currently attributed to dark matter and dark energy separately.

        Comments: 16 pages, 1 figure, small change in the title, submitted to Physical Review D
        Subjects: Cosmology and Nongalactic Astrophysics (astro-ph.CO); Astrophysics of Galaxies (astro-ph.GA); General Relativity and Quantum Cosmology (gr-qc); High Energy Physics – Theory (hep-th)
        Cite as: arXiv:2109.11217 [astro-ph.CO]

        Dynamics of DiskMass Survey galaxies in refracted gravity
        Valentina Cesare, Antonaldo Diaferio, Titos Matsakos, Garry Angus

        We test if Refracted Gravity (RG) can describe the dynamics of disk galaxies without resorting to dark matter. RG is a classical theory of gravity where the standard Poisson equation is modified by the gravitational permittivity, ϵ, a universal monotonic function of the local mass density. We use the rotation curves and the vertical velocity

        arXiv:2003.07377 [astro-ph.G
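        The numerical coincidence a0 ∼ Λ^1/2 quoted above is easy to check to order of magnitude (my own rough numbers, not from the paper):

```python
import math

c = 2.998e8      # speed of light, m/s
H0 = 2.27e-18    # Hubble constant, s^-1 (~70 km/s/Mpc)
Lam = 1.1e-52    # cosmological constant, m^-2 (approximate)
a0 = 1.2e-10     # MOND acceleration scale, m s^-2

print(c * H0 / (2 * math.pi))       # ~1.1e-10, close to a0
print(c**2 * math.sqrt(Lam / 3))    # ~5e-10, same order of magnitude as a0
```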


  10. It’s interesting how both Alexandre Deur and Titos Matsakos/Antonaldo Diaferio invoke analogies with known Standard Model physics; QCD behavior and electric field properties respectively, in their efforts to solve the dark sector problem in astrophysics. This strategy is evidently a fruitful way to advance our knowledge of nature’s workings, and I believe has been used in the past.


    1. Titos Matsakos/Antonaldo Diaferio

      we know MOND fails in galaxy clusters and neutrinos probably won’t have enough mass

      refracted Gravity, a novel classical theory of gravity introduced in 2016, where the modification of the law of gravity is instead regulated by a density scale. refracted gravity (RG), a classical theory of gravity based on the introduction of the gravitational permittivity

      a modification of the law of gravity regulated by a density scale might succeed in galaxy clusters where MOND fails


      1. It’s very encouraging and refreshing that the RG and SI research programs are challenging the dominant Dark Matter/Energy paradigm. The scientists developing these theories correctly understand that MOND phenomenology can’t just be swept under the rug, and have devised physical mechanisms to explain the “coincidences” inherent in that phenomenology. Determining whether either of these approaches, or some other one, underlie the physics of our universe will undoubtedly test the ingenuity of theorists and astronomical data gathering technology alike.

