We are visual animals. What we see informs our perception of the world, so it often helps to make a sketch to conceptualize difficult material. When first confronted with MOND phenomenology in galaxies that I had been sure were dark matter dominated, I made a sketch to organize my thoughts. Here is a scan of the original dark matter tree that I drew on a transparency (pre-PowerPoint!) in 1995:

The original dark matter tree.

At the bottom are the roots of the problem: the astronomical evidence for mass discrepancies. From these grow the trunk, which splits into categories of possible solutions, which in turn branch into ever more specific possibilities. Most of these items were already old news at the time: I was categorizing, not inventing. Indeed, some things have been rebranded over time without changing all that much, with strange nuggets now being known as macros (a generalization to describe dark matter candidates of nuclear density) and asymmetric gravity becoming MOG. The more things change, the more they stay the same.

I’ve used this picture many times in talks, both public and scientific. It helps to focus the mind. I updated it for the 2012 review Benoit Famaey wrote (see our Fig. 1), but I don’t think I really improved on the older version, which Don Lincoln had adapted for the cover illustration of an issue of Physics Teacher (circa 2013), with some embellishment by their graphic artists. That’s pretty good, but I prefer my original.

Though there is no lack of buds on the tree, there have certainly been more ideas for dark matter candidates over the past thirty years, so I went looking to see if someone had attempted a similar exercise to categorize, or at least corral, all the ideas people have considered. Tim Tait made one such figure, but you have to already be an expert to make any sense of it, it being a sort of Venn diagram of the large conceptual playground that is theoretical particle physics.

There is also this recent figure by Bertone & Tait:

This is nice: well organized and pleasantly symmetric, and making good use of color to distinguish different types of possibilities. One can recognize many of the same names from the original tree like MACHOs and MOND, along with newer, related entities like Macros and TeVeS. Interestingly, WIMPs are not mentioned, despite dominating the history of the field. They are subsumed under supersymmetry, which is now itself just a sub-branch of weak-scale possibilities rather than the grand unified theory of manifest inevitability that it was once considered to be. It is a sign of how far we have come that the number one candidate, the one that remains the focus of dozens of large experiments, doesn’t even come up by name. It is also a sign of how far we have yet to go that it seems preferable to many to invent new dark matter candidates than take seriously alternatives that have had much greater predictive success.

A challenge one faces in doing this exercise is to decide which candidates deserve mention, and which are just specific details that should be grouped under some more major branch. As a practical matter, it is impossible to wedge everything in, nor does every wild idea we’ve ever thought up deserve equal mention: Kaluza-Klein dark matter is not a coequal peer to WIMPs. But how can we be fair in making that call? It may not be possible.

I wanted to see how the new diagram mapped to the old tree, so I chopped it up and grafted each piece onto the appropriate branch of the original tree:

New blossoms on the old dark matter tree.

This works pretty well. It looks like the tree has blossomed with more ideas, which it has. There are more possibilities along well-established branches, and entirely new branches that I could only anticipate with question marks that allowed for the possibility of things we had not yet thought up. The tree is getting bushy.

Ultimately, the goal is not to have an ever bushier tree, but rather the opposite: we want to find the right answer. As an experimentalist, one wants to either detect or exclude specific dark matter candidates. As a scientist, I want to apply the wealth of observational knowledge we have accumulated like a chainsaw in the hands of an overzealous gardener to hack off misleading branches until the tree has been pruned down to a single branch, the one (and hopefully only one) correct answer.

As much as I like Bertone & Tait’s hexagonal image, it is very focused on ideas in particle physics. Five of the six branches are various forms of dark matter, while the possibility of modified gravity is grudgingly acknowledged in only one. It is illustrated in a dull grey that is unlike the bright, cheerful colors granted to the various flavors of dark matter candidates. To be sure, there are more ideas for solutions to the mass discrepancy problem from particle physics than from anywhere else, but that doesn’t mean they all deserve equal mention. One looking at this diagram might get the impression that the odds of dark matter:modified gravity are 5:1, which seems at once both biased against the latter and yet considerably more generous than its authors likely intended.

There is no mention at all of the data at the roots of the problem. That is all subsumed in the central DARK MATTER, as if we’re looking down at the top of the tree and recognize that it must have a central trunk, but cannot see its roots. This is indeed an apt depiction of the division between physics and astronomy. Proposed candidates for dark matter have emerged primarily from the particle physics community, which is what the hexagon categorizes. It takes for granted the evidence for dark matter, which is entirely astronomical in nature. This is not a trivial point; I’ve often encountered particle physicists who are mystified that astronomers have the temerity to think they can contribute to the dark matter debate despite 100% (not 90%, nor 99%, nor even 99.9%, but 100%) of the evidence for mass discrepancies stemming from observations of the sky. Apparently, our job was done when we told them we needed something unseen, and we should remain politely quiet while the Big Brains figure it out.

For a categorization of solutions, I suppose it is tolerable, if dangerously divorced from the origins of the problem, to leave off the evidence. There is another problem with placing DARK MATTER at the center. This is a linguistic problem that raises deep epistemological issues that most scientists working in the field rarely bother to engage with. Words matter; the names we use frame how we think about the problem. By calling it the dark matter problem, we presuppose the answer. A more appropriate term might be mass discrepancy, which was in use for a while by more careful-minded people, but it seems to have fallen into disuse. Dark matter is easier to say and sounds way more cool.

Jacob Bekenstein pointed out that an even better term would be acceleration discrepancy. That’s what we measure, after all. The centripetal acceleration in spiral galaxies exceeds that predicted by the observed distribution of visible matter. Mass is an inference, and a sloppy one at that: dynamical data only constrain the mass enclosed by the last measured point. The total mass of a dark matter halo depends on how far it extends, which we never observe because the darn stuff is invisible. And of course we only infer the existence of dark matter by assuming that the force law is correct. That gravity as taught to us by Einstein and Newton should apply to galaxies seems like a pretty darn good assumption, but it is just that. By calling it the dark matter problem, we make it all about unseen mass and neglect the possibility that the inference might go astray with that first, basic assumption.
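To make the size of the effect concrete, here is a minimal sketch with made-up but representative numbers (an assumption for illustration, not a fit to any real galaxy). It compares the centripetal acceleration the orbits require with the Newtonian acceleration the visible mass can provide:

```python
# Minimal sketch with illustrative numbers (not a fit to any real galaxy):
# compare the centripetal acceleration the orbits require, V^2/R, with the
# Newtonian acceleration the visible (baryonic) mass provides, ~GM/R^2.
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30         # solar mass, kg
kpc = 3.086e19           # kiloparsec, m

V_obs = 150e3            # m/s -- hypothetical flat rotation speed
R = 20 * kpc             # m   -- hypothetical outermost measured radius
M_baryon = 1e10 * M_sun  # kg  -- hypothetical enclosed stars + gas

g_obs = V_obs**2 / R           # what the orbits require
g_bar = G * M_baryon / R**2    # what the visible mass provides (point-mass approximation)

print(f"g_obs = {g_obs:.2e} m/s^2")
print(f"g_bar = {g_bar:.2e} m/s^2")
print(f"discrepancy factor = {g_obs / g_bar:.1f}")
```

Treating the enclosed baryons as a point mass is crude for a disk, but the order-of-magnitude gap between the two numbers is the discrepancy in question.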

So I’ve made a new picture, placing the acceleration discrepancy at the center where it belongs. The astronomical observations that inform the problem are on the vertical axis while the logical possibilities for physics solutions are on the horizontal axis. I’ve been very spare in filling in both: I’m trying to trace the logical possibilities with a minimum of bias and clutter, so I’ve retained some ideas that are pretty well excluded.

For example, on the dark matter side, MACHOs are pretty well excluded at this point, as are most (all?) dark matter candidates composed of Standard Model particles. Normal matter just doesn’t cut it, but I’ve left that sector in as a logical possibility that was considered historically and shouldn’t be forgotten. On the dynamical side, one of the first thoughts is that galaxies are big, so perhaps the force law changes at some appropriate scale much larger than the solar system. At this juncture, we have excluded all modifications to the force law that are made at a specific length scale.

The acceleration discrepancy diagram.

There are too many lines of observational evidence to do justice to here. I’ve lumped an enormous amount of it into a small number of categorical bins. This is not ideal, but some key points are at least mentioned. I invite the reader to try doing the exercise with pencil and paper. There are serious limits imposed by what you can physically display in a font the eye can read with a complexity limited to that which does not make the head explode. I fear I may already be pushing both.

I have made a split between dynamical and cosmological evidence. These tend to push the interpretation one way or the other, as hinted by the colors. Which way one goes depends entirely on how one weighs rather disparate lines of evidence.

I’ve also placed the things that were known from the outset of the modern dark matter paradigm closer to the center than those that were not. That galaxies and clusters of galaxies needed something more than meets the eye was known, and informed the need for dark matter. That the dynamics of galaxies over a huge range of mass, size, surface brightness, gas fraction, and morphology are organized by a few simple empirical relations was not yet known. The Baryonic Tully-Fisher Relation (BTFR) and the Radial Acceleration Relation (RAR) are critical pieces of evidence that did not inform the construction of the current paradigm, and are not satisfactorily explained by it.
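For readers who want those empirical relations in functional form, here is a hedged sketch: the BTFR written as M_b = A V_f^4, and the RAR written with the fitting function of McGaugh, Lelli & Schombert (2016). The normalization A ≈ 47 M_sun km^-4 s^4 and the scale g† = 1.2×10^-10 m/s^2 are quoted as assumptions for illustration.

```python
import numpy as np

# Sketch of the two empirical relations named in the text; the functional
# forms and numerical values are quoted as assumptions, not derived here.
a0 = 1.2e-10  # m/s^2, the characteristic acceleration scale

def rar(g_bar):
    """Radial Acceleration Relation: observed centripetal acceleration as a
    function of the Newtonian acceleration of the baryons alone, using the
    fitting function of McGaugh, Lelli & Schombert (2016)."""
    return g_bar / (1.0 - np.exp(-np.sqrt(g_bar / a0)))

def btfr_mass(V_flat_kms, A=47.0):
    """Baryonic Tully-Fisher Relation: baryonic mass (solar masses) from the
    flat rotation speed (km/s), M_b = A * V^4, with A ~ 47 assumed."""
    return A * V_flat_kms**4

g_bar = np.logspace(-13, -8, 6)              # span the low- to high-acceleration regimes
print(np.column_stack([g_bar, rar(g_bar)]))  # g_obs -> g_bar at high g, sqrt(g_bar*a0) at low g
print(f"BTFR mass for V_flat = 150 km/s: {btfr_mass(150):.2e} M_sun")
```

The point of the sketch is simply that both relations tie the observed dynamics directly to the observed distribution of baryons.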

Similarly for cosmology, the non-baryonic cold dark matter paradigm was launched by the observation that the dynamical mass density apparently exceeds that allowed for normal matter by primordial nucleosynthesis. This, together with the need to grow the observed large scale structure from the very smooth initial condition indicated by the cosmic microwave background (CMB), convinced nearly everyone (including myself) that there must be some new form of non-baryonic dark matter particle outside the realm of the Standard Model. Detailed observations of the power spectra of both galaxies and the CMB are important corroborating observations that did not yet exist at the time the idea took hold. We also got our predictions for these things very wrong initially, hence the need to change from Standard CDM to Lambda CDM.

Most of the people I have met who work on dark matter candidates seem to be well informed of cosmological constraints. In contrast, their knowledge of galaxy dynamics often seems to start and end with “rotation curves are flat.” There is quite a lot more to it than that. But, by and large, they stopped listening at “therefore we need dark matter” and were off and running with ideas for what it could be. There is a need to reassess the viability of these ideas in the light of the BTFR and the RAR.

People who work on galaxy dynamics are concerned with the obvious connections between dynamics and the observed stars and are inclined to be suspicious of the cosmological inference requiring non-baryonic dark matter. Over the years, I have repeatedly been approached by eminent dynamicists who have related in hushed tones, lest the cosmologists overhear, that the dark matter must be baryonic. I can understand their reticence, since I was, originally, one of those people they didn’t want to have overhear. Baryonic dark matter was crazy – we need more mass than is allowed by big bang nucleosynthesis! I usually refrained from raising this issue, as I have plenty of reasons to sympathize, and try to be a sympathetic ear even when I don’t. I did bring it up in an extended conversation with Vera Rubin once, who scoffed that the theorists were too clever by half. She reckoned that if she could demonstrate that Ωm = 1 in baryons one day, they would have somehow fixed nucleosynthesis by the next. Her attitude was well-grounded in experience.

A common attitude among advocates of non-baryonic dark matter is that the power spectrum of the CMB requires its existence. Fits to the data require a non-baryonic component at something like 100 sigma. That’s pretty significant evidence.

The problem with this attitude is that it assumes General Relativity (GR). That’s the theory in which the fits are made. There is, indeed, no doubt that the existence of cold dark matter is required in order to make the fits in the context of GR: it does not work without it. To take this as proof of the existence of cold dark matter is entirely circular logic. Indeed, that we have to invent dark matter as a tooth fairy to save GR might be interpreted as evidence against it, or at least as an indication that there might exist a still more general theory.

Nevertheless, I do have sympathy for the attitude that any idea that is going to work has to explain all the data – including both dynamical and cosmological evidence. Where one has to be careful is in assuming that the explanation we currently have is unique – so unique that no other theory could ever conceivably explain it. By that logic, MOND is the only theory that uniquely predicted both the BTFR and the RAR. So if we’re being even-handed, cold dark matter is ruled out by the dynamical relations identified after its invention at least as much as its competitors are excluded by the detailed, later measurement of the power spectrum of the CMB.

If we believe all the data, and hold all theories to the same high standard, none survive. Not a single one. A common approach seems to be to hold one’s favorite theory to a lower standard. I will not dignify that with a repudiation. The challenge with data, both astronomical and cosmological, is figuring out what to believe. It has gotten better, but you can’t rely on every measurement being right, or – harder to bear in mind – on its actually measuring what you want it to measure. Do the orbits of gas clouds in spiral galaxies trace the geodesics of test particles in perfectly circular motion? Does the assumption of hydrostatic equilibrium in the intracluster medium (ICM) of clusters of galaxies provide the same tracer of the gravitational potential as dynamics? There is an annoying offset in the acceleration scale measured by these two distinct methods. Is it real, or some systematic? It seems to be real, but it is also suspicious for appearing exactly where the change in method occurs.
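The two tracers mentioned above turn observations into accelerations in different ways, and a minimal sketch with invented inputs makes the contrast explicit: circular motion gives g = V^2/R, while hydrostatic equilibrium of the hot ICM gives g from the gas temperature and the logarithmic slopes of its density and temperature profiles. The numbers below are illustrative assumptions, not measurements.

```python
# Sketch (invented inputs) of the two acceleration estimators: circular motion
# for a disk galaxy versus hydrostatic equilibrium for the hot intracluster gas.
k_B = 1.381e-23   # Boltzmann constant, J/K
m_p = 1.673e-27   # proton mass, kg
mu = 0.6          # mean molecular weight of the ICM (assumed)
kpc = 3.086e19    # m

def g_rotation(V, R):
    """Centripetal acceleration of a tracer on a circular orbit."""
    return V**2 / R

def g_hydrostatic(T, r, dlnn_dlnr, dlnT_dlnr):
    """Acceleration implied by hydrostatic equilibrium of an ideal gas:
    g = -(k_B*T / (mu*m_p*r)) * (dln n/dln r + dln T/dln r)."""
    return -(k_B * T / (mu * m_p * r)) * (dlnn_dlnr + dlnT_dlnr)

# hypothetical spiral galaxy: V = 200 km/s at R = 30 kpc
print(f"rotation:    {g_rotation(200e3, 30 * kpc):.2e} m/s^2")
# hypothetical cluster: 5 keV gas at 500 kpc, density falling roughly as r^-2
T_K = 5e3 * 1.602e-19 / k_B   # 5 keV converted to Kelvin
print(f"hydrostatic: {g_hydrostatic(T_K, 500 * kpc, -2.0, 0.0):.2e} m/s^2")
```

If either the circular-motion assumption or the hydrostatic assumption fails at some level, the inferred accelerations shift, which is why an offset appearing exactly at the change of method invites suspicion.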

The characteristic acceleration scale in extragalactic systems as a function of their observed baryonic mass. This is always close to the ubiquitous scale of 10^-10 m/s/s first recognized by Milgrom. There is a persistent offset for clusters of galaxies that occurs where we switch from dynamical to hydrostatic tracers of the potential (Fig. 48 from Famaey & McGaugh 2012).

One will go mad trying to track down every conceivable systematic. Trust me, I’ve done the experiment. So an exercise I like to do is to ask what theory minimizes the amount of data I have to ignore. I spent several years reviewing all the data in order to do this exercise when I first got interested in this problem. To my surprise, it was MOND that did best by this measure, not dark matter. To this day, clusters of galaxies remain the most problematic for MOND in having a discrepant acceleration scale – a real problem that we would not hesitate to sweep under the rug if dark matter suffered it. For example, the offset the EAGLE simulation requires to [sort of] match the RAR is almost exactly the same amplitude as what MOND needs to match clusters. Rather than considering this to be a problem, they apply the required offset and call it natural to have missed by this much.

Most of the things we call evidence for dark matter are really evidence for the acceleration discrepancy. A mental hang up I had when I first came to the problem was that there’s so much evidence for dark matter. That is a misstatement stemming from the linguistic bias I noted earlier. There’s so much evidence for the acceleration discrepancy. I still see professionals struggle with this, often citing results as being contradictory to MOND that actually support it. They seem not to have bothered to check, as I have, and are content to repeat what they heard someone else assert. I sometimes wonder if the most lasting contribution to science made by the dark matter paradigm is as one giant Asch conformity experiment.

If we repeat today the exercise of minimizing the amount of data we have to disbelieve, the theory that fares best is the Aether Scalar Tensor (AeST) theory of Skordis & Zlosnik. It contains MOND in the appropriate limit while also providing an excellent fit to the power spectrum of galaxies and the CMB (see also the updated plots in their paper). Hybrid models struggle to do both while the traditional approach of simply adding mass in new particles does not provide a satisfactory explanation of the MOND phenomenology. They can be excluded unless we indulge in the special pleading that invokes feedback or other ad hoc auxiliary hypotheses. Similarly, more elaborate ideas like self-interacting dark matter were dead on arrival for providing a mechanism to solve the wrong problem: the cores inferred in dark matter halos are merely a symptom of the more general MONDian phenomenology; the proposed solution addresses the underlying disease about as much as a band-aid helps an amputation.

Does that mean AeST is the correct theory? Only in the sense that MOND was the best theory when I first did this exercise in the previous century. The needle has swung back and forth since then, so it might swing again. But I do hope that it is a step in a better direction.

105 thoughts on “Artistic license with the dark matter tree”

  1. “particle physicists who are mystified that astronomers had the temerity of think they can contribute to the dark matter debate”

    This is essentially the same as claiming that:

    -particle physicists who are mystified that condensed matter physicists had the temerity of think they can contribute to the room temperature superfluidity debate.

    Or:

    -particle physicists who are mystified that biologists had the temerity of think they can contribute to the evolution debate.

    For a hammer everything looks like a nail, and for particle physicists everything can be reduced to elementary particle interactions; particle physicists’ mindsets are the embodiment of naive reductionism.

    It is not by chance that P. Anderson wrote More is Different in response to particle physicists’ smugness and naivete.


  2. Hi Stacy, nice post on how the DM options change to stay the same over the decades. I also liked your point on how simply calling it a dark matter problem shifts the focus away from what the discrepancy is and exactly where it appears. In all fairness though, it is as much an acceleration discrepancy as it is a mass discrepancy: the masses are only discrepant when baryon-inferred masses are compared to the masses required by observed accelerations, just as the observed accelerations only become discrepant when compared against standard gravity expectations derived from the luminosity-derived masses. But yes, calling it an acceleration discrepancy shines light on the non-inevitability of dark matter as an answer.

    Regarding classifications and nice representations of the problem, you MUST look at the work of a philosopher of science colleague here in Mexico, Mariana Espinosa. I am including here one of her gravity apple trees which I am sure you will like, together with a not so recent paper of hers on the subject. She has now done a lot on categorising mass discrepancy solutions, from a philosophy of science view point, separating the ad hoc from the testable and so on. The image appears also as supplementary material in the paper.

    https://iopscience.iop.org/article/10.1088/1742-6596/600/1/012050

    Best, X.


      1. Looking at this amazingly elaborate pictorial depiction of gravitational theories since Newton makes one realize that theory development has much in common with biological evolution.


          1. I was initially thinking in terms of different biological species branching into subspecies, some of which die off while others continue to thrive. This also applies to theories in physics where some concepts turn out not to be viable, while others stand the test of time. I had never heard of Jay Gould’s Punctuated Equilibrium hypothesis, but the words immediately suggested he was referring to species like crocodiles that have maintained their morphological form since the age of the dinosaurs. Checking the Wikipedia entry on this, sure enough, that’s more or less the idea. His model postulates long term morphological stability punctuated by sudden bursts of evolutionary change, in contrast with Darwin’s gradualism with small incremental changes. This definitely has its parallel in science where the scientific community undergoes major paradigm shifts like Quantum Mechanics and Relativity, completely altering our understanding of nature. But it’s too early in the morning to do much thinking on my first Java jolt, so I’ll leave it at that.


            1. The academics do tend to speciate among their favored models, but the evolutionary record does tend to be a mixture of gradualism and punctuated equilibrium.
              The premise of punctuated equilibrium is that the record tends to be relatively stable, with some degree of branching/gradualism, but then breaks occur, the punctuations. These can be due to any number of reasons, from the meteor/volcanism/whatever taking out the dinosaurs, to basic resource depletion, changing climate, etc.
              Essentially basic cycles of expansion and consolidation. The equilibrium phase tends to select for specialization and complexity, as every niche is filled and every resource used. Then the punctuation selects for adaptability and resilience, as the rapidly changing situation edits out those unable to adapt. Leaving another set of resources to encourage further evolution.
              In current physical theory, certainly every possible niche is being explored, but framed by some essential paradigms that are beyond question. Can the right connections/patches be made within this framework, or not?
              The political fact is that those with the most authority to set the agenda are necessarily those most committed to the current framework. The problem this creates is that, if the model doesn’t solve the issues it sets out to, younger, less committed individuals will keep pushing at the cracks, setting up that age-old tension between authority/tradition and renewal. Punctuated equilibrium.


              1. “Leaving another set of resources to encourage further evolution.”

                Keeping in mind, those first in line tend to set the pattern, and are thus less inclined to further adaptation.


  3. Someone mentioned to me today that Joe Rogan queried Neil DeGrasse Tyson recently about the viability of the Big Bang Theory, given some of the data coming out of the James Webb. While I didn’t go check it out, he did mention Tyson wasn’t very open minded about it.
    Safe to say, Rogan has a larger, younger audience than Tyson.
    Along with Peter Woit’s most recent posting, a rant about the wormhole in a bottle media hype, it seems to me the cosmology and theoretical physics fields won’t have decades more to debate these issues in relative academic security and obscurity.
    Some of the more obvious issues need to be addressed, such as time is not a physical dimension of events. Cause becomes effect.
    That BBT still explicitly uses lightspeed as the denominator.
    That the entire physics debate has revolved around reductionist atomism for what, 2300 years and it seems the elephant in the room is positive and negative charge.
    The technical term for these sorts of feedback loops is, “drinking your own bathwater.”


    1. The concept of universe expansion and the big bang itself is likely to stay unscathed, as it is derived from general relativity and MOND-like modifications to general relativity.

      But everything else about the standard cosmological model is likely to change radically in the next few decades, due to all the tensions building up in virtually every other part of the Lambda CDM. I’d see it likely that the cosmological principle itself and the FLRW metric will be discarded, which would mean that the Hubble tension and the S8 tension would get resolved, but would also mean that many things that society currently believes about cosmology will be overturned, and the consequences of a lack of the FLRW metric will have to be taken care of by a new generation of cosmologists. Getting rid of the cosmological principle would also get rid of the need for dark energy, since the “acceleration” could also be explained by inhomogeneities in the universe. Dark matter as a whole is likely dead due to the recent high redshift galaxies discoveries by the JWST, and might be replaced by something like MOND.

      As for theoretical physics, one area that’s probably going to die in the near future is the entire field of quantum gravity research, whether that be in string theory, loop quantum gravity, holographic duality and the AdS/CFT stuff, et cetera. The researchers themselves have admitted that they have run out of original ideas and have been reduced to playing around in low dimensional models and basic simulations of said models on quantum computers. These outrageously misleading publicity stunts like the wormhole stuff done recently are only going to cause the rest of the scientific community and the general public to wonder if this quantum gravity research is useless or even detrimental to science and society, and if their funding should be cut off. In addition, the current funding is also hugely based upon the heavy investment in quantum computing research at the moment; whenever that funding dries up I don’t see the quantum gravity research lasting for much longer.


      1. So General Relativity is “universally valid”? This is obviously an unsupported assumption.

        Any theory always has a limited complexity range of applicability. Precisely, MOND shows General Relativity’s limited range of applicability, and then MOND itself will also have a limited range of applicability.

        Naive reductionism is the underlying reason for many of these problems. P. Anderson’s shadow is everywhere.


        1. I never once said that general relativity/MOND were “universally valid”. I just don’t think that any evidence will pop up in the near future that would contradict general relativity/MOND and overturn the big bang theory, which means that in practice the mainstream astronomers and physicists wouldn’t really need to consider anything else other than general relativity/MOND. It’s not like the case with cold dark matter or the cosmological principle in Lambda CDM, where there are already fairly big tensions between theory and experiment, and alternatives are being openly discussed amongst mainstream astronomers and physicists.

          We could also talk about naive reductionism, which I agree to be wrong. Kant makes the point that the existence of spacetime itself is an unsupported assumption, and the Munchhausen trilemma in epistemology states that every field is ultimately reduced down to a collection of unsupported assumptions, so determining truth or falsehood is virtually impossible. What I personally believe is more important than the actual truth or falsehood of a theory is what the larger community believes to be true or false in this ongoing paradigm shift in cosmology. And I see it as likely that stuff like dark matter, dark energy, and the cosmological principle will go by the wayside, and that general relativity will possibly be replaced with some MOND-like gravity theory, but I don’t see any evidence that the Big Bang theory is under threat of being cast aside.


          1. Thanks for your answer. When I mentioned naive reductionism I was referring to it in the sense of P. Anderson’s More is Different: higher level structures will limit the applicability range of any theory. Galactic structures are a limit for general relativity, galaxy clusters possibly a limit for MOND, and so on.

            Even assuming the universal validity of your assumptions (which is never true), the applicability range of your predictions is limited by complexity. More is different.


      2. Here is an interesting paper I came across some years ago, pointing out that while single spectrum light will only redshift due to recession, multi spectrum “light packets” will redshift over distance, as the higher frequencies dissipate faster;

        Click to access 2008CChristov_WaveMotion_45_154_EvolutionWavePackets.pdf

        The problem this would pose is that it would mean we are sampling a wave front, not detecting individual photons that have traveled billions of lightyears. Which raises the question of whether quantification is actually fundamental to the light itself, or an artifact of absorption, detection and measurement.
        Here is a paper that was an entry in a FQXI contest and an interview with Carver Mead, both making that assessment;
        https://fqxi.org/community/forum/topic/1344
        http://worrydream.com/refs/Mead%20-%20American%20Spectator%20Interview.html


          1. I suppose it is possible, but far beyond my level of technical expertise.
            My only observation regarding the issue is whether we are looking at it backwards.
            Since there is this centripetal dynamic, going from the barest bending of light to the vortices at the centers of galaxies, that seems in inverse proportion to the apparent expansion, which does seem to be largely based on the properties of light naturally radiating out, perhaps whatever constitutes mass is simply an intermediate effect of this relationship between the structure/form pulling in and the energy radiating out.

            How much are our theories driven by a reductionist atomism that is simply a conceptual paradigm, a philosophy, rather than actually weighing all possibilities, sitting back and looking at the big picture rather than obsessing over the details?


      3. I posted three links in the original reply and it didn’t go through, so here it is again;
        Here is an interesting paper I came across some years ago, making the point that while single spectrum light will only redshift due to recession, multi spectrum “light packets” will redshift over distance, as the higher frequencies dissipate faster;

        Click to access 2008CChristov_WaveMotion_45_154_EvolutionWavePackets.pdf

        The larger problem this creates is that it means we are sampling a wave front, rather than detecting individual photons that have traveled billions of lightyears, so the quantification of light is more a function of its absorption, detection and measurement than fundamental to the light itself.
        The other two links were in support of that.


        1. This was an entry in a FQXI contest, arguing for a “loading” theory of light;
          https://fqxi.org/community/forum/topic/1344

          “After recognizing dubious assumptions regarding light detectors, a famous beam-split coincidence test of the photon model was performed with gamma-rays instead of visible light. A similar test was performed to split alpha-rays. Both tests are described in detail to justify conclusions. In both tests, coincidence rates greatly exceeded chance, leading to an unquantum effect. This is a strong experimental contradiction to quantum theory and photons. These new results are strong evidence of the long abandoned accumulation hypothesis, also known as the loading theory, and draw attention to assumptions applied to key past experiments that led to quantum mechanics. The history of the loading theory is outlined, including the loading theory of Planck’s second theory of 1911.”


        2. This is an interview with Carver Mead, from some 21 years ago;
          http://worrydream.com/refs/Mead%20-%20American%20Spectator%20Interview.html

          “Once upon a time, Caltech’s Richard Feynman, Nobel Laureate leader of the last great generation of physicists, threw down the gauntlet to anyone rash enough to doubt the fundamental weirdness, the quark-boson-muon-strewn amusement park landscape of late 20th-century quantum physics. “Things on a very small scale behave like nothing you have direct experience about. They do not behave like waves. They do not behave like particles …or like anything you have ever seen. Get used to it.”

          Carver Mead never has.

          As Gordon and Betty Moore Professor of Engineering and Applied Science at Caltech, Mead was Feynman’s student, colleague and collaborator, as well as Silicon Valley’s physicist in residence and leading intellectual. He picks up Feynman’s challenge in a new book, Collective Electrodynamics (MIT Press), declaring that a physics that does not make sense, that defies human intuition, is obscurantist: It balks thought and intellectual progress. It blocks the light of the age.”


      4. “Dark matter as a whole is likely dead due to the recent high redshift galaxies discoveries by the JWST…” How do high redshift galaxies discoveries by JWST falsify DM?


          1. I should add that, because we see large galaxies already formed so early, it seems to eliminate DM clouds as necessary seeds for structure growth, while enhancing the probability that a modified gravity approach (which speeds up structure growth) is closer to the truth.


            1. The rotational “rigidity” of galaxies (flatness of rotational speed) strongly points to a system property (emergent behavior).

              Stacy McGaugh’s published result showing a correlation between the radial acceleration traced by rotation curves and that predicted by the observed distribution of baryons reaffirms that idea, making dark matter totally superfluous.

              The dark matter thing is today just a scheme to keep the funding going, nothing more and nothing less.
              https://arxiv.org/abs/1609.05917


    1. I always compare this with the situation in biochemistry. There, too, all living organisms contain only L-amino acids, although D-amino acids would function in the same way. At the very beginning, L-amino acids won and now dominate biology.


      1. Yes, you can create hand-waving explanations for how that came about; your assumption would mean that if we could look at all the carbon-based forms of life in the universe, half of them would be based on L-amino acids and half on D-amino acids. There isn’t any corresponding reason for the distribution of galaxies to be chiral.


        1. We have
          1. matter : anti-matter = 1 : 0
          L-amino acid : D-amino acid = 1 : 0
          2. there is no reason to prefer matter or L-amino acids.
          The world would work just as well with anti-matter or D-amino acids.
          It could be that the same mechanism works on both.
          One can look for it…


  4. Here is how ChatGPT responded to the question, “Explain why dark matter probably does not exist:”

    “Dark matter is a hypothetical form of matter that is believed to account for a large portion of the mass in the universe. It is called “dark” because it does not interact with light or other forms of electromagnetic radiation, making it difficult to detect.

    “However, there is no direct evidence for the existence of dark matter. Scientists have only inferred its existence based on the gravitational effects it would have on visible matter. There are also alternative explanations for these gravitational effects, such as modifications to the laws of gravity.

    “Additionally, recent observations from the Large Hadron Collider and other experiments have failed to detect any signs of dark matter particles, further casting doubt on its existence.

    “Therefore, it is likely that dark matter does not exist, and that our current understanding of the universe and its constituents needs to be revised.”

    Perhaps by asking ChatGPT a sequence of such questions, one could produce a decent review article.

    I promise never to do this again.


  5. could clusters of galaxies be explained by a length-based modified gravity?

    also

    Some dwarf galaxies are within the Mondian regime at all radii, i.e., the gravitational acceleration provided by the observed baryons is always below the threshold of g† ≃ 1.2×10^-10 m s^-2. Here we show that under MOND gravity, the existence of this core implies a baryon content whose density gbar must decrease toward the center of the gravitational potential (gbar → 0 when r → 0). Such drop of baryons toward the central region is neither observed nor appears in numerical simulations of galaxy formation following MOND gravity. We analyze the problem posed for MOND as well as possible workarounds.

    Cite as: arXiv:2209.12547


    1. To explain clusters, one could imagine modifying MOND. Zhao & Famaey have tried with eMOND, which depends on potential well depth as well as acceleration. Whatever one does has to start from looking like MOND, and can’t break that as most length-scale dependencies that I can imagine would do.


  6. The tree is missing some branches, not necessarily superior to the others, but not to be ignored entirely.

    1. Underestimated GR effects.

    As you well know, most astronomy modeling at galaxy scale and above is done with Newtonian gravity ignoring GR effects. Some of those effects are undoubtedly small and proper to ignore. But some may be ignored due to the ways that GR effects are approximated or modeled. There are quite a few papers pursuing this possibility. https://dispatchesfromturtleisland.blogspot.com/2022/07/another-effort-to-explain-dark-matter.html See, e.g., Deur’s very good CMB fit without DM and, he claims, without gravity modification. A. Deur, “Effect of the field self-interaction of General Relativity on the Cosmic Microwave Background Anisotropies” arXiv:2203.02350 (March 4, 2022).

    2. Quantum gravity effects. This is a close sibling of underestimated GR effects.

    The differences from GR may be subtle, but they might add up in large systems rather than purely cancelling out. Like underestimated GR effects, however, this isn’t really “modified gravity.”

    3. Modeling errors and systematic errors.

    These have been important in lots of past anomalies, from the superluminal neutrinos, to the CDF W boson mass anomaly, to the muonic hydrogen radius issue, and quite possibly to the muon g-2 anomaly where the data fits one SM calculation but not another, and the cosmic ray muon excess anomaly. See The Pierre Auger Collaboration, “Measurement of the fluctuations in the number of muons in extensive air showers with the Pierre Auger Observatory” Accepted for publication in PRL arXiv:2102.07797 [hep-ex] (February 15, 2021). The reactor neutrino anomaly is yet another. F.P. An, et al., “Evolution of the Reactor Antineutrino Flux and Spectrum at Daya Bay” (April 4, 2017).

    This could be a source of at least some of MOND’s cluster problems. See M. Lopez-Corredoira, et al., “Virial theorem in clusters of galaxies with MOND” arXiv:2210.13961 (October 25, 2022) (accepted for publication in MNRAS) (arguing that MOND performs better when the galaxy clusters are more realistically modeled).

    This is much more of an important possibility with respect to dark energy where the data supporting the current predictions is thinner and less diverse (and hence less robust) – something the Hubble tension is casting a spotlight upon. See, e.g., Ritesh Singh, “Evidence for possible systematic underestimation of uncertainties in extragalactic distances and its cosmological implications” arXiv:2111.07872 (November 15, 2021) (published in 366 Astrophys Space Sci 99 (2021) DOI: 10.1007/s10509-021-04006-5); S.L.Parnovsky “Bias of the Hubble constant value caused by errors in galactic distance indicators” arXiv:2109.09645 (September 20, 2021) (Accepted for publication at Ukr. J. Phys); Young-Wook Lee, et al., “Discovery of strong progenitor age dependence of type Ia supernova luminosity standardization process and discordance in cosmology” arXiv:2017.06288 (July 13, 2021) (submitted to Apj); Roya Mohayaee, Mohamed Rameez, Subir Sarkar, “Do supernovae indicate an accelerating universe?” arXiv:2106.03119 (June 6, 2021); J.T. Nielsen, A. Guffanti an S. Sarkar, “Marginal evidence for cosmic acceleration from Type Ia supernovae” 6 Scientific Reports 35596 (October 21, 2016) (open access).

    Also, quite relevant to ruling out the last corner of baryonic dark matter theories that people may have missed is Amir Siraj, Abraham Loeb, “Eliminating the Remaining Window for Primordial Black Holes as Dark Matter from the Dynamics of the Cold Kuiper Belt” arXiv (March 8, 2021).


    1. Let me reply in reverse order:
      3. Systematic errors are the bane of astronomy, and happen all too often. One has to get it right not to be misled. That is a category error of a different type than depicted in my diagram, so has no place there.
      2. Sure – some of these effects might be caused/explained by a quantum theory of gravity. It would be great if someone were to develop a theory that unifies quantum and GR effects while also producing the observed MONDian effects.
      1. No. I will not go into this in great detail here, but “underestimated GR effects” gratuitously fail to explain the observations that we attribute to dark matter. The simple reason is that relativistic corrections that’d be relevant here scale as (v/c)^2. A big galaxy might have v=300 km/s, but c=300,000 km/s so these are a one in a million effect. The acceleration discrepancy is orders of magnitude larger than that, and gets bigger in lower surface brightness galaxies, which also tend to have lower v, so there is even less of a relativistic effect.
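      A quick numerical check of that scaling, using the numbers quoted above (just arithmetic, nothing more):

```python
# Back-of-the-envelope check of the (v/c)^2 scaling quoted above.
v = 300.0      # km/s, rotation speed of a big galaxy (from the text)
c = 300000.0   # km/s
print(f"(v/c)^2 = {(v / c)**2:.0e}")   # ~1e-6
# By contrast, the acceleration discrepancy in galaxies is a factor of
# a few to ~10 or more, vastly larger than a one-in-a-million correction.
```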


      1. I don’t have the expertise to say, but there is a significant published literature making the claim. G. O. Ludwig, “Galactic rotation curve and dark matter according to gravitomagnetism” 81 The European Physical Journal C 186 (February 23, 2021) (open access); F.I. Cooperstock, S. Tieu, “Galactic dynamics via general relativity: a compilation and new developments.” 22 Int. J. Mod. Phys. A 2293–2325 (2007). arXiv:astro-ph/0610370; H. Balasin, D. Grumiller, “Non-Newtonian behavior in weak field general relativity for extended rotating sources.” 17 Int. J. Mod. Phys. D 475–488 (2008); M. Crosta, M. Giammaria, M.G. Lattanzi, E. Poggio, “On testing CDM and geometry-driven Milky Way rotation curve models with Gaia DR2.” 496 Mon. Not. R. Astron. Soc. 2107–2122 (2020); W.M. Stuckey, Timothy McDevitt, A.K. Sten, Michael Silberstein, “The Missing Mass Problem as a Manifestation of GR Contextuality” 27(14) International Journal of Modern Physics D 1847018 (2018). DOI: 10.1142/S0218271818470181; Federico Re, “Fake dark matter from retarded distortions” (May 30, 2020); Leonardo Modesto, Tian Zhou, Qiang Li, “Geometric origin of the galaxies’ dark side” arXiv:2112.04116 (December 8, 2021); Felipe J. Llanes-Estrada, “Elongated Gravity Sources as an Analytical Limit for Flat Galaxy Rotation Curves” 7(9) Universe 346 arXiv:2109.08505 (September 16, 2021) DOI: 10.3390/universe7090346; P. Tremblin, et al., “Non-ideal self-gravity and cosmology: the importance of correlations in the dynamics of the large-scale structures of the Universe” arXiv:2109.09087 (September 19, 2021) (submitted to A&A, original version submitted in 2019); Yogendra Srivastava, Giorgio Immirzi, John Swain, Orland Panella, Simone Pacetti, “General Relativity versus Dark Matter for rotating galaxies” arXiv:2207.04279 (July 9, 2022); Ali Kazemi, Mahmood Roshan, Elham Nazari “Post-Newtonian corrections to Toomre’s criterion” (August 17, 2018) (accepted in ApJ); Alexandre Deur, “Relativistic corrections to the rotation curves of disk galaxies” (April 10, 2020) (lated updated February 8, 2021 in version accepted for publication in Eur. Phys. Jour. C) (with classical GR); A. Deur, “Implications of Graviton-Graviton Interaction to Dark Matter” (May 6, 2009) (published at 676 Phys. Lett. B 21 (2009) (with quantum gravity inspirations).

        Some are claiming GEM effects which I don’t think get you there, but some are not and are looking at effects not accounted for in post-Newtonian approximations.

        See also criticizing S. Deser’s downplaying of the role of gravitational self-interactions: A.I. Nikishov of the P.N. Lebedev Physical Institute in Moscow states in an updated July 23, 2013 version of an October 13, 2003 preprint (arXiv:gr-qc/0310072); A.L. Koshkarov “On General Relativity extension.” (arXiv:gr-qc/0411073) (November 4, 2004); Alexander Balakin, Diego Pavon, Dominik J. Schwarz, and Winfried Zimdahl,”Curvature force and dark energy” New.J.Phys.5:85 arXiv:astro-ph0302150 (2003); Hong Sheng Zho, arXiv:0805.404 (2008); K. Kleidis and N.K. Spyrou, “A conventional approach to the dark-energy concept” (arXiv: 1104.0442 [gr-qc] (April 4, 2011).

        Deur argues that gravitational self-interactions become material in GR at GM/size(system) (with units of length) which is approximately equal to 10^-3, which is about the value of binary neutron stars and galaxies, but far greater than, for example, wide binary star systems, the solar system, or gravitational interactions between components of an atom.


      2. The point is that I don’t think that this literature is any more speculative or off the mark than lots of other concepts in the tree (e.g. Verlinde’s Emergent Gravity). Most of the branches are going to be wrong, but if there is a serious published scientific effort to explain a phenomenon it seems like it ought to be a branch on the tree.


      3. Regarding ohWilleke’s point 2 and Stacy’s reply: I understand that for Deur’s self-interacting gravity approach, the decision of whether it matches MOND is hard. A self-interacting force is a tough subject for checking the calculations, let alone doing them (Deur’s contributions are indeed very valuable IMO). The force carriers are attracted by each other, while attracting mass as well, moving around and what not.

        Moreover, it’s hard to predict the effect of every different mass distribution in a system: spherically symmetric seems to be different from disc-like and from two mass points and so on. Even if you blindly trust Deur’s calculations, it’s quite difficult to judge and compare with reality!

        But promising, sure 🙂

        The situation of two significant isolated point masses seems the strangest to me, force regardless of distance? However, IMO the string theory explanation for black holes (that they might be in fact fuzzballs) reduces that problem, since isolated pairs of point masses do not really exist then. Perhaps this will sometime be measurable in gravitational wave patterns?


  7. “… acceleration discrepancy. That’s what we measure, after all.”
    Not quite, we measure a velocity. We don’t have enough time to see a change of velocity. `Acceleration’ is also an inference, and a sloppy one because it comes from only one measured component of the velocity.

    There seems to be more evidence for a `velocity discrepancy’: asymptotic V flatness, Tully-Fisher M vs V, V^2 virialized clusters of galaxies, Renzo’s light and V, central density relation dV/dR, and even the ‘recession-Velocity-measured H_0 tension’ on the cosmology side…

    The `acceleration discrepancy’-language used by MOND is halfway between `mass discrepancy’ and `velocity discrepancy’ – that may be why it works better than Dark Matter.


    1. ‘Acceleration discrepancy’ was Bekenstein’s term, and comes closer to the mark than either ‘mass discrepancy’ or ‘dark matter problem.’ If you want to be super-pedantic about it, we don’t even measure velocity. We measure shifts in the wavelength of known spectral lines that we interpret to be velocities via the Doppler effect. The centripetal acceleration required to keep stars on their orbits in galaxies is V^2/R, and we measure R just as we measure V, so I do not accept that this is any less of a measured quantity. Moreover, the discrepancy consistently appears at a particular acceleration scale: 1E-10 m/s/s. Well above this scale, no discrepancies are observed. Below it, they are. This happens irrespective of any other scale, be it velocity or length or frequency or what have you.
      The acceleration a is related to surface density S by a = k*G*S where G is Newton’s constant and k is a geometrical constant (k=1 for spheres so a = G*S = GM/R^2). I was studying low surface brightness galaxies, so I first recognized empirically that something important was going on with surface density, and initially interpreted this to mean that lower surface brightness galaxies were more dark matter dominated for reasons unknown. It wasn’t until later that I realized that Milgrom had predicted exactly the behavior I was observing because acceleration and surface density are tied together.
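      A back-of-the-envelope version of that surface-density connection, with illustrative numbers (the value of a0 and the example surface density below are assumptions for illustration):

```python
# Back-of-the-envelope version of the surface-density connection above:
# a ~ G*S, so the acceleration scale a0 corresponds to a characteristic
# surface density S0 ~ a0/G. All numbers here are illustrative assumptions.
G = 6.674e-11     # m^3 kg^-1 s^-2
a0 = 1.2e-10      # m/s^2
M_sun = 1.989e30  # kg
pc = 3.086e16     # m

S0 = a0 / G                                           # kg/m^2
print(f"S0 ~ {S0 / (M_sun / pc**2):.0f} M_sun/pc^2")  # characteristic surface density

# A hypothetical low surface brightness disk sits far below S0, so it is in
# the low-acceleration (discrepant) regime everywhere:
S_lsb = 10 * M_sun / pc**2
print(f"a ~ G*S = {G * S_lsb:.1e} m/s^2  (a0 = {a0:.1e} m/s^2)")
```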


      1. I like such explanations.
        They are short and concise. A 12 year old or at least a high school student can follow these explanations. You will attract the good and truthful ones to your side.
        In particle physics Michael Kobel is capable of such short and concise explanations.
        I like such explanations very much.


        1. I agree. Though late to the party (going on 78), and having only become aware of and interested in MOND and quantum “stuff” in just the past 5 yrs, Stacy’s reasoning & explanations are compelling (though challenging for one who has just a BS in microbiology with basic physics & math).


      2. “The acceleration a is related to surface density S by a = k*G*S where G is Newton’s constant and k is a geometrical constant (k=1 for spheres so a = G*S = GM/R^2).”

        But the geometry of a disk galaxy is not spherical except near the core so k can’t be a constant; the further out the disk you measure, the more any gravitationally based relationship should approach 1/R, shouldn’t it? Which is what MOND accomplishes mathematically while, it seems to me, obscuring the relationship of the observations to the physical geometry of the system.


  8. A bit off-topic, but you may find it interesting that there is a video on Youtube with over 1 million views called “How James Webb Broke Cosmology In Just 2 Months”, which looks very interesting, mentions MOND in a favorable way (“new data seem to contradict the Standard Model but fit MOND”), and overall looks balanced and professional.

    The title is a bit click-baity, but otherwise this video seems to be a rarity in this age of Big Hype.


    1. Thanks.
      Though they don’t seem too close to giving up on it yet.
      That’s a mighty big shoehorn.
      While I get dismissed for pointing this out whenever possible, the fact remains they still use lightspeed as the denominator, while insisting it isn’t.
      Presumably this is not an expansion in space, given redshift increases proportional to distance in all directions. Which would mean that either we are at the exact center of the universe, or redshift is an optical effect. “Tired light.”
      So the argument is that space itself expands, based on General Relativity. That somehow “space” expands, meaning light takes longer to cross it.
      Which should raise two very basic questions: 1) What is lightspeed measuring, if not intergalactic space? It is the most basic cosmic ruler we have, but apparently when it comes to intergalactic space, it’s just an inchworm crawling on the balloon.
      2) Isn’t the central premise of GR and spacetime, that the speed of light is a Constant in any frame? So if the frame/intergalactic space, is expanding, wouldn’t the speed of light have to increase, in order to remain CONSTANT???
      Instead we have two distinct metrics being derived from the same light. One based on the speed and the other based on the spectrum. If the speed were the numerator, it would be a “tired light” theory, but as an “expanding space” theory, lightspeed is still the denominator. The metric against which this expansion is being calibrated.
      Can anyone point out where the expansion isn’t measured in terms of the speed of light?
      Obviously I don’t seem to be able to get too many other people to see the basic logic, but at least I don’t need a 10 billion dollar telescope to know the BBT is nonsense.


      1. “Which would mean that either we are at the exact center of the universe, or ”
        This is wrong.
        And if I should take you seriously, you should be able to justify why….
        And if Madeleine is supposed to take you seriously, you should at least be able to provide one reason why “tired light” is no longer taken seriously today…


        1. “This is wrong.”

          Can you otherwise explain why basic doppler shift was deemed insufficient? That cosmic redshift is a function of simple increasing distance, in otherwise stable space?
          (FYI, when the train moves down the tracks, it doesn’t stretch the tracks. This is a stable metric, in which the doppler effect is caused by the increasing amount of this yardstick.)
          Why did it become argued that space itself is expanding?
          Redshift does increase proportional to distance in all directions, and if you can’t figure out that this would mean, if the normal doppler effect were the reason, we would have to be at the center of this expansion, I’m not sure how to explain it to you.
          (Safe to say, in my own teenage days, back in the 70’s, this was the reason given, not something I figured out, but having followed the subject for decades, it does seem the original descriptions of the issue became fudged over. The basis for “Dark Energy” is another issue where the original, rather clear explanation has become increasingly distorted by some media game of Chinese whispers. So I assume you are quite a bit younger.)
          Einstein said, “Space is what you measure with a ruler.” What is the “ruler” in this theory, if not the speed of light? Yet presumably “space” expands, but the speed of light remains stable.
          Isn’t the basis for spacetime the fact the speed of light remains Constant in all frames?

          “Tired light” was dismissed because the assumption required some medium to slow it and there were no other distortions.


  9. The difficulty is the assumption of a “very-nearly homogeneous” early universe. Why is there something rather than nothing? It might be resolved with nothing rather than something. I know that idea goes against common understanding–the assumption: something. Energy divided by the speed of light equals mass times the speed of light. Light relative to itself is everywhere at once. Space doesn’t exist. Time doesn’t exist. Work is an illusion of quantum observation. I’m speaking third person objectively of course. Infinity divided by zero equals 1. I’m hoping somebody gets it and can plug it into a theory. It seems to be beyond my ken, except I have noticed that as the universe is getting bigger The Observer is getting smaller. Perhaps the universe isn’t expanding. Perhaps there was no Big Bang. Perhaps the black hole is in stasis and the Creator is disappearing into the Blue. I wonder about these things.

    Like

    1. The thing about the “nothing” of space is that 3 dimensions are really just a mapping device, like longitude, latitude and altitude. The two qualities this “nothing” has are infinity and equilibrium. Which is implicit in the fact that the frame with the longest ruler and fastest clock is closest to the equilibrium of the vacuum. The unmoving void of absolute zero. So space is like the number line, from zero to infinity.
      As I’ve observed about time, as mobile organisms, we have this sentient interface between body and situation that functions as a sequence of perceptions, in order to navigate, so our concept of time is as the point of the present, moving past to future. Though the evident reality is change turns future to past. Tomorrow becomes yesterday, because the earth turns.
      There is no physical dimension of time, because the past is consumed by the present, to inform and drive it. Causality and conservation of energy. Cause becomes effect.
      So “energy” is conserved, because it manifests this presence, creating time, temperature, pressure, color and sound. Time is frequency, events are amplitude.
      Energy, as present, goes past to future, because the patterns it generates coalesce and dissolve, future to past. Potential>actual>residual. Energy drives the wave, the fluctuations rise and fall.
      The physical path of least resistance for the energy is to expand toward infinity, while the path of least resistance, being simply this pattern generation, is to collapse toward equilibrium.
      So we have galaxies. Between black holes and black body radiation.
      Now what this energy is, is another question, but it projects the opposite direction from the information it generates. It should also be considered that consciousness goes past to future, while the perceptions, emotions and thoughts giving it form and structure go future to past. Though it’s the gut processing the energy, while the head sorts the information.

      Like

      1. Hmm. Not sure your reply attacked the problem directly. The three dimensional world is you, me and them. First, second, and third persons. If everything is third person, of which you and I are a subset, we miss the boat. Observation is always first person. Everything we see is first person. Height width depth duration carry the future to past as everyone knows who has a memory. All rules of physics work backwards and forwards. The question is a matter of time. If time is set out as a third person equation it leads to a conundrum which cannot be resolved. I’m still working on the conundrum.

        Like

        1. You, me and them are points on a graph. The problem we seem to have is seeing beyond that surface level.
          Is time really the dimension of events, where the “now” is as subjective as a point in space and all events physically exist out on the timeline and with the right application of mathematical faerie dust, we can time travel through wormholes in the fabric of “spacetime?”
          Or is there a physical dynamic, manifesting this physical presence, creating and dissolving these events? Cause becoming effect.
          The real reason different clocks can run at different rates is simply because they are separate actions. Think metabolism. Culture is about synchronizing society into one larger organism, based on the same languages, rules and measures, so the idea of a universal flow of time is natural, but nature is so diverse and yet integrated, because everything doesn’t march to the beat of the same drummer. Multicultural, rather than monoculture. The field harmonizes, while the entities synchronize.
          Time is asymmetric, because it is a measure of action, not just a graphing line and action is inertial. The earth only turns one direction.
          That different events will appear in different order from different locations is no more consequential than seeing the moon as it was a moment ago, simultaneous with seeing stars as they were years ago. It’s the energy that’s conserved, not the information. That the information changes is time.
          Math is abstraction, meaning it’s abstracted from the deeper reality, not the basis for it. Signals in the noise. Map, not territory. If you tried to include all the information from the territory into the map, it would revert back to noise. So in order to be useful, maps are edits. If you don’t take that into account, you end up with a very distorted view of the reality.

          Like

        2. Heat introduces empirical directionality. So not all of physics is time symmetric. Time symmetry is an idealization of “identity in time.” This is problematic since “identity” as it pertains to the truth of statements is static (as “truth” is commonly perceived to be “eternal”).

          Emile Meyerson’s book “Identity and Reality” had been written just at the time relativity appeared and before quantum strangeness had impinged too strongly. It contains a comprehensive survey of the historiography of causal reasoning. It is from that work that I learned the significance of Carnot’s contribution.

          This is one reason we speak of “heat death” for the universe (Rovelli, in particular?)

          Like

          1. Heat, as energy, does disperse. The end result would seem to be black body radiation.
            Yet, if the universe is infinite, heat lost from one area is always replaced by heat radiating in from surrounding areas.
            Then there seems to be this opposite effect, where form/information creates a centripetal reaction/blowback. That spiraling into the center of what are galaxies. While there are black holes at the center, it would seem anything that actually falls into them gets shot out the poles as quasars.
            Consider that lasers are synchronized lightwaves and quasars are really big lasers. It seems to me something really enormous is being overlooked here.
            For one thing, gravity.
            What do we really know about gravity? All that can really be said is that it’s a centripetal effect. One which goes from the barest bending of light across intergalactic space, to the vortices/black holes at the center of galaxies. Yes, personal experience and Newton say it’s a property of matter. Apples fall down.
            Lots of effort is being expended to find the missing mass, but it would seem equally possible that it’s the other way around. This stable, reasonably dense state is an intermediate stage and feedback loop of that centripetal effect interacting with the opposing heat/expansion.
            So is the basic dynamic of synchronization something that occurs as a fundamental wave behavior? That once form is generated, it tends to coalesce as a path of least resistance. One big wave is more efficient than many small ones.
            Entropy of form falling in, as entropy of energy is radiating out.
            Synchronization versus harmonization.

            Like

            1. I apologize brodix. I guess I had been responding to Carpenter’s statement about symmetry –not your statements.

              Importantly, though, Madeline Birchfield’s mention of both Kant and the trilemma applies to causal counterfactual reasoning. So, positing a steady state for an unwitnessably infinite universe is dubious.

              One place infinity enters physics is through the law of inertia. Perpetual motion machines may be impossible, but inertia classifies stateful behavior in perpetuity. By Einstein’s own account, general relativity is not a geometrization of physics. He viewed it as a unification of gravity with inertia. Others study it in terms of geometry.

              Inertia may be compared to the use of an infinite tape by Turing or the implicit assumption of an infinity to describe winning strategies in game theory. It is what we use to assert that something can always be assumed. So, it relates to our sense of time.

              The conception of time as an unwitnessable algebraic dimension then becomes suspect. However, the utility it provides for calculation and apparent prediction places intense constraints upon any mere philosophical objection to its use.

              If you wish to consider time from subjective witnessability, I would suggest that you look at the work of Kari and Kulic on Wang tiles and the extension to the cubic honeycomb in three dimensions.

              These are aperiodic labelings of the respective honeycombs with arithmetical labels. At the very least, our three dimensional experience of a lifetime is aperiodic. So, their work meets an apparent first criterion. In addition, the tilings are numerologically related to things like Kuratowski’s 14 set problem and the uniqueness of the 21-point projective plane related to certain symmetries quantum gravity researchers had found interesting.

              Too much math belief.

              Science is caught in a dilemma represented in free logics. To speak speculatively, one must be able to speak of objects which do not exist. This corresponds with positive free logic. To deny fairy tales and speak only of existing things, one needs the principle of indiscernibility of nonexistents from negative free logic. And, to speak hypothetically at all, one cannot rely on the philosophies of mathematics which arose because of atheist philosophers who tried to use mathematics and science as truths justifying their beliefs. Those philosophies insist that every logic have a semantic interpretation.

              One issue with counterfactuality is that it has been subsumed under modality. The symmetry associated with negation carries an infinite regress. Quantum logic, anyone?

              Modality, in general, relates necessity to continuity. It should not be surprising that physics has discovered its relation to a statistical ground. Homeomorphism is the stronger invariant.

              Science is hard. And, it is made harder when the difficult work of mathematicians in the field of logic is ignored by people confusing belief with truth.

              Like

              1. Thanks for the reply. I will state as clearly as possible that I have little physics, or math background. I started studying physics as a way to make sense of basic psychology, sociology and history, because it seemed like there are a lot of unrecognized patterns and processes going on, where people would instead wander off into the weeds of infinite detail, then try to understand the dynamic. Then I find the same situation in the sciences and math, in spades.
                Complexity is itself emergent, yet reverse engineering it doesn’t work, because most information is lost in the process. Like trying to reconstruct a forest out of the ashes, after the fire goes through. So my sense isn’t to look into the details, but to understand the dynamics. The multitudes of cycles building up and breaking down.

                It would seem to me infinity is inherently not so much stable, since that implies some physical construct, as a state of general equilibrium. The flatline between the ups and downs. How big is Mount Everest, relative to the distance to Alpha Centauri? Any definition is, by definition, finite. So it’s a matter of scale.

                What is an “object?” I’ve always had a very contextual point of view and to me any object, physical, speculative, metaphorical, social, etc. exists as a node in a network. There are positive feedback loops building them up and negative feedback loops tearing them down. Like a wave, rising and falling, no matter how complex the details.

                It does seem that many of the objects in our current situation are increasingly speculative, such as various cryptocurrencies, political ideologies, financial balance sheets, etc. Basically foam and bubbles, as the larger waves crest. The information age has turbocharged the process.
                People seem to be most intoxicated at the crest of the wave, so our speculative impulses tend to be strong. On a personal note, I grew up in the horse racing business, so I’ve seen it on an industrial scale.
                It does seem the mother of all reality checks is in the mail. Belief is about to meet truth.

                Like

  10. Modern cosmology is awash in discrepancies, not just those discussed here. The reason for this is straightforward; the standard model of cosmology is itself discrepant with empirical reality.

    Empirical reality does not contain a Big Bang event with its inexplicable original condition, an inflation event, a homogeneous and isotropic mass distribution, expanding or curving spacetime, dark matter, or dark energy. Those things only exist in the model, which either assumes their existence, or requires their existence in order to reconcile the model’s discrepancies with physical reality.

    The standard model assumes that the Cosmos can be accurately modeled as an expanding, homogeneous and isotropic gas bag by solving the field equations of General Relativity for a “universal”, classical (non-relativistic) metric. The resulting FLRW equations are the fundamental mathematical basis for a belief in the “expanding universe” model that purports to have knowledge of a simultaneously existing “universe”.

    The simple fact that the speed of light has a finite maximum of about 3×10^8 m/s argues against the simplistic belief that such a vast simultaneous “universal” entity can possibly exist in physical reality, not to mention the fact that such an entity could not possibly be observed, measured, or detected from any three-dimensionally located position in the observed Cosmos. The idea that the Cosmos is a Universe is therefore discrepant with established physics.

    The fever tree of discrepancies that the standard model has produced over the years and continues to churn out regularly should have disqualified it as a scientific model years ago. Unfortunately modern cosmology has devolved into a cult of belief. As it exists now, wallowing in circular logic and grandiose self-delusion, cosmology is not a science.

    Liked by 1 person

    1. Budrap,

      I think you raise the core misconception. The Universe. It goes to some of our most ingrained assumptions about reality. First that it is an entity, because the object is primary, from atoms, to individuals, to monotheism. What they do is secondary.
      Now it’s strings and the progress has stopped, because they can’t extend it beyond that, to some even smaller form of particle. Though even the idea of the singular universe has broken into multiverses.
      It isn’t just blindness, but the nature of culture, which goes from language to religion. Once minds have absorbed the basics, the details follow, like simple programs that lead to complex results. We might sense something is missed, but as social creatures, as soon as we move away from the direction of the group, the life, the language, the possibilities become more constrained. When we look at history, a few decades seem inconsequential, but to an individual life, those first thirty years have pretty much defined who we have become. If we are part of the crowd, we have to adhere to its models, but if we go our own way, there isn’t the feedback and support. Either we live in our own bubble, or we come to terms with the ways of nature and learn them the best as we can.
      We are the energy flowing into our frame and the world sees us as the energy flowing out.
      Nodes in networks. Synchronization pulling in, as harmonization levels out.

      Like

        1. That has been widely reported, but the question stands: Is there any fundamental difference in the observed objects with high red shifts compared with the objects with low red shifts?

          If there is no essential difference then the Big Bang idea is obviously wrong and everything in between.

          Like

          1. > If there is no essential difference then the Big Bang idea is obviously wrong and everything in between.

            The issue is with galaxy formation and dark matter, not the big bang theory as a whole. Everything is done assuming cold dark matter in their models of galaxy formation; if those are incompatible with the existing data, then those models would have to be redone with cold dark matter replaced with something else. In particular, back in the 1990s Bob Sanders showed that having these massive galaxies this early (300 million years) is compatible with MOND theories. And MOND is compatible with the big bang.

            https://arxiv.org/abs/astro-ph/9710335

            Other parts of the Lambda CDM model have their issues as well, such as the FLRW metric and the assumption of the cosmological principle, due to other tensions (such as the Hubble tension, S8 tension, various cosmological dipoles, the KBC void, et cetera), but unless astronomers start detecting objects at around redshift 100 or so, we are far away from disproving the big bang theory as a whole.
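
            To put a number on how early that is, here is a minimal sketch (my own addition, assuming the astropy library and its built-in Planck 2018 parameters): the age of the universe at the redshifts under discussion is only a few hundred million years, and at redshift 100 only tens of millions.

            # Sketch: cosmic age at high redshift for standard parameters (astropy assumed installed).
            from astropy.cosmology import Planck18

            for z in (7, 10, 17, 100):
                age_myr = Planck18.age(z).to('Myr').value
                print(f"z = {z:3d}: age of universe ~ {age_myr:.0f} Myr")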

            Like

            1. Thanks for your answer. So the observed red shifts are not high enough to start questioning the Big Bang proper; then we need better telescopes pronto.

              Somehow I got the feeling that we’ll see almost the same at any red shifts.

              Like

            2. @Madeleine
              Nice comment. Very nice comment.
              Somewhere on the internet you wrote that already the water molecule is quantum mechanically unsolvable, because it is a 3-body problem. I found that very convincing. (As opposed to, “In principle, we can calculate everything from the Schrödinger equation.” sounds like a reductionist creed and not very convincing).

              Like

              1. > Somewhere on the internet you wrote that already the water molecule is quantum mechanically unsolvable, because it is a 3-body problem. I found that very convincing. (As opposed to, “In principle, we can calculate everything from the Schrödinger equation.” sounds like a reductionist creed and not very convincing).

                I don’t think I wrote that comment anywhere on the internet. It might have been somebody else. But in general I agree with the sentiment.

                Like

              2. Ah yes, that was me. I was not aware that my comment on Disqus ended up being published in Quanta Magazine until today.

                Like

            3. Madeleine,
              Since you seem literate, objective, and still assume the Big Bang Theory to be viable, could you answer the one big issue I have with it? Since I posed this problem back on the 8th, I’ll cut and paste it here;

              “While I get dismissed for pointing this out whenever possible, the fact remains they still use lightspeed as the denominator, while insisting it isn’t.
              Presumably this is not an expansion in space, given redshift increases proportional to distance in all directions. Which would mean that either we are at the exact center of the universe, or redshift is an optical effect. “Tired light.”
              So the argument is that space itself expands, based on General Relativity. That somehow “space” expands, meaning light takes longer to cross it.
              Which should raise two very basic questions; 1)What is lightspeed measuring, if not intergalactic space? It is the most basic cosmic ruler we have, but apparently when it comes to intergalactic space, it’s just an inchworm crawling on the balloon.
              2) Isn’t the central premise of GR and spacetime, that the speed of light is a Constant in any frame? So if the frame/intergalactic space, is expanding, wouldn’t the speed of light have to increase, in order to remain CONSTANT???
              Instead we have two distinct metrics being derived from the same light. One based on the speed and the other based on the spectrum. If the speed were the numerator, it would be a “tired light” theory, but as an “expanding space” theory, lightspeed is still the denominator. The metric against which this expansion is being calibrated.
              Can anyone point out where the expansion isn’t measured in terms of the speed of light?”

              The most common response I get is to go read the textbooks, which I have and they blur over this point. It’s like we have this great theory, but it only works if 1+1=5, so let’s just assume 1+1=5 and not worry about it.
              When the evidence is forced to fit the model, rather than the model being a useful approximation of the evidence, the tool has become the god. It is no longer science, but ideology, with the model as the idol. Be it monotheism, capitalism or BBT.
              Given all the various patches beside this one, from Inflation to Dark Energy, it would seem pretty evident this is the case, yet the human predilection for running with the crowd is again overwhelming any logical faculties.
              I once had a Catholic priest and future in-law cross himself and walk away, after a debate over the nature of the trinity, where my observation was it seemed to be an analogy for the absolute, the actual and the infinite, with the actual a feedback loop between absolute and infinite. This is the sense I get when trying to raise this topic. People just seem to cross themselves and walk away.

              Like

              1. brodix,

                The block universe has no preferred reference frame. Measurement from within a reference frame permits one to use lightyears as a fixed length. The change in wavelength is attributed to the “why” of how patterns of spectral lines from light sources are shifted with respect to different distances within a reference frame. As there is no reason to postulate this to be different in any other reference frame, it is assumed to be a phenomenon common to all reference frames.
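
                As I understand the standard bookkeeping (just the textbook relation, not my own result), c is held fixed in every frame and the stretching is carried entirely by the scale factor a(t):

                ds^2 = -c^2 \, dt^2 + a(t)^2 \, d\chi^2 , \qquad 1 + z = \lambda_{\rm obs} / \lambda_{\rm emit} = a(t_{\rm obs}) / a(t_{\rm emit})

                so the measured shift compares the scale factor at reception and emission rather than a speed through space.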

                I am not a physicist, so this may be erroneous in some way. But, your question was not being answered.

                Liked by 1 person

            4. “…we are far away from disproving the big bang theory as a whole.”

              It is not the job of science to disprove metaphysical conjectures like the big bang model. The big bang event is a creation myth; it is, in terms of the model, unobservable, undetectable and in terms of known physics, has an inexplicable original condition.

              The big bang event is a direct consequence of the expanding universe model which rests on two assumptions: 1) that the Cosmos is a unified, coherent, and simultaneously existing entity, and 2) that the cosmological redshift is some kind of Doppler shift indicative of a recessional velocity. Neither assumption is supported by any direct empirical evidence, just a long series of circular arguments resting on the truism that the model can be retro-fitted to any novel (and unpredicted) observations.

              The expanding universe model is a metaphysical conjecture in the sense that none of its attendant assertions, principles and auxiliary hypotheses are part of empirically observed reality. You seem to see clearly that the resulting LCDM model is unsustainable, so I don’t understand why you think the foundational expanding universe paradigm is nonetheless valid, especially since it leaves you saddled with the absurd big bang creation myth.

              Like

              1. The obvious contradiction that I see is that general relativity already fails to model galaxies’ rotational speeds, hence the fictional dark matter was introduced; so using GR beyond that scale is obviously wrong, even more so at “Universe”-scale complexity.

                All theories have a limited complexity range of applicability, ignoring that is naive reductionism. Like the people claiming that the Schrodinger equation can describe “in principle” a living being.

                Like

          2. I like Madeleine’s reply, but let me add that galaxies at high redshift do look younger than comparable galaxies nearby. Nearby galaxies similar to the Milky Way are composed of a mix of stars of all ages, from just formed to ancient stars about 13 Gyr old. The spectra of high-z galaxies look like young stars, with ages consistent with having formed within the first few hundred years of the Big Bang. This is surprising in terms of the LCDM structure formation paradigm – it was expected to take longer – but in no way contradicts the Big Bang. Indeed, that all the ages are consistent, and that we never see stars that are 20 or 30 or 100 billion years old (yes, we could tell, both through stellar evolution and the abundances of long-lived radioisotopes), does imply a beginning in time about 14 Gyr ago.
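
            (To make the radioisotope point concrete, a toy calculation of my own, with half-lives taken from standard tables rather than from any particular star: the surviving fraction of an isotope is 2^(-age/half-life), so a 100 Gyr old star would be essentially stripped of uranium, while a ~13 Gyr old one is not.)

            # Toy sketch: surviving fraction of long-lived radioisotopes vs. assumed stellar age.
            # Half-lives used: U-238 ~ 4.47 Gyr, Th-232 ~ 14.0 Gyr.
            def surviving_fraction(age_gyr, half_life_gyr):
                return 2.0 ** (-age_gyr / half_life_gyr)

            for age in (13, 30, 100):
                u238 = surviving_fraction(age, 4.47)
                th232 = surviving_fraction(age, 14.0)
                print(f"age {age:3d} Gyr: U-238 fraction ~ {u238:.2g}, Th-232 fraction ~ {th232:.2g}")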

            Liked by 2 people

            1. “The spectra of high-z galaxies look like young stars, with ages consistent with having formed within the first few hundred years of the Big Bang. ”
              Not the first few hundred-thousand years?

              Like

            2. If young stars and galaxies are brighter than their older counterparts, then at the limits of our observational equipment we should expect to see a mostly younger population, no?

              Liked by 1 person

              1. The problem is that we see only young systems, not just one or two. If the universe were static, you’d expect more or less a normal distribution of galaxies by age. If your equipment is biased to see only the young galaxies, then you’ll see only a tail of the distribution. So only a few.

                Like

    1. Dear Mrs. Birchfield

      Thanks a lot for this reference! And also to Dr. McGaugh for his blog, of course.

      Best,
      Maurice

      Like

  11. Hi Stacy, just started reading your blog. I’m not a physicist, but a mathematician who moved from academia to a data science-type role in the private sector.

    I’ve been curious about the dark-matter-vs-other-explanations (in particular MOND) debate for years.

    When I mentioned the topic to a friend who does have training in physics, he pointed out the following piece as a convincing set of arguments for dark matter: https://bigthink.com/starts-with-a-bang/modifying-gravity/

    What’s wrong with Siegel’s arguments?

    Like

    1. He’s saying that the way General Relativity is modified to accommodate dynamics in galaxies (the MOND dynamic) fails to explain observation in cosmology. But prof. McGaugh posted a link in this very post (look for Aether Scalar Tensor) to a theory that does it! It is funny to see arguments such as “your theory cannot explain this”, because when it finally does you’re left with no argument.

      A better approach would have been to point out where MONDian theories don’t match observation in the regime they are supposed to cover. For instance, he mentions a speed difference of 50%-80% in clusters. Do you find that convincing? For me, it mostly means our understanding of clusters is still lacking, and when we get more/better data the difference will disappear.

      The bottom line is that it is much, much easier to find flaws in a theory that has only one free parameter and must explain observations across a vast and diverse set of phenomena. The fact that MOND has survived that for decades is… compelling. Posts like the one you linked are in *favor* of MOND when you read them carefully and critically.

      Liked by 1 person

    2. I have two major issues with Siegel’s post, and they are more or less related to him. I know that it is generally not nice to kill the messenger, but the idea is that you have to take everything he says with a large tablespoon of salt. He is really biased towards LCDM and it shows.
      The first issue is this quote that I don’t know what to make of it: “If the acceleration caused by that central mass drops below a critical value — a new hypothesized constant of nature — then the acceleration isn’t determined by the gravitational force (or curvature of space) caused by the dominating mass, but rather reverts to that minimal value”.
      The way I read it is that when the acceleration would drop below a0, the acceleration actually becomes a0. At least this is how I understand “reverts to that minimal value”. But this is fundamentally different from MoND. In MoND you get a different spatial dependence (i.e. ~1/r instead of 1/r^2), as the sketch at the end of this comment shows. So either Ethan is misleading his audience, or he doesn’t really understand MoND. From my point of view, both situations are severely damaging for his credibility.
      The second issue I have is this quote: “You only get to invoke the Tooth Fairy once”, when he’s talking about clusters and MoND. Ok. But he forgets the history of CDM with that one. Just to point out an extra tooth fairy – feedback. Or the N other DM flavors that are introduced ad hoc for a model fit.
      In my eyes, this looks like double standards, with LCDM held to a lower standard.
      Of course – there are also other technical issues – he admits that MoND outperforms LCDM at galactic levels, but he glosses over how big a difference there is; he ignores the data from JWST which challenge the structure formation scenario; he argues that only DM can fit the observed CMB, just to name a few. And the last two issues actually challenge his assertion that modified gravity theories don’t work at large scales.
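
      To make the first point concrete, here is a minimal sketch (my own toy numbers, using the commonly quoted “simple” interpolating function, not anything from Siegel’s post). Solving g*mu(g/a0) = gN with mu(x) = x/(1+x) gives g = gN*(1 + sqrt(1 + 4*a0/gN))/2, which tends to sqrt(gN*a0) ~ 1/r at low accelerations rather than sticking at a0, so the circular speed flattens out at v = (G*M*a0)^(1/4):

      import numpy as np

      G = 6.674e-11            # m^3 kg^-1 s^-2
      a0 = 1.2e-10             # m s^-2, Milgrom's acceleration scale
      M = 1e11 * 2.0e30        # toy point mass of 1e11 solar masses, for illustration only

      r = np.logspace(20, 22, 5)                            # roughly 3 kpc to 300 kpc, in metres
      gN = G * M / r**2                                     # Newtonian acceleration
      g = 0.5 * gN * (1.0 + np.sqrt(1.0 + 4.0 * a0 / gN))   # MOND with the "simple" mu
      v = np.sqrt(g * r) / 1e3                              # circular speed in km/s

      for ri, vi in zip(r, v):
          print(f"r = {ri:.1e} m : v = {vi:.0f} km/s")
      print(f"flat-speed prediction (G*M*a0)^0.25 = {(G * M * a0) ** 0.25 / 1e3:.0f} km/s")

      The printed speeds fall like a Newtonian curve at small radii and then settle onto the flat value, instead of the acceleration being pinned at a0.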

      Liked by 1 person

  12. As that discussion shows, there seems to be a theme among particle physicists to miss the basic point. My objection to dark matter is not that no candidate particle has been found; that’s just a symptom of it being the wrong side of the tree to bark up. The objection is in the web of self-contradictions and failed predictions: there was ample evidence against the dark matter interpretation before the end of the last century. That’s what I was saying in https://arxiv.org/abs/astro-ph/9801123.
    Independently of that, MOND is the only theory to consistently and correctly predict, in advance, the observations that are surprising in terms of dark matter. This has happened over and over again. The particle physics community is so far down the rabbit hole of the dark matter interpretation (they want their field not to be over), and so proudly ignorant of astrophysical constraints (most of them seem to grasp nothing beyond “rotation curves are flat therefore dark matter”), that they are apparently blind to anything that isn’t invisible.

    If hypothesis A and hypothesis B predict different things, and the predictions of our first guess A are not observed but those of B are, then there is a clear preference for B as a scientific hypothesis. That’s science 101. That’s what has gone on here.

    Liked by 2 people

  13. mls,
    Seems to have run out of reply space in that thread, so;

    “The block universe has no preferred reference frame. Measurement from within a reference frame permits one to use lightyears as a fixed length. The change in wavelength is attributed to the “why” of how patterns of spectral lines from light sources are shifted with respect to different distances within a reference frame. As there is no reason to postulate this to be different in any other reference frame, it is assumed to be a phenomenon common to all reference frames.”

    Presumably there is no reference frame, because that would be inconvenient. Yet lightspeed does seem to be the most common metric used in cosmology. What is it measuring, if not the overall frame that is intergalactic space?
    As you say, “distance.” Yes, distance is the amount of the metric, that is the denominator. Thus the numerator.
    Quite literally the theory postulates cosmic redshift is due to the increasing distance light has to cover, as the source and observer move away, thus taking longer for the light to cross, but then argues this expanding space is not calibrated in terms of lightspeed.

    This really isn’t so much logic, as it is crowd psychology. Houdini making the elephant disappear. Thanks for pointing out that no one will answer the question.

    Like

    1. brodix,

      As I noted elsewhere, Einstein did not view general relativity as “geometrizing,” although others do. This may be somewhat disingenuous since he also praised the value of Riemann’s work in “geometry” for its usefulness.

      However, a differentiable manifold does not have a “metric” in the sense of intuitive space. It is generally perceived as a “gluing” of patches of flat real spaces. The gluing is accomplished by assuming the compatibility of infinitely differentiable functions on each patch. This must be done this way to accommodate infinitesimal rotations, among other things.

      There is some vagueness here, as with ordinary calculus, and methods beyond my technical knowledge accommodate it for purposes of calculation. But, that vagueness is one reason William Lawvere has been able to formulate a category-theoretic notion of infinitesimal analysis using intuitionistic reasoning. At the level of the “gluing”, linear algebra comes into play. Lawvere speaks of “microaffinities.” The classical method speaks of “connections.”

      So, a “metric” in this mathematics is based upon an object related to linear algebra called a tensor. Distances do not refer to points as much as they refer to the functions that “glue” the patches of real spaces together.
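
      In symbols, just the textbook shorthand (nothing original here): ds^2 = g_{\mu\nu}(x) \, dx^\mu \, dx^\nu, where the components g_{\mu\nu} vary from point to point and are carried between patches by those gluing functions.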

      It may well be that this is not “logically sound.” Berkeley launched the same criticism at Newton. But, it works with respect to calculations that make testable predictions.

      One way to understand the logical problem is to recognize that “identity in time” differs from “identity” as spoken of in logic. Whether or not the scientific method resolves to truth, what it does do is provide a class of “defensible” possibilities.

      When Peter Woit suggests that Dr. McGaugh is being petulant, all it demonstrates to me is that Dr. Woit, a mathematician, does not have a natural aptitude for science. If, as I believe, a scientific posture demands an agnostic view toward truth, there is no reason the scientific method should produce the kind of answers that Peter Woit seeks.

      Like

      1. mls,

        I’m certainly not saying there isn’t a lot of circular mathematical reasoning involved, but does it actually explain, or merely patch?
        Math is abstraction. Like a map is an abstraction of the territory. Signals from the noise.
        Yes, it goes very deep into our psychology, as all we really have are the maps our minds devise. Like when there is a stick in the grass and our primate mind says “Snake!” Then we devise higher level maps and build societies around them. Languages, religions, etc.
        When you assume the map to be infallible and the territory can only be understood through its description, it is not science, it’s ideology. Which is a very strong social instinct. Much stronger than mere logic, because it’s about survival, not just simple facts.

        So let’s get back to the issue at hand; What is the reason given for cosmic redshift?

        Like

        1. brodix,

          One time I tried to give a succinct answer explaining that our measurements are within a reference frame, with extension to other reference frames being an assumption. In response, you replied in a manner still trying to use intuitive geometry for the block universe.

          To that, I tried to explain that the nature of the mathematics involved with the block universe is such that one cannot directly apply intuitive geometric notions. In response, you effectively attacked mathematics generally.

          While there may ultimately be issues corresponding to “the map is the territory”, people who invoke such language could not run power plants or manage telecommunication networks. It simply was not an appropriate response.

          Out of respect for Dr. McGaugh’s subject matter, I cannot continue in this way. Both of my attempts had been appropriate. I have nothing left to answer you with.

          Like

            1. They do have to function in that interface between map and territory.
              The ones that are too rigid to handle it are promoted to management.

              Like

                1. Out in that corporate world, the mathematicians are the accountants, and presumably they are not allowed to just write in a figure and call it dark money whenever there is a gap in the books, because falsification is supposed to be a thing and ignoring it can cause trouble. But it does appear that the impulse towards deranged speculation is quite strong, and a lot of the money seems to be increasingly dark.

                Like

      2. mls,
        Consider that in a moving frame, the clock and ruler dilate equally. To the point that at the speed of light both ruler and clock go to zero. What if we were to go the opposite direction, to the fastest clock and longest ruler? Wouldn’t that be the one closest to the equilibrium of the vacuum, the unmoving void of absolute zero?
        Would that be the Big Frame?

        Like

      3. “If, as I believe, a scientific posture demands an agnostic view toward truth, there is no reason the scientific method should produce the kind of answers that Peter Woit seeks.”
        Can you explain this a little bit? From my point of view Peter Woit criticizes in every second blog post a string theorist for his fruitless theory, and in the last posts he opposes the unserious conduct of the quantum computing community.
        He behaves like a mathematician and has to say sometimes that he is a particle physicist…
        Which answers is Peter Woit looking for?

        Like

        1. Mr. Freundt,

          When one studies the foundations of mathematics, it quickly devolves into systems and “followers.” I am not critical of his stances on how some physicists apply mathematics. He will undoubtedly obtain a “following” among people knowledgeable about mathematical physics.

          However, he is a mathematician, and, my view of him rests on that fact.

          Right or wrong, the vast majority of mathematicians working on foundational mathematics work within the paradigm of first-order model theory. This work is made more difficult because it overlaps with analytical philosophy – something most mathematically inclined people do not wish to deal with.

          Whenever Dr. Woit speaks of foundations, he dismisses the hard work of other mathematicians. Unquestionably, the Langlands program has foundational significance. But, the hard work is tying that program to existing foundational programs – not simply dismissing ideas with which you do not agree.

          An excellent parallel can be found in Russell’s “Principles of Mathematics” where he overtly states that he is formulating a foundational viewpoint to exclude philosophies compatible with theological views of which he had been critical.

          Dr. Woit may be critical of physics which appears unscientific to him, but, his response is to pursue a program with the same flawed logic (belief in mathematics) that led string theorists to their ideas.

          I cannot emphasize the mention of the trilemma made by Madeline Birchfield enough. The rejection of both infinities and circularities so that “science” has a correspondence theory of truth is fatally problematic. One of Popper’s students investigated this in detail with the book at the link,

          https://archive.org/details/retreattocommitm00bart

          And, yes, this is a horribly nauseating philosophical text no mathematician wants to read. However, if you are concerned about the anti-science movement, the book documents its origins. And, it does so with respect to the choices available under the trilemma.

          In my view, Dr. Woit is to mathematics as Dr. Woit claims string theorists are to physics. And, of course, this is simply a personal opinion.

          Like

      4. People like Peter Woit and many theoretical physicists implicitly identify Reality with their models of Reality.

        That is why they are unable to even consider MOND as a possibility because the modified dynamics “must be” coming from a “particle”.

        For these people the possibility of strong emergent properties or behavior is in direct contradiction with “reality”(their models of reality).

        That galaxies’ rotational speed behavior is “tantamount to a new natural law for rotating galaxies” (a strong emergent property) is sacrilegious, as their dreams of a Theory of Everything (the ultimate religion) would come crashing down.

        Strong emergent properties are incompatible with naive reductionism.

        Like

        1. The irony here is that basic physical processes are at work. The feedback loops coalescing around useful insights and ideas, eventually exceeding the situations, relations and functionality that gave birth to them, leaving a hardened, encrusted form, motivated by its own inertia.
          Given how prevalent this dynamic is in so many aspects of life and basic physical processes, that the sciences haven’t acknowledged and recognized the pattern does point to a certain degree of primitivism.
          The fact that epicycles really were brilliant math, for their times, should have been a warning to not simply calculate from initial propositions, filling in the gaps and assuming the patches to be validated by their necessity to the equation.
          The principle of falsification is only honored in theory. In reality it amounts to heresy.
          Authority and tradition rule. Tribalism runs deep.

          Liked by 1 person

        2. jeremyjr01,

          We all suffer from the desire that our personal knowledge has some attachment to reality – myself included. But, your remark transitioned from theoretical physicists to particle physicists in one great leap. Dr. McGaugh’s problems extend beyond particle physicists.

          In my attempt to explain how we use measurements from within a single reference frame I noted an assumption in physics that physical laws are the same everywhere. In that case, we use that assumption to extrapolate the shifted spectral lines from a single datum (our reference frame). If an astronomer has come to believe that general relativity is, in fact, a law of physics, MOND has the appearance of violating that assumption. The law of gravity is one thing here and another there.

          Dr. McGaugh’s problem is wider than simply the opinions of particle physicists.

          As for the common lore about emergence, my original scientific interest had been biology. Unless I see a convincing argument that removes the theory of evolution from established science, emergence is secondary to an explanation of how an evolved biological organism has a faculty (presumably mathematics) to *know* the truths of material reality.

          I do not claim that the theory of evolution is true; rather, I ask for the explanation which needs to be provided in the event that it is true. I tire of “in principle” arguments writing human agency out of the picture using equations on a blackboard.

          When I learn about metrics in a mathematics book, I am taught about distance functions having a Cartesian product as a domain. Should I imagine that a physicist runs and grabs a Cartesian product to make measurements in a lab?

          What a physicist does is to partition intuitive space into a part that measures and a part being measured. After the measurement is taken, the part that measures is “discarded” in order to attribute the newly acquired datum to a “natural law.” This is a necessary activity to explain the phenomena we witness as being objective in nature.

          Since I believe in an external reality, I see no alternative to science being conducted in such a manner. But, I wish those people who view their own consciousness as illusory would stop telling me to engage in arguments about “emergence” on the basis of Cartesian products. I believe they have the burden of proof dictated by the circularity introduced through the theory of evolution.

          But, I am no one.

          Like

          1. mls,

            What is evolution, other than cycles of expansion and consolidation, where the basic organic impulse expands to fill every niche and combination of impulses, from genes, to ecosystems, to seasons. Then the consolidation phase decides what is seed and what is fertilizer. Even emergence is part of the cycle.
            The old information, the measure, is lost and the new becomes basis for the next cycle.
            Energy radiating out, as form coalesces in.
            Consciousness to the future, thoughts to the past.

            Like

    1. Mr. Freundt,

      I have no credentials. For reasons involving a tragedy ending any hope of an academic career, I have studied the continuum question. However, I did not succumb to the Goedel personality cult and the concomitant deference to the Goedel-Cohen result. Instead, I learned that that result rests within one of many paradigms all trying to claim a finish to the sentence, “Mathematics is ….”

      Two days after my answer at the link,

      https://philosophy.stackexchange.com/questions/91773/which-philosophy-topics-are-necessary-for-philosophy-of-mathematics

      had been accepted, PSE changed their login policies. If you look at that answer, you will find that I have done the hard work to actually have an opinion.

      At the end of that link — after recommending that the student avoid foundations at all costs — I direct the questioner to the FOM mailing list.

      FOM has a traditional bias. For category theorists and advocates of HOTT (homotopy type theory) there is the n-category cafe.

      You will find no participation from Dr. Woit on either of those forums.

      The vast majority of mathematicians are respectful of mathematical topics outside of their expertise. I do not see that from Dr. Woit. Character counts.

      Like

      1. mls,
        I studied physics and got my PhD in semiconductor physics / laser physics.
        But that is not important. I was shaped before.

        P. W. Anderson and “More is Different”.
        Reductionism: The ability to reduce everything to simple basic laws,
        does not imply the ability to start from these laws and reconstruct the universe.

        Emergence: Many (relatively) identical objects lead to new order phenomena and, thus to new laws.
        Evolution: adaptation of individuals to their environment over generations.
        I would not say that one has much to do with the other.

        Dogmas and myths in theoretical physics and related mathematics:
        I miss thing-like explanations the most. 2 examples:
        1. How does the muon, which was created at 20 km height, know that it moves at almost the speed of light,
        so that from its view the earth’s atmosphere is only 200 m thick?
        2. Which “objects” do electrons, positrons, and photons consist of,
        so that they are sometimes the ones with charge and mass (electron)
        and sometimes the other, without charge and without mass (photon)?

        The last good material explanation came from Copernicus,
        who placed the sun in the center of the planetary system
        and could explain with it the retrograde motion of the planets without epicycles.

        Peter Woit and dogmas
        One must decide for something. And Peter has chosen “Euclidean Twistor Geometry”.
        I have doubts that he himself believes in it. Otherwise, he would report about it in every second blog post.

        Like

        1. Steven, I enjoy reading your posts. They are direct, succinct.
          “ One must decide for something.”
          This statement is, I believe, absolutely true. But, I might add that the “decision one makes” ends up steering his life in directions he may not have anticipated. The “professional” might have to “reinforce” his professionalism. Perhaps, to the point of intransigence or intolerance.
          I would submit that it is possible for a cerebral man to make a good living “pushing a broom”. It frees him from deciding on a “dogma”. The problem for that man then becomes one of “satisfying his inner curiosities” as his circle of acquaintances cannot be trusted to provide an adequate sounding board for his questions. In the old days such a man was often found roaming the public library. And “professionals”, with access to their own knowledge base, never encountered him.

          Like

          1. “ends up steering his life in directions he may not have anticipated.”
            absolute
            Every married husband knows that

            Like

          2. Brad,

            It does create the interesting situation for those of us on the outside, who can sense the field is spiraling into an abyss, adding ever more grandiose patches to its models, that we can only sit back and calculate, not only when the reality check arrives, but how much collateral damage is done, not only to the field itself, but its foundations; academic, political, social, etc.
            It is safe to say theoretical physics is the high priesthood of modern culture, given its past achievements and the technological results, but what are the fundamental issues causing it to spin its wheels for the last generation?
            For instance, how much is this focus on the particle/atom/quanta, also a reflection of Western individuality and its many expressions, from monotheism, to the Big Bang singular chronology? How much is the capitalist focus on the bottom line, to the exclusion of all other considerations, another version of naive reductionism?
            I realize most of the people engaged in this forum are naturally focused on the field and are not concerned with this larger context, but that also goes to the reasons for this blindness. Specialists, to the exclusion of generalists, results in a Tower of Babel, with everyone talking past everyone else, in disconnected languages.
            It doesn’t take more than a glance at the news to know the mother of all reality checks is in the mail.
            The real question should not be what particle will solve our dilemma, but how nature functions in the first place.
            Particles and fields.
            Nodes and networks.
            Organisms and ecosystems.
            Synchronization and harmonization.
            Structure in, energy out.

            Like

        2. “The last good material explanation came from Copernicus,
          who placed the sun in the center of the planetary system
          and could explain with it the retrograde motion of the planets without epicycles.”

          No. Copernicus needed epicycles (in his system the orbits of the planets were still circular).

          Best,
          Maurice

          Liked by 1 person

Comments are closed.