I’ve reached the point in the semester teaching cosmology where I’ve gone through the details of what we call the three empirical pillars of the hot big bang:

  • Hubble Expansion
  • Primordial [Big Bang] Nucleosynthesis (BBN)
  • Relic Radiation (aka the Cosmic Microwave Background; CMB)

These form an interlocking set of evidence and consistency checks that leave little room for doubt that we live in an expanding universe that passed through an early, hot phase that bequeathed to us the isotopes of the light elements (mostly hydrogen and helium with a dash of lithium) and left us bathing in the relic radiation that we perceive all across the sky as the CMB, the redshifted light from the epoch of last scattering. While I worry about everything, as any good scientist does, I do not seriously doubt that this basic picture is essentially correct.
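
To see how little wiggle room there is, consider the cartoon version of the helium abundance. By the onset of nucleosynthesis, the neutron-to-proton ratio has frozen out and decayed to roughly n/p ≈ 1/7, and essentially every surviving neutron ends up in helium-4, so the primordial helium mass fraction is

\[ Y_p \simeq \frac{2\,(n/p)}{1 + n/p} = \frac{2/7}{8/7} \approx 0.25, \]

right on top of what we observe. (The real BBN calculation tracks a full reaction network; this back-of-the-envelope number falling out correctly is part of why the basic picture is so hard to doubt.)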

This basic picture is rather general. Many people seem to conflate it with one specific realization, namely Lambda Cold Dark Matter (LCDM). That’s understandable, because LCDM is the only model that remains viable within the framework of General Relativity (GR). However, that does not inevitably mean it must be so; one can imagine more general theories than GR that contain all the usual early universe results. Indeed, it is hard to imagine otherwise, since such a theory – should it exist – has to reproduce all the successes of GR just as GR had to reproduce all the successes of Newton.

Writing a theory that generalizes GR is a very tall order, so how would we know if we should even attempt such a daunting enterprise? This is not an easy question to answer. I’ve been posing it to myself and others for a quarter century. Answers received range from Why would you even ask that, you fool? to Obviously GR needs to be supplanted by a quantum theory of gravity.

One red flag that a theory might be in trouble is when one has to invoke tooth fairies to preserve it. These are what the philosophers of science more properly call auxiliary hypotheses: unexpected elements that are not part of the original theory that we have been obliged to add in order to preserve it. Modern cosmology requires two:

  • Non-baryonic cold dark matter
  • Lambda (or its generalization, dark energy)

LCDM. The tooth fairies are right there in the name.

Lambda and CDM are in no way required by the original big bang hypothesis, and indeed, both came as a tremendous surprise. They are auxiliary hypotheses forced on us by interpreting the data strictly within the framework of GR. If we restrict ourselves to this framework, they are absolute requirements. That doesn’t guarantee they exist; hence the need to conduct laboratory experiments to detect them. If we permit ourselves to question the framework, then we say, gee, who ordered this?

Let me be clear: the data are unambiguous that something is wrong. There is no doubt of the need for dark matter in the conventional framework of GR. I teach an entire semester course on the many and various empirical manifestations of mass discrepancies in the universe. There is no doubt that the acceleration discrepancy (as Bekenstein called it) is a real set of observed phenomena. At issue is the interpretation: does this indicate literal invisible mass, or is it an indication of the failings of current theory?

Similarly for Lambda. Here is a nice plot of the expansion history of the universe by Saul Perlmutter. The colors delineate the region of possible models in which the expansion either decelerates or accelerates. There is no doubt that the data fall on the accelerating side.
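
The boundary between those regions is easy to write down. For a universe containing matter and Lambda (radiation being negligible today), the present-day deceleration parameter is

\[ q_0 = \frac{\Omega_m}{2} - \Omega_\Lambda , \]

so acceleration (q_0 < 0) requires \Omega_\Lambda > \Omega_m/2. With matter alone, q_0 is always positive: deceleration was obligatory.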

I’m old enough to remember when the blue (accelerating) region of this diagram was forbidden. Couldn’t happen. Data falling in that portion of the diagram would falsify cosmology. The only reason it didn’t is that we could invoke Einstein’s greatest blunder as an auxiliary hypothesis to patch things up. That we had to do so is why the whole dark energy thing is such a big deal. Ironically, one can find many theoretical physicists eagerly pursuing modified theories of gravity to explain the need for Lambda without for a moment considering whether this might also apply to the dark matter problem.

When and where one enters the field matters. At the turn of the century, dark energy was the hot, new, interesting problem, and many people chose to work on it. Dark matter was already well established. So much so that students of that era (who are now faculty and science commentators) understandably confuse the empirical dark matter problem with its widely accepted if still hypothetical solution in the form of some as-yet undiscovered particle. Indeed, overcoming this mindset in myself was the hardest challenge I have faced in an entire career full of enormous challenges.

Another issue with dark matter, as commonly conceptualized, is that it cannot be normal matter that happens not to shine as stars. It is very reasonable to imagine that there are dark baryons, and it is pretty clear that there are. Early on (circa 1980), it seemed like this might suffice. It does not. However, it helped the notion of dark matter transition from an obvious affront to the scientific method to a plausible if somewhat outlandish hypothesis to an inevitable requirement for some entirely new form of particle. That last part is key: we don’t just need ordinary mass that is hard to see, we need some form of non-baryonic entity that is completely invisible, resides entirely outside the well-established boundaries of the standard model of particle physics, and has persistently evaded laboratory detection where predicted.
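
The arithmetic forcing the non-baryonic conclusion is stark. In round numbers, BBN (and independently the CMB) pins the baryon density at \Omega_b h^2 \approx 0.02, so with h \approx 0.7,

\[ \Omega_b \approx \frac{0.02}{0.7^2} \approx 0.04 \ll \Omega_m \approx 0.3 . \]

Whatever makes up the bulk of the matter budget cannot be baryons, luminous or dark.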

One becomes concerned about a theory when it becomes too complicated. In the case of cosmology, it isn’t just the Lambda and the cold dark matter. These are just a part of a much larger balancing act. The Hubble tension (CMB fits give H0 ≈ 67 km/s/Mpc while the local distance ladder gives ≈ 73) is a latecomer to a long list of tensions among independent observations that have been mounting for so long that I reproduce here a transparency I made to illustrate the situation. That’s right, a transparency, because this was already an issue before the end of the twentieth century.

The details have changed, but the situation remains the same. The chief thing that has changed is the advent of precision cosmology. Fits to CMB data are now so accurate that we’ve lost our historical perspective on the slop traditionally associated with cosmological observables. CMB fits are of course made under the assumption of GR+Lambda+CDM. Rather than question these assumptions when some independent line of evidence disagrees, we assume that the independent line of evidence is wrong. The opportunities for confirmation bias are rife.

I hope that it is obvious to everyone that Lambda and CDM are auxiliary hypotheses. I took the time to spell it out because most scientists have subsumed them so deeply into their belief systems that they forget that’s what they are. It is easy to find examples of people criticizing MOND as a tooth fairy as if dark matter is not itself the biggest, most flexible, literally invisible tooth fairy you can imagine. We expected none of this!

I wish to highlight here one other tooth fairy: feedback. It is less obvious that this is a tooth fairy, since it is a very real physical effect. Indeed, it is a whole suite of distinct physical effects, each with very different mechanisms and modes of operation. There are, for example, stellar winds, UV radiation from massive stars, supernovae when those stars explode, X-rays from compact sources like neutron stars, and relativistic jets from the supermassive black holes in galactic nuclei. The mechanisms that drive these effects operate on scales that are impossibly tiny from the perspective of cosmology, so they cannot be modeled directly in cosmological simulations. The only computer that has both the size and the resolution to do this calculation is the universe itself.

To account for effects below their resolution limit, simulators have come up with a number of schemes for this “sub-grid physics.” Therein lies the rub. There are many different approaches, and they do not all produce the same results. We do not understand feedback well enough to model it accurately as sub-grid physics. Simulators usually invoke supernova feedback as the primary effect in dwarf galaxies, while observers tell us that stellar winds do most of the damage on the scale of star-forming regions – a scale much smaller than the one simulators are concerned with, that of entire galaxies. What the two communities mean by the word feedback is not the same.
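
To make concrete what a sub-grid recipe looks like, here is a deliberately cartoonish sketch of the kind of prescription a simulation might apply per star-forming gas element. The numbers and the functional form are illustrative only, not any particular code’s actual implementation:

```python
# Toy sub-grid supernova feedback prescription (illustrative sketch only).
E_SN = 1e51            # erg released per core-collapse supernova
N_SN_PER_MSUN = 0.01   # supernovae per solar mass of stars formed (IMF-dependent)

def feedback_energy(stellar_mass_formed_msun, coupling_efficiency=0.1):
    """Energy (erg) injected into the surrounding gas.

    coupling_efficiency is the fraction of supernova energy that couples
    to the gas instead of radiating away -- exactly the kind of knob that
    gets tuned until the simulation matches the observations.
    """
    n_sn = N_SN_PER_MSUN * stellar_mass_formed_msun
    return coupling_efficiency * n_sn * E_SN

print(f"{feedback_energy(1e6):.1e} erg")  # a 1e6 Msun starburst: ~1e54 erg
```

Every term in that little function is adjustable, and different groups adjust them differently. That is the sense in which “feedback” is not one hypothesis but a family of them.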

On the one hand, it is normal in the course of the progress of science to keep working on something like how best to model feedback. On the other hand, feedback has become the go-to explanation for any observation that does not conform to the predictions of LCDM. In that application, it becomes an auxiliary hypothesis. Many plausible implementations of feedback have been rejected for doing the wrong thing in simulations. But maybe one of those was the right implementation, and the underlying theory is what’s wrong? How can we tell, when we keep iterating the implementation until we get the right answer?

Bear in mind that there are many forms of feedback. That one word upon which our entire cosmology has become dependent is not a single auxiliary hypothesis. It is more like a Russian nesting doll of multiple tooth fairies, one inside another. Imagining that these different, complicated effects must necessarily add up to just the right outcome is dangerous: anything we get wrong we can just blame on some unknown imperfection in the feedback prescription. Indeed, most of the papers on this topic that I see aren’t even addressing the right problem. Often they claim to fix the cusp-core problem without addressing the fact that this is merely one symptom of the observed MOND phenomenology in galaxies. This is like putting a bandage on an amputation and pretending the treatment is complete.

The universe is weirder than we know, and perhaps weirder than we can know. This provides boundless opportunity for self-delusion.

54 thoughts on “Tooth Fairies & Auxiliary Hypotheses”

  1. Can you think of any experiment or observation that is currently technically feasible that would convince the skeptics that General Relativity is wrong and must be modified, irrespective of cost? I think I have asked this of you before, and your answer was negative. If that is the case, does this mean that this issue is doomed to remain essentially an unanswerable philosophical question into the foreseeable future?

      1. MOND is an “emergent” property of a complex star system; it only manifests at that complexity level.

        “Testing” MOND in a lab will be like trying to test herd behavior using a quantum mechanics experiment.

        Reality is hierarchical, every complexity level has new irreducible properties that can’t be “tested” at lower complexity levels.

        That is why pursuing a theory of everything is an exercise in futility.

        By the way, this has already been shown in formal mathematics: complexity is a source of irreducibility, incompleteness, randomness.

        The axiomatic method, at the center of Physics, is intrinsically limited.

        1. Could you explain in more detail why MOND is an “emergent” property? Since you can write a MOND Lagrangian, can’t these predictions be tested in an appropriate environment?

          1. MOND is an “emergent” property because it is a property of a complex system of stars, and the rotational “rigidity” of galaxies is a very strong indication of that, exactly as the rigidity of solids is an emergent property of a complex set of quantum objects: you don’t need to postulate any new elementary particle to explain the rigidity of solids.

            Dark Matter and String Theory are manifestations of the same problem: a departure from objectivity in scientific practice.

    1. In one of Milgrom’s papers he talks about the gravitational saddle points that exist in the solar system. The biggest is some small patch of space between the sun and Jupiter.
      We could launch a mirror out beyond the influence of the sun’s gravity (galaxy dominant regime) and measure acceleration. But one could just say, yeah of course this is CDM.
      (A mirror so we could bounce photons off it and detect them back here on earth.)

  2. If General Relativity already fails to model the rotational behavior of galaxies, it is very naive to expect that it will do better at higher complexity, such as galaxy clusters, still less at the complexity of the Universe as a whole.

    It seems that “More is Different” (P. W. Anderson) is the only universal truth: complexity is always a boundary for the predictive/explanatory power of any theory.

  3. “More is Different” (P. W. Anderson) is very nice to read and very interesting.

    Expansion of the universe
    I always wonder whether the space between galaxies is getting more plentiful or just stretched thinner.

  4. With all due respect, the three empirical pillars you cite as supportive of the expanding universe model are anything but. In the first case, the misleadingly named Hubble Expansion rests entirely on the recessional velocity interpretation of the cosmological redshift that Hubble himself never fully accepted, and the FLRW assumption of a Universal metric that the field equations of GR could be solved for. Those are foundational assumptions of the expanding universe model. Neither assumption is supported by direct empirical evidence and citing the assumptions of the model as evidence for the model is circular reasoning.

    As far as BBN goes, the nuclear abundances are what they are, but BBN, like the BB event itself, is not an empirically observable event, and the theoretical BBN account has yet to be fit to the lithium abundance. (Not for want of trying.) It also cannot be decisively shown that nucleosynthesis can only take place in a one-off BB-scale event. In terms of the model, the BB event has to theoretically produce nucleosynthesis, but nucleosynthesis does not require a Big Bang event.

    Prior to its discovery in 1965, estimates for the CMB temperature by Big Bang theorists ranged over an order of magnitude and that range (5-50K) did not encompass the observed 2.7K value. At the same time theorists using thermodynamic or tired-light considerations produced more accurate predictions. Subsequently the BB predictions were recalculated to fit the observation and have been continuously refit to the data at every step since. The CMB does not provide strong evidence for the expanding universe model.

    In its current guise (LCDM) the expanding universe model asserts and requires the existence of numerous events and entities that have no empirical correlate in physical reality; the model is a scientific failure. That the mathematical model can postdict every new observation is just a feature of math models generally; the qualitative model described by the LCDM math is as absurd as it is unobservable.

    The failure of the expanding universe model does not in any way constitute a failure of science; failure is integral to the scientific method as it explores the limits of current knowledge. The scientific failure rests entirely with the theoretical physics community of the scientific academy which has not only refused to acknowledge the standard model’s repeated failures but has actively suppressed any research threatening the model’s foundational assumptions.

    The fate of Halton Arp at the hands of the scientific orthodoxy is analogous to Galileo’s fate at the hands of the Catholic Church. It may have temporarily served its purpose as a warning to other would-be apostates but it will forever hang as a badge of shame on the scientific academy of the 20th century for allowing the theoretical physics community to devolve into a cult of belief wherein the empirical requirements of science became an afterthought honored only in the breach and fealty to the golden calf of the expanding universe a requirement for funding and any hope of a career.

    The institutional solution to this mess is straightforward. A formal Department of Modern Cosmology needs to be established for the express purpose of developing and evaluating cosmological models, based on observations, that do not share the naive, century-old foundational assumptions of the expanding universe paradigm. It is irrelevant whether such an entity is an independent, privately-funded institution or a new department at an ambitious, established academic institution looking to make a name for itself by breaking the intellectual stranglehold that theoretical physics has on cosmological research.

    The scientific community writ large has to intervene and free cosmological researchers from the yoke of the expanding universe model that has been imposed on them by the consensus of a theoretical physics community enthralled by its own mathematical cleverness. The motto over any new Cosmology Department should read: Math Is Not Physics.

    1. Mainstream ideas have an enormous economic interest behind them, creating what you can consider a “bubble” no different from economic bubbles, and like them, “bubbles are often conclusively identified only in retrospect, after the bubble has already popped and prices have crashed.”

      These economic interests will actively resist anything that seems like a threat to them.

      Corruption is intrinsic to human nature.

    2. How much does it reflect much deeper conceptual issues, merely reaching the point of reductio ad absurdum?
      It seems our ability to abstract hasn’t fully recognized garbage in, garbage out, so there develops this feedback loop of reinforcement, where initial misconceptions get hardwired into the social dynamic, and then too much gets built on top. Not just in science, but in politics, economics and religion. Then we end up with these Towers of Babel, with the experts preaching the Canon, and simple logic is beyond the pale.
      To offer an example;
      Is time really the point of the present, moving past to future, codified as measures of duration, or is it change turning future to past? As in tomorrow becomes yesterday, because the earth turns.
      As these mobile organisms, this sentient interface between our bodies and their situation operates as a sequence of perceptions, in order to navigate, so our experience of time is as this linear passage from past to future events, but isn’t that like trying to explain the sun moving east to west, before realizing it’s the earth turning west to east?
      There is no literal dimension of time, because the past is consumed by the present, to inform and drive it. Causality and conservation of energy. Cause becomes effect.
      I could flesh this out a bit more, but the question is whether anyone is willing to argue the logic, as opposed to saying I must be wrong, because the masters of the field have pronounced otherwise?
      We define ourselves by our beliefs, as well as our company and few are willing to examine themselves and their culture too deeply.

  5. Newton made a set of assumptions to achieve his breakthrough Theory of Gravity.
    Einstein was able to explain observations that brought Newton’s theory into question by changing (throwing out) one of Newton’s assumptions, namely that Time and Space are fixed backgrounds.
    Now we have observations that bring Einstein’s theory into question.
    I have been looking at the basic assumptions Einstein made. Most were inherited from Newton.
    I am wondering if any real expert in GR has looked at what happens if one changes the assumption that Gravity always spreads out evenly in all directions, also called the Inverse Square Law. It seems to me that Milgrom’s MOND could be reformulated as a change to the Inverse Square Law, though I don’t know if anyone has tried to do that. Perhaps I should give it a go.

    1. I myself have tried 2 dimensions. This leads to a 1/R dependence and agrees with MOND. But then I have not the slightest idea how to explain the sqrt(mass × gravitational constant) factor …

    1. It is conceivable, but seems to me unlikely. Foreground subtraction is a huge technical challenge, and dust is certainly a big foreground. But this is the sort of non-fundamental technical challenge that large groups like Planck excel at.

      If there is some deeper problem, like intergalactic dust that makes the signal non-cosmic, then we have a much bigger problem.

      1. I agree the CMB spectrum will be, at the least, a challenge to reproduce with cosmic dust as the radiator.

        The thing is, the CMB interpreted as BB afterglow assumes inflation as historic fact. Inflation is required for explaining the CMB that way, and inflation lasted just too short a time and at too high energies to falsify it in any way, except by explaining the Standard Model of particles (as we now understand atoms) and proving there cannot be an inflaton particle. I have found many reasons for myself to believe the bible as authority over other religions, and there is no inflation in Genesis – although with a rotating universe the 13 billion years is no problem, due to General Relativity. So if inflation cannot be falsified in an experiment, I prefer to seek some kind of String theory that explains – without dark matter – the standard model, and do the calculations at such high energies thereafter – if people keep stressing inflation is history.

        1. Well, without inflation we could explain the CMB but then we need some other theory to explain the CMB isotropy.

          1. This second statement is the correct one. We don’t need Inflation for the CMB, but it does provide a nifty explanation for the origin of the anisotropies. I think people in the field make that connection too strongly – what we see are Gaussian random fluctuations. Inflation does that, sure, but what the heck else would they be?

          2. If redshift is an optical effect, then the CMB would be the light of essentially infinite sources, shifted off the visible spectrum, so, literally the perfect black body radiation.
            The little bit of foreground structure to it would be the sources just over the horizon of visible light.
            That this effect would compound on itself would explain the parabolic curve in the rate of redshift, rather than Dark Energy.

        2. String theory is actually a good companion for dark matter: zero empirical evidence supporting it.

          String Theory is another “bubble” that has been bursting in slow motion.

  6. It’s just geometry. The gravitational effect follows the mass distribution, which is roughly spherical near the center but flattens to a 2-dimensional disk. Near the center the gravitational energy density falls off inversely with the surface area implied by the radial distance from the center (1/r^2). As you go out along the disk, the drop-off transitions to the inverse of the circumference of the disk (1/r).

    That geometrical transition is what MOND effectively accomplishes, although it is difficult to see because the math is couched in terms of a second-order derivative, acceleration.

    1. Even MOND has a limited complexity range of applicability because, again, higher complexity hierarchies beyond galaxies will impose a limit on its predictive power.

      Exactly as quantum mechanics is useless to explain living beings or even weather patterns.

      Complexity range is a boundary, not to be confused with distance range.

      A theory of everything, as P. W. Anderson said, will be a theory of almost nothing.

  7. Dear Prof. McGaugh,
    Thank you for the great posts!

    Could cosmic inflation also be categorized as a tooth fairy?
    (It was introduced after the fact, to fill gaps in the original BB theory, and no one can detect the relevant field.)

  8. WRT:

    “The universe is weirder than we know, and perhaps weirder than we can know.”

    It is not the Cosmos that is weird, it is the expanding universe model that is weird. By weird here I mean very specifically that the model depicts a “universe” that looks nothing like the Cosmos we observe.

    Weird is cosmologists saying the expanding universe model is the greatest model ever, so who cares if it bears no resemblance to physical reality. Weird is seeing things that aren’t there, while refusing to see the things that are there. Weird is the expanding universe model.

    The Cosmos isn’t weird; it makes sense. However, the extent and “current state” of the Cosmos are unknowable – in terms of well known physics, specifically the max speed of light and the cosmological redshift. The extent and “current state” of the Cosmos are, in fact, unobservable and physically meaningless concepts. It’s a Cosmos, not a Universe.

    1. It first occurred to me in reading Hawking’s A Brief History of Time, when it first came out, that cosmology is messed up. In it, he made the point that “Omega=1.” Basically the expansion is in inverse proportion to overall gravitational effects. So space curves out, between galaxies, in inverse proportion to the degree it curves in, within galaxies.
      Remember Einstein first proposed the Cosmological Constant as a balance to gravity causing all space to collapse to a point. Which would mean Hubble essentially discovered the CC, as it was originally proposed.

  9. Hi Stacy: Thank you for this post on Tooth Fairies & Auxiliary Hypotheses.

    I agree entirely with your post and especially with your first two paragraphs and their ending “this basic picture is essentially correct”. Personally I would add a fourth empirical pillar of cosmology, namely that the geometry of space (space, not space-time) is isotropic & homogeneous on moderately large scales – I think there is good observational evidence to support this. It then follows that space-time is described well by the Friedmann-Lemaître-Robertson-Walker (FLRW) metric. Most cosmology texts use “isotropic & homogeneous” to refer to the matter/density distribution, which only becomes uniform at very large length scales, over 100 million parsecs. However, “isotropic & homogeneous” were originally applied to just the three dimensions of space, and not to the matter distribution at all. The FLRW metric on its own says nothing about the matter distribution. The FLRW metric goes some way to explaining the empirical evidence of the early universe (such as its expansion), and it does this without any reference to General Relativity (GR). GR is only needed when we want to explain how the density evolves with time.
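
    For concreteness, the standard form of the FLRW line element is

    \[ ds^2 = -c^2\,dt^2 + a(t)^2 \left[ \frac{dr^2}{1-kr^2} + r^2\left(d\theta^2 + \sin^2\theta\, d\phi^2\right) \right], \]

    where a(t) is the scale factor and k = −1, 0, +1 sets the spatial curvature. Nothing in this expression refers to the matter distribution; matter only enters when GR is used to evolve a(t).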

    Although there are areas of disagreement between supporters of the Lambda Cold Dark Matter (LCDM) theory and supporters of other theories (including modified gravity), most groups seem to agree with your pillars of cosmology and the basic ideas behind the hot big bang. And, on an optimistic note and despite the astronomical conflicts, there is a steady stream of new telescopes, new instruments, and new experiments coming along, and a continuous flood of new observations pouring in. Long may these continue. New data and new measurements are exactly what we need.

    Separately, might I suggest a new quote for your list (that might apply to the LCDM theory); from H. L. Mencken (slightly paraphrased): “For every complex problem there is always a solution that is neat, plausible — and wrong”.

    1. “space (space, not space-time) is isotropic & homogeneous”
      “which only becomes uniform at very large length scales; over 100 million parsecs.”

      I know this statement.
      And I know of no astronomical image which confirms this.
      On the contrary: all, really all astronomical images show structures:
      Planets, stars, nebulae, galaxies, clusters of galaxies, collections of clusters
      of galaxies up to the “Sloan Great Wall”.
      On really all length scales we observe structures.
      (An image showing a homogeneous universe would also be very boring…).

      To start a lecture with these words is not truthful.

      Why is it so hard to start a cosmology lecture with the words:
      “Dear students, you all know the wonderful pictures of the structures in the universe.
      Planets, stars, nebulae, galaxies, clusters of galaxies …
      Unfortunately, we can only calculate a homogeneous universe.
      So we will try this and see where it leads…”

      1. By the way, cosmology is not the only lecture which starts with a lie or dubious statement.
        In thermodynamics, in statistical physics, the Boltzmann distribution is derived like this:
        Imagine that the (ideal) gas consists of atoms, which collide elastically with each other like small steel spheres. Then, with a few more assumptions, you can derive the Boltzmann distribution. Works wonderfully. The only mistake: atoms are not small steel spheres… All physicists know that…

        1. Yet physics seems in thrall to math, but with “shut up and calculate,” if it’s garbage in, it’s garbage out. What math doesn’t seem able to do, is think outside the program/box. The goal is to figure out what the program leads to, more than whether the program applies to the reality.
          Epicycles were brilliant math, but it took real “outside the box” thought to realize the earth is not the center.

    2. “this basic picture is essentially correct”.

      Preconceptions are filters for perception.

      If you already have a preconceived idea of what the “universe” looks like, given to you by a GR that has already failed at galaxy-scale complexity, how is this not another form of confirmation bias?

      1. What is “space?”
        Three dimensions are really just a mapping device, like longitude, latitude and altitude.
        When you remove all the physical quantities from it, the two qualities remaining are equilibrium and infinity. Like a number line, from zero to infinity.
        Energy radiates, out, toward infinity, while structure coalesces in, toward equilibrium. Both entropic. Between black holes and black body radiation.

  10. FWIW, Alexandre Deur is pretty much the only person publishing in astrophysics who has a plausible, published way to explain the effects associated with the cosmological constant and/or dark energy without violating conservation of mass-energy and without introducing any new particles (except possibly a vanilla graviton) and also has an explanation of dark matter phenomena without new particles (except possibly a vanilla graviton).

    I won’t swear that he is indeed, as he claims, simply applying GR in a manner that others have overlooked, but any modification of GR which his approach contains is very subtle indeed and produces great results that do embrace the GR evidence coming before it.

    His work provides a theoretical foundation for MOND in spiral and elliptical galaxies, an explanation for how to overcome MOND’s cluster problems, and an excellent fit to the CMB, in principle with fewer degrees of freedom than GR and in practice with the same number of degrees of freedom as GR and fewer degrees of freedom than MOND.

    My annotated bibliography of his publications (many in peer reviewed journals) is found at: http://dispatchesfromturtleisland.blogspot.com/p/deurs-work-on-gravity-and-related.html

    With vetting of his work that he has mostly not received yet, and support from other more established astrophysicists, his theoretical foundation for MOND-like gravitational behavior arising from the self-interaction of the gravitational field could finally unify the “gravity explains dark matter phenomena” camp and actually produce a major break with the LCDM paradigm for a big part of the astrophysics community.

    1. Not to put too fine a point on it, but a single solid co-authored paper by McGaugh and Deur would probably be enough to reach a tipping point for the theory, and win over a significant number of MOND skeptics in the process. This is the most Tooth Fairy-free solution on offer to the phenomena attributed to the dark sector. It also dovetails nicely with the gravity as QCD squared research program.

  11. How is the MOND acceleration constant a0 related to the gravitational constant G?

    I don’t know how one would really conceive a model that describes complementary 3+1 and 1+3 spacetimes, but it makes philosophical sense to me.

    The idea being there are 3 space dimensions for a point in time, and 3 time dimensions (past, present, future?) for a point in space.

    Could a(0) be a measure or projection of the past, or a curvature in time?

    Maybe the “dark matter” doesn’t interact because it doesn’t “presently” exist?

    1. At the moment, there is no connection. Nobody knows a connection. Both are independent constants.
      For a0, Milgrom already found: a0 ~ cH/6, or a0 ~ cH/(2π)
      H = Hubble constant
      c = speed of light
      But this is only an idea, a conjecture. The challenge is to justify it in a comprehensible way.
      Meanwhile I prefer the view that a0 ~ cH/6 is something new. cH/6 looks like gravity, but has nothing to do with it. One point supporting this: a0 is independent of the mass of the galaxy. We (better: Stacy) measure/observe the same a0 for small (light) and large (heavy) galaxies. If it had to do with gravity and the mass of the galaxy, I would expect a dependence.

      Past, present and future
      The point is not to have an idea that might be right. The point is to derive one’s own idea from good premises…
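
      A quick numerical check of the coincidence, with round values (H ≈ 70 km/s/Mpc, a0 ≈ 1.2e-10 m/s²):

      ```python
      # Check Milgrom's coincidence a0 ~ cH/(2*pi) ~ cH/6 numerically.
      import math

      c = 2.998e8              # speed of light, m/s
      H0 = 70e3 / 3.086e22     # 70 km/s/Mpc converted to 1/s
      a0 = 1.2e-10             # MOND acceleration scale, m/s^2

      print(f"cH0/(2*pi) = {c * H0 / (2 * math.pi):.2e} m/s^2")  # ~1.1e-10
      print(f"cH0/6      = {c * H0 / 6:.2e} m/s^2")              # ~1.1e-10
      print(f"a0         = {a0:.2e} m/s^2")
      ```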

      1. For time, we are mobile organisms, so this sentient interface between body and situation functions as a sequence of perceptions, in order to navigate, so our experience of time is as the present, moving past to future, but the evident reality is that change is turning future to past. Tomorrow becomes yesterday, because the earth turns.
        There is no dimension of time, because the past is consumed by the present, to inform and drive it. Causality and conservation of energy. Cause becomes effect.
        Time is asymmetric, because it is a measure of action and action is inertial. The earth only turns one direction.
        Different clocks can run at different rates simply because they are separate actions. Think metabolism. That culture is about getting everyone synchronized into a larger social organism, using the same languages, rules and measures, it might seem like there should be one universal, Newtonian flow of time, but the fact nature is so integrated and dense, is because everything doesn’t march to the beat of the same drummer. Multicultural, not monocultural.
        That different events are observed in different order from different locations is no more consequential than seeing the moon as it was a moment ago, simultaneous with seeing stars as they were years ago. It is the energy that’s conserved, not the information. That the information changes is time.
        Energy is “conserved,” because it manifests this presence, creating time, temperature, pressure, color and sound. Time is frequency, events are amplitude.
        Ideal gas laws correlate volume with temperature and pressure, but we don’t mistake them for projections of space, even though they are as foundational to our emotions and bodily functions, as sequence is to thought.
        Energy goes past to future, because the patterns it generates form and dissolve, future to past. Energy drives the wave, while the fluctuations rise and fall. No tiny strings necessary.
        Consciousness also goes past to future, while the perceptions, emotions and thoughts giving it form and structure go future to past. Though it’s the digestive, respiratory and circulatory systems processing the energy and feeding the flame, while the central nervous system sorts the information precipitating out. Signals from the noise.
        So there is an intellectual focus on the information over the energy. The left, logical side of the brain is the clock and ruler, seeing the forms generated by the frequencies and amplitudes, while the right, emotional side is the thermostat and barometer, feeling the energy of the waves building and receding.

        1. If few have the capacity to unravel complex emergent phenomena, why start there?
          Why not assume there is something very fundamental about an observer?

          If a great deal of success has been had in one very big camp, why imply that camp got it wrong? The beauty of complementarity is that you don’t have to describe all observations with one model.

          I am just as curious as anyone else about how to understand time. Curious if spacetime could be described in such a way that the space of cosmology becomes inseparable from the past, while the space of quantum mechanics becomes inseparable from the future.

          Maybe such a description would allow spacetime to be isotropic for an observer, but not likely homogeneous.

          1. I don’t know that they got it wrong, so much as it has been projected beyond where it is applicable. Maps/models necessarily contain distortions, because they have to edit. Signals from the noise.
            For one thing, no map could encompass the entire territory, or it would revert back to noise.
            But also the noise, the territory is foundational, not the map/signal(unless we are god, but that’s another issue. Ideals are not absolutes.) Perspective is subjective.
            So this narrative flow, from past to future, is foundational to both human perception and the cultures arising from it. Narrative is the basis of society, aka, history.
            So I wouldn’t say the construct of time as a dimension is wrong, but is there a physical basis for it? Will we, with the right application of mathematical faerie dust, be able to time travel through wormholes, around the “fabric of spacetime?”
            Or should we think of time as a measure of an effect, like temperature? Which is also all too real, to our experience of reality. Hot stoves will burn.
            If we delved further into the thermodynamics, it might help us explain and better understand the circular, cyclical, reciprocal and feedback aspects of time, that play out across those four dimensions. Often what we measure are cycles, because they are so regular.

      2. “One point supporting this: a0 is independent of the mass of the galaxy. We (better Stacy) measure/observe the same a0 for small (light) and large (heavy) galaxies. If it had to do with gravity and mass of the galaxy, I would expect a dependence.”

        That is an interesting point. Why would you expect a dependence?

        1. Let us assume for the moment that a0 has something to do with gravity and depends on the mass of the galaxy. We now measure the same (constant) a0 for small (light) galaxies (~1e5 stars) and for large (heavy) galaxies (~1e10 stars). If a0 depended linearly on the mass, for example, there would have to be a second cause which exactly counteracts this, so that we get a constant again. This is possible, but makes the matter/explanation more complicated.
          That’s why I think it has nothing to do with gravity.

          At the moment, I prefer to look for something that is the same for all galaxies. Then a constant a0 results automatically.

          1. Does MOND have something to do with mass?! Yes and no.
            In the deep-MOND regime we have
            a(MOND) = sqrt((GM/R^2) · a0)
            Let’s assume for the moment this formula contains a physical reality. Then:
            The first factor under the square root is the normal Newtonian gravity. The second factor is the MOND acceleration constant.
            a0 does not depend on the mass. In this sense MOND has nothing to do with mass. This is the “no.”
            But:
            If there is no mass, thus no galaxy, then also a(MOND) = 0.
            Mass is therefore necessary for a(MOND) > 0. This is the “yes.”
            At the moment I prefer a separation between (GM/R^2) and a0, because then I have an idea for the square root.
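
            What the formula implies can be seen in a few lines (standard constants; the galaxy mass is just an illustrative choice):

            ```python
            # Deep-MOND regime: a = sqrt(G*M*a0)/R. Setting v^2/R = a gives
            # v^4 = G*M*a0 -- a flat rotation curve, independent of R.
            G = 6.674e-11          # m^3 kg^-1 s^-2
            a0 = 1.2e-10           # m/s^2
            M = 1e11 * 1.989e30    # illustrative baryonic mass: 1e11 Msun, in kg

            v = (G * M * a0) ** 0.25
            print(f"flat rotation speed: {v / 1e3:.0f} km/s")  # ~200 km/s
            ```

            The v^4 = G·M·a0 scaling is just the baryonic Tully-Fisher relation.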

            1. What can we say of mass, other than it has weight and density?
              What can we say of gravity, other than it’s a centripetal effect associated with mass?
              Wouldn’t it be more logical to assume the qualities defining mass would be an effect of this centripetal dynamic?
              Look at galaxies; It’s this structural swirl inwards, from the barest bending of the light, to the vortices at the center. In which mass is an intermediate property, that we, as tactile, object oriented, tool using creatures, tend to assume is primary.
              Is it possible our anthropocentric perspective is biased?
              Then we don’t need Dark Matter and have to figure out what drives the bending inward.
              I tend toward something simple, like wave action tending to synchronize as a path of least resistance. Essentially entropy.
              So structure coalesces, in, as energy radiates out.

            2. You mean like we keep verifying the expanding universe model by adding enormous patches to it, whenever observations don’t match predictions?
              The original sun centric models were less effective than the highly evolved geocentric models, because they tried using perfect circles, but the basic premise evolved.
              So it seems to me the question is whether all our axioms and their underlying assumptions are valid, or whether we need to go back and review.
              People do tend to get caught up in herd behaviors and fall in lockstep, synchronizing into a larger organism in many ways, from religion, to politics, to economics. Requiring a serious reality check on occasion. Just because the sciences claim to be pretty objective doesn’t mean this basic impulse doesn’t occur.
              Eventually some generation of theorists is going to rebel against spending their careers chasing increasingly post-empirical ideas.

              1. Exactly.

                  And at the root of it is the quasi-religious belief in naive reductionism, something already disproved in formal mathematics and by the many results from condensed matter physics.

                  Higher complexity structures with lower energy densities are “decoupled” from the properties/behavior of their elementary components at higher energy densities.

                Chaitin’s heuristic principle is telling:

                  The predictive/explanatory power of any theory is bounded by the theory’s own complexity. In other words: anything more complex than the theory’s complexity is irreducible/unpredictable from it, or equivalently: complexity is a source of irreducibility/unpredictability.

  12. An oversimplification no doubt, but recall that “the worst prediction in all of physics” involves the assumption that there is a present in which space at cosmological scales is composed of space at quantum scales.

    The result was that we have no idea what dark energy is, and that based on quantum mechanics we are probably living in a hologram. Which is fine, until you get slapped in the face. Then you have to explain why this hologram hurts so much!

  13. jeremyjro1,

    “The predictive/explanatory power of any theory is bounded by the theory’s own complexity. In other words: anything more complex than the theory’s complexity is irreducible/unpredictable from it, or equivalently: complexity is a source of irreducibility/unpredictability.”

    Much that is complex is beyond modeling, because that would mean reverse engineering it and you run up against the wall of entropy. You can’t rewind time.
    Yet that also involves cycles of expansion/consolidation, increasing complexity becoming unstable and resetting to more stable bounds. “Punctuated equilibrium.”

    I think the real root isn’t so much reductionism, because there is no alternative. We are human.
    One big problem is assuming order as an end in itself, rather than a useful tool for building knowledge. The consequence is the map starts taking precedence over the territory. Which is where reductionism becomes a problem.
    Given all models are edits of the territory, the resulting close mindedness gets caught up in feedback loops, spiraling into those not entirely quasi religious beliefs/systems, where all information can only be processed in terms supporting the system.

    Adding to the dynamic is the process of building a knowledge base rests on one generation passing what it’s learned to the next generation, with those most successful being those attentive to what is taught, rather than being able to see it in some larger context. So it becomes another feedback loop, where core assumptions get built into the foundations, that would obviously be outdated if viewed in isolation, but are not open to analysis.

    1. Reductionism only works in a limited energy and complexity range; naive reductionism is to assume the existence of a “theory of everything”, or pretending that general relativity can describe the Universe, or pretending to have a Universe “wave function”.

      The predictive/explanatory power of any theory is always intrinsically limited; ignoring that is naive reductionism.

      1. Exactly.
        It is a map and if the map tried to incorporate all information in the territory, it would revert back to noise.
        There are much deeper conceptual and psychological issues into why we project out beyond the functionality of our conceptual objectives.
        For one thing, we mistake ideals for absolutes.
        For example, a spiritual absolute would be the essence of sentience, from which we rise, not an ideal of wisdom and judgement, from which we fell. The light shining through the film, rather than the images on it. When you think how that plays out, across the evolution of Western culture, the particular tangles of physical theory seem minor.
        Even math has come to see itself as absolute, rather than ideal.
