I had written most of the post below the line before an exchange with a senior colleague who accused me of asking us to abandon General Relativity (GR). Anyone who read the last post knows that this is the opposite of true. So how does this happen?

Much of the field is mired in bad ideas that seemed like good ideas in the 1980s. There has been some progress, but I recognize the idea that MOND is an abandonment of GR as a misconception from that time. It arose because the initial MOND hypothesis suggested modifying the law of inertia without showing a clear path to how this might be consistent with GR. GR was built on the Equivalence Principle (EP), the equivalence¹ of gravitational charge with inertial mass. The original MOND hypothesis directly contradicted that, so it was a fair concern in 1983. It was not by 1984². I was still an undergraduate then, so I don’t know the sociology, but I get the impression that most of the community wrote MOND off at this point and never gave it further thought.

I guess this is why I still encounter people with this attitude, that someone is trying to rob them of GR. It feels like we’re always starting at square one, like there has been zero progress in forty years. I hope it isn’t that bad, but I admit my patience is wearing thin.

I’m trying to help you. Don’t waste your entire career chasing phantoms.

What MOND does ask us to abandon is the Strong Equivalence Principle. Not the Weak EP, nor even the Einstein EP. Just the Strong EP. That’s a much more limited ask than abandoning all of GR. Indeed, all flavors of EP are subject to experimental test. The Weak EP has been repeatedly validated, and there is nothing about MOND that implies platinum would fall differently from titanium. Experimental tests of the Strong EP are less favorable.

I understand that MOND seems impossible. It also keeps having its predictions come true. This combination is what makes it important. The history of science is chock full of ideas that were initially rejected as impossible or absurd, going all the way back to heliocentrism. The greater the cognitive dissonance, the more important the result.


Continuing the previous discussion of UT, where do we go from here? If we accept that maybe we have all these problems in cosmology because we’re piling on auxiliary hypotheses to continue to be able to approximate UT with FLRW, what now?

I don’t know.

It’s hard to accept that we don’t understand something we thought we understood. Scientists hate revisiting issues that seem settled. Feels like a waste of time. It also feels like a waste of time continuing to add epicycles to a zombie theory, be it LCDM or MOND or the phoenix universe or tired light or whatever fantasy reality you favor. So, painful as it may be, one has to find a little humility to step back and take account of what we know empirically independent of the interpretive veneer of theory.

As I’ve said before, I think we do know that the universe is expanding and passed through an early hot phase that bequeathed us the primordial abundances of the light elements (BBN) and the relic radiation field that we observe as the cosmic microwave background (CMB). There’s a lot more to it than that, and I’m not going to attempt to recite it all here.

Still, to give one pertinent example, BBN only works if the expansion rate is as expected during the epoch of radiation domination. So whatever is going on has to converge to that early on. This is hardly surprising for UT since it was stipulated to contain GR in the relevant limit, but we don’t actually know how it does so until we work out what UT is – a tall order that we can’t expect to accomplish overnight, or even over the course of many decades without a critical mass of scientists thinking about it (and not being vilified by other scientists for doing so).
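To be concrete about "as expected" (this is just the standard textbook result, nothing specific to this discussion): during radiation domination the expansion rate follows

\[
H^{2} = \left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho_{r}, \qquad \rho_{r} \propto T^{4},
\]

so H ∝ T². The light element abundances are sensitive to this rate (for example, through the neutron-to-proton freeze-out that sets the helium yield), which is why whatever UT turns out to be must reproduce this behavior at early times.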

Another example is that the cosmological principle – that the universe is homogeneous and isotropic – is observed to be true in the CMB. The temperature is the same all over the sky to one part in 100,000. That’s isotropy. The temperature is tightly coupled to the density, so if the temperature is the same everywhere, so is the density. That’s homogeneity. So both of the assumptions made by the cosmological principle are corroborated by observations of the CMB.

The cosmological principle is extremely useful for solving the equations of GR as applied to the whole universe. If the universe has a uniform density on average, then the solution is straightforward (though it is rather tedious to work through to the Friedmann equation). If the universe is not homogeneous and isotropic, then it becomes a nightmare to solve the equations. One needs to know where everything was for all of time.
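For reference – quoting the standard result rather than deriving it – the equation in question is the Friedmann equation,

\[
\left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3},
\]

where a(t) is the scale factor, ρ the average mass density, k the curvature constant, and Λ the cosmological constant. The entire expansion history collapses to this single ordinary differential equation precisely because homogeneity and isotropy are assumed.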

Starting from the uniform condition of the CMB, it is straightforward to show that the assumption of homogeneity and isotropy should persist on large scales up to the present day. “Small” things like galaxies go nonlinear and collapse, but huge volumes containing billions of galaxies should remain in the linear regime and these small-scale variations average out. One cubic Gigaparsec will have the same average density as the next, and the next, so the cosmological principle continues to hold today.
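The "straightforward to show" step is the standard linear perturbation result of GR, sketched here for a matter-dominated background: the density contrast δ = δρ/ρ obeys

\[
\ddot{\delta} + 2H\dot{\delta} = 4\pi G\,\bar{\rho}\,\delta,
\]

whose growing mode scales as δ ∝ a(t). A fluctuation of one part in 100,000 at z ≈ 1000 therefore grows by only a factor of order a thousand by today, so Gigaparsec-scale volumes remain very nearly uniform in this picture.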

Anyone spot the rub? I said homogeneity and isotropy should persist. This statement assumes GR. Perhaps it doesn’t hold in UT?

This aspect of cosmology is so deeply embedded in everything that we do in the field that it was only recently that I realized it might not hold absolutely – and I’ve been actively contemplating such a possibility for a long time. Shouldn’t have taken me so long. Felten (1984) realized right away that a MONDian universe would depart from isotropy by late times. I read that paper long ago but didn’t grasp the significance of that statement. I did absorb that in the absence of a cosmological constant (which no one believed in at the time), the universe would inevitably recollapse, regardless of what the density was. This seems like an elegant solution to the flatness/coincidence problem that obsessed cosmologists at the time. There is no special value of the mass density that provides an over/under line demarcating eternal expansion from eventual recollapse, so there is no coincidence problem. All naive MOND cosmologies share the same ultimate fate, so it doesn’t matter what we observe for the mass density.

MOND departs from isotropy for the same reason it forms structure fast: it is inherently non-linear. As well as predicting that big galaxies would form by z=10, Sanders (1998) correctly anticipated the size of the largest structures collapsing today (things like the local supercluster Laniakea) and the scale of homogeneity (a few hundred Mpc if there is a cosmological constant). Pretty much everyone who looked into it came to similar conclusions.

But MOND and cosmology, as we know it in the absence of UT, are incompatible. Where LCDM encompasses both cosmology and the dynamics of bound systems (dark matter halos³), MOND addresses the dynamics of low acceleration systems (the most common examples being individual galaxies) but says nothing about cosmology. So how do we proceed?

For starters, we have to admit our ignorance. From there, one has to assume some expanding background – that much is well established – and ask what happens to particles responding to a MONDian force-law in this background, starting from the very nearly uniform initial condition indicated by the CMB. From that simple starting point, it turns out one can get a long way without knowing the details of the cosmic expansion history or the metric that so obsess cosmologists. These are interesting things, to be sure, but they are aspects of UT we don’t know and can manage without to some finite extent.

For one, the thermal history of the universe is pretty much the same with or without dark matter, with or without a cosmological constant. Without dark matter, structure can’t get going until after thermal decoupling (when the matter is free to diverge thermally from the temperature of the background radiation). After that happens, around z = 200, the baryons suddenly find themselves in the low acceleration regime, newly free to respond to the nonlinear force of MOND, and structure starts forming fast, with the consequences previously elaborated.
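To make the "low acceleration regime" point a little more concrete, here is a minimal numerical sketch (mine, not taken from any of the papers cited above) of the deep-MOND limit, in which the effective acceleration is roughly the geometric mean of the Newtonian acceleration and Milgrom’s constant a0:

```python
# A minimal sketch of the deep-MOND boost (an illustration, not a structure
# formation calculation): in the limit g_N << a0 the effective acceleration
# is approximately sqrt(g_N * a0), so the weaker the Newtonian field, the
# larger the boost over the Newtonian expectation.
import numpy as np

a0 = 1.2e-10  # m/s^2, Milgrom's acceleration constant (canonical value)

def deep_mond_acceleration(g_newton):
    """Effective acceleration in the deep-MOND limit (valid for g_newton << a0)."""
    return np.sqrt(g_newton * a0)

for g_n in [1e-14, 1e-12, 1e-10]:
    g_eff = deep_mond_acceleration(g_n)
    print(f"g_N = {g_n:.0e} m/s^2 -> g_MOND ~ {g_eff:.1e} m/s^2 "
          f"(boost ~ {g_eff / g_n:.0f}x)")
```

That nonlinear boost is the qualitative reason structure formation runs ahead of the purely Newtonian (GR plus baryons only) expectation once the baryons decouple and drop below a0.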

But what about the expansion history? The geometry? The big questions of cosmology?

Again, I don’t know. MOND is a dynamical theory that extends Newton. It doesn’t address these questions. Hence the need for UT.

I’ve encountered people who refuse to acknowledge⁴ that MOND gets predictions like z=10 galaxies right without a proper theory for cosmology. That attitude puts the cart before the horse. One doesn’t look for UT unless well motivated. That one is able to correctly predict 25 years in advance something that comes as a huge surprise to cosmologists today is the motivation. Indeed, the degree of surprise and the longevity of the prediction amplify the motivation: if this doesn’t get your attention, what possibly could?

There is no guarantee that our first attempt at UT (or our second or third or fourth) will work out. It is possible that in the search for UT, one comes up with a theory that fails to do what was successfully predicted by the more primitive theory. That just lets you know you’ve taken a wrong turn. It does not mean that a correct UT doesn’t exist, or that the initial prediction was some impossible fluke.

One candidate theory for UT is bimetric MOND. This appears to justify the assumptions made by Sanders’s early work, and provide a basis for a relativistic theory that leads to rapid structure formation. Whether it can also fit the acoustic power spectrum of the CMB as well as LCDM and AeST do has yet to be seen. These things take time and effort. What they really need is a critical mass of people working on the problem – a community that enjoys the support of other scientists and funding institutions like NSF. Until we have that⁵, progress will remain grudgingly slow.


¹The equivalence of gravitational charge and inertial mass means that the m in F = GMm/d² is identically the same as the m in F = ma. Modified gravity changes the former; modified inertia the latter.
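In symbols, just restating the footnote: setting the two expressions for the force equal,

\[
m_{i}\, a = \frac{G M m_{g}}{d^{2}} \;\Rightarrow\; a = \frac{G M}{d^{2}} \quad \text{when } m_{i} = m_{g},
\]

so the acceleration of a test body does not depend on its own mass. That universality of free fall is the Weak EP, and it survives untouched in either flavor of modification.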

²Bekenstein & Milgrom (1984) showed how a modification of Newtonian gravity could avoid the non-conservation issues suffered by the original hypothesis of modified inertia. They also outlined a path towards a generally covariant theory that Bekenstein pursued for the rest of his life. That he never managed to obtain a completely satisfactory version is often cited as evidence that it can’t be done, since he was widely acknowledged as one of the smartest people in the field. One wonders why he persisted if, as these detractors would have us believe, the smart thing to do was not even try.

³The data for galaxies do not look like the dark matter halos predicted by LCDM.

⁴I have entirely lost patience with this attitude. If a phenomenon is correctly predicted in advance in the literature, we are obliged as scientists to take it seriously⁺. Pretending that it is not meaningful in the absence of UT is just an avoidance strategy: an excuse to ignore inconvenient facts.

⁺I’ve heard eminent scientists describe MOND’s predictive ability as "magic." This also seems like an avoidance strategy. I, for one, do not believe in magic. That it works as well as it does – that it works at all – must be telling us something about the natural world, not the supernatural.

⁵There does exist a large and active community of astroparticle physicists trying to come up with theories for what the dark matter could be. That’s good: that’s what needs to happen, and we should exhaust all possibilities. We should do the same for new dynamical theories.

98 thoughts on “Take it where?”

  1. Adopting GR was abandoning Newton’s gravitational theory? Obviously yes and no; it simply provided a more precise description of reality, at least at the solar system level.

    So far GR has failed to give a precise description where MOND provides predictive power, unless an ad hoc, non-observable dark matter is introduced.

    Empirical evidence reigns supreme in any objective approach to reality, and as Stacy McGaugh has said before, predictive power is the gold standard.

    But it seems that the allure of a "general" theory of relativity, with strong shadows of a "theory of everything", is hard to dispel given humans’ strong tendency toward religious/cultist mindsets.

    Any theory always has a limited explanatory/predictive power, and the history of science is there to show it.


  2. Stacy:
    I wrote a response on your last post “Imagine if you can” about some explanations about why dark matter folks can seem so much like religious fundamentalists. Not exclusive of anything that’s been said, but maybe to explain the depth of their fundamentalism. If you are interested you can find it by searching on “Isn’t that exactly what we might expect?”

    But my real interest is in asking a question about obvious questions that just seem to be totally ignored (search on "I have been thinking about dark energy a lot lately"). There seem to be huge questions that are just ignored, or the answer is just to apply a magical patch.

    Three examples:
    The Ethan Siegel article I link, in answering where lost redshift energy goes, seems to conclude “redshift energy does work”. But wait – what work? Essentially he has explained nothing.

    The new theory that black holes are a source of dark energy. Vacuum energy black holes seem to be their answer, and even though we can’t explain how black holes get as big as they are, somehow they are also creating a source of energy which then (magically multiplies?) and becomes dark energy through black holes. Now as a layperson I can’t really say I understand their paper, but Sabine Hossenfelder seems to agree with my conclusion (my interpretation of what she is saying in her video). But that leads to number three.

    Dr Hossenfelder seems to dismiss their arguments because “dark energy is just negative pressure energy”. Ok, but that doesn’t explain the source of dark energy (from what I read about negative pressure energy).

    This all seems to me like "I’ve got a theory that I want to get out there, but please ignore the fact that I don’t address the fundamental problem, which is explaining where approximately 75% of the energy in the entire universe is coming from." Am I missing something, or are folks that address dark energy just floating around in a void not addressing the fundamental problem, somewhat like dark matter folks don’t look at, let alone address, the fundamental issue of why MOND works perfectly when applied to galaxy rotation curves?


    1. Dark energy has an effective repulsion (unlike the otherwise strict attraction of gravity) because its equation of state (relating pressure P and density ρ through P = wρ) has a negative sign: w = -1. That’s the rub. Normal stuff has a positive (or zero) value of w. That makes intuitive sense: more stuff can exert more pressure. So what does it mean to have w < 0? That does not make intuitive sense, but it is what gives dark energy the power to drive cosmic acceleration. Where you’d think it would take more energy to drive the accelerated expansion, you are in effect getting this for free from the negative sign. The expansion has to speed up to conserve energy because we’ve inserted this unnatural sign.
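      To put a textbook equation behind that sign (this is just standard FLRW, nothing exotic): the acceleration equation reads

      \[
      \frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + 3P\right),
      \]

      and with P = wρ and w = -1 the term in parentheses becomes -2ρ. The sign flips, and the same density that would otherwise decelerate the expansion drives acceleration instead.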

      You might consider that to be unphysical. You wouldn't be alone. But conventional cosmology needs it, so people accept that as evidence that it happens thusly without thinking too hard about it. I wouldn't say no one does – there was a lot of hand-wringing about it around the turn of the century, but since then we've mostly subsumed it into the set of known things that have become familiar so they seem normal even though they're not.


      1. Wow! Infinite energy! Has anyone told engineers that Perpetual Motion Machines are now possible?

        It seems to lead to what I would call "dismissive science". Dr. Hossenfelder in her talk seems to say that we should just accept dark energy as a constant, and give it an innocuous name, so no one will think about it much – but won’t that discourage scientists (especially students who are being taught that) from pursuing any investigation into dark energy at all? How likely is it then that someone will "come up with a perfectly mundane explanation"? Magic answers should be identified for what they are, and kept in mind as such, so as not to discourage further investigation (even if it produces "black holes produce dark energy").

        Btw, am I right in assuming that dark energy = negative pressure energy can neither be proven nor disproven?

        In this Sean Carroll article, "Energy is Not Conserved", he says:

        “In the case of dark energy, that evolution is pretty simple: the density of vacuum energy in empty space is absolute constant, even as the volume of a region of space (comoving along with galaxies and other particles) grows as the universe expands. So the total energy, density times volume, goes up.

        This bothers some people, but it’s nothing newfangled that has been pushed in our face by the idea of dark energy. It’s just as true for “radiation” — particles like photons that move at or near the speed of light. The thing about photons is that they redshift, losing energy as space expands. If we keep track of a certain fixed number of photons, the number stays constant while the energy per photon decreases, so the total energy decreases. A decrease in energy is just as much a “violation of energy conservation” as an increase in energy, but it doesn’t seem to bother people as much. At the end of the day it doesn’t matter how bothersome it is, of course — it’s a crystal-clear prediction of general relativity.”

        From trying to figure out what GR says about energy conservation, I get the impression that it does not say anything specific – more that it does not say that energy has to be conserved. But to go where Sean Carroll has gone, to just accept massive losses and gains in energy, leads to not even bothering to try to connect those two things, or even to show why a connection should be dismissed.

        There seems to be at least some disagreement about whether energy should be conserved (or that black hole-dark energy paper wouldn’t have been researched and published), but Dr Carroll’s attitude seems to discourage any thinking about energy gain or loss, even if on a massive scale. "Dismissive science".

        A question: we see redshift, know it to be a loss of energy, and seem to assume redshift causes a loss of energy. Can’t it (or shouldn’t it) be the other way around? Does light really get "stretched" (seems weird to me) and lose energy, or does it lose energy, which by definition increases its wavelength? Wavelength and energy content are obviously tightly coupled, but could we be looking at the relationship backward? If we are, or even if it doesn’t matter because of the tight coupling, it would seem to have the potential to affect the way we look at things.

        Another question arose as I was trying to get my head around these concepts: the current consensus is that dark energy = the expansion of space, space is a specific and uniform energy density, etc., but I’ve never seen it said that "space" (in the sense of what is expanding through dark energy), i.e., "the framework of the universe", is obviously pure energy (pure = no matter involved). I am curious what I am missing here, or why it is never mentioned (at least in lay literature).

        And can I rant about the word “space”? We use it in the sense of indoor space, outdoor space, recreation space, the space between the stars, and, as you are probably thinking by now, the space between my ears. We need another word for that which is expanding out there through dark energy.


        1. I just read some of Stacy’s comments below (my notifications for this topic don’t seem to be working), and he is saying exactly what I was trying to say about “dismissive science”. Scientists so sure about certain topics they leave absolutely no room for questioning, or for considering alternatives. The dark energy mindset does not seem to be limited only to dark energy researchers/proponents.


        2. Not by chance has "post-empirical science" become a thing (an oxymoron, obviously). But that is the price of accepting non-observable elements introduced to keep the current paradigm free of obvious contradictions.

          If GR appears to fail at galaxy-level complexity, then it seems a stretch to try to use it to get any "explanation" beyond that level of complexity.

          But naive Reductionism is de facto a sacred cow in mainstream scientific practice, hardwired even in dissidents’ mindsets.


          1. It does seem to be a pretty basic feedback loop. I think "drinking their own bathwater" is the technical term.
            The irony is the degree to which the very process of education is a big driver, where the conceptual sacred cows become so built into the edifice that, by the time anyone becomes even minimally informed, they’ve already passed the point of no return.
            It really does seem the only solution will be the point of reductio ad absurdum. Where it has become such a parody, that it simply falls apart. Though to us on the outside, it would seem that point was reached decades ago.


        3. Negative energy states are not unknown in the history of Quantum Mechanics. For example, Dirac’s theory of the electron required negative energy as well as positive energy states, so he hypothesised that all of them were filled with electrons (otherwise real electrons would fall into them). His theory predicted the positron, which he would have thought of as a bubble in the sea of negative energy electrons. Although later formulations of Quantum Mechanics avoided the need for the negative energy sea, the same approach works well for electrons and holes in semiconductors.


  3. I know scientists are only human, but it’s always seemed to me a failure to think clearly. We know that both QM and GR have to be not quite the full answer. It seems so obvious that we need what you called a UT in the last post. Some do get it, but I am astonished and dismayed at how many don’t.

    I recently read Lee Smolin’s “Three Roads to Quantum Gravity”: [1] Bring GR into QM; [2] Bring QM into GR; [3] Find a new theory that includes them both as approximations. Seems like the only quibble should be over the best road.

    Regardless, QM and GR clearly are approximations. Why is that so hard for some to understand?


    1. I’ve often wondered the same thing. I think it has, in part, to do with scientific communities. The people working on cosmology are largely distinct from those working on quantum gravity. Altering GR is anathema to the former but the starting place for the latter. They mostly ignore each other in part because quantum effects are thought to only matter at tiny scales, so shouldn’t matter to the large scales of cosmology.
      But I share your bewilderment. It never seems to occur to them that 1/(small number) is a big number, or that the problems each subfield faces might be related.
      Even within cosmology, I am bewildered that there exists a subsubfield of people who consider modified gravity theories to explain dark energy, but accept the need for dark matter without question, much less questioning that they might be related. Not everyone takes this attitude of course, but it is common. It is also common for people who do this to have come of age when dark matter was already received wisdom but dark energy was new and exciting.


      1. Though if redshift is an optical effect, that it could be compounding on itself would be the most efficient explanation for the parabolic curve in the rate. No Dark Energy required.


        1. Dark energy is just another non-observable, ad hoc element introduced to keep the current paradigm free of obvious contradictions.

          But usually these ad hoc elements only cause new contradictions to appear everywhere, as dark matter has.


          1. It would be understandable if it was politics, religion, ideology, partisanship, football, whatever, but when it’s science, it just means the model can’t be falsified.
            One would think there would be a great hue and cry, but it just reflects on the fact that logic cannot compare to faith, when it comes to being part of the crowd.


  4. General relativity isn’t even that general; there are even more general theories of gravity which include a spin tensor in addition to the usual energy-momentum tensor of general relativity. Anthony Lasenby and Chris Doran developed such a theory, called gauge theory gravity, from gauge theory first principles; it reduces to general relativity when the spin tensor goes to zero. Their theory also takes place entirely in flat spacetime, which means that they dispense entirely with the metaphysical assumption of curved spacetime even in the usual realm of general relativity.

    https://en.wikipedia.org/wiki/Gauge_theory_gravity

    If you are right about general relativity with a cosmological constant being the only theory where the strong equivalence principle holds, then their inclusion of a spin tensor for torsion in Lasenby and Doran’s gauge theory gravity means that the strong equivalence principle fails in their theory.


    1. An AGI (artificial general intelligence) with almost unlimited memory and computing power will continuously be collecting raw sensory data to "predict" outcomes; no theory will be needed in the usual sense.

      Given that huge amount of data in memory, for any new set of sensory data collected there will be similar, or close, subsets of stored data that can be used to build a statistical "prediction" in almost real time.

      But nothing will replace the continuous collection of raw sensory data (empirical evidence).

      Knowing (empirical evidence) takes precedence over "understanding" (theory, speculation).


      1. Yes, well, I will be interested to see how AI develops, and I would love if it could teach us things we don’t know in the ways you describe. But it isn’t enough to collect raw sensory data: it has to be sorted, analyzed, and *weighted*. Much of the issue with the debate about dark matter and MOND is how to weigh the evidence. Should we weigh the bullet cluster more? Or the systematics of galaxy dynamics? Humans are terrible at this: we suffer from cognitive bias, and are always inclined to give more weight to the raw data that conforms to our pre-established belief system, and ignore that which doesn’t. (I am at least aware of this effect in myself; many of my colleagues manifestly are not.) The AI development I’ve seen so far is, if anything, *even worse* at this than humans because (1) we are the training set for making such judgements and (2) algorithms lack the sentient ability to self-question. There’s no more important question in science than “maybe I’m wrong?”


    2. Thanks. I’m not sure you even need the SEP in GR, with or without Lambda. The Einstein EP, yes, but it is not obvious to me that the SEP is a requirement even though it was the initial starting point.


      1. Why do you say the EEP is necessary in GR? It certainly wasn’t an explicit part of Einstein’s derivation of GR the way the WEP was.


  5. “Another example is that the cosmological principle – that the universe is homogeneous and isotropic – is observed to be true in the CMB. The temperature is the same all over the sky to one part in 100,000. That’s isotropy. The temperature is tightly coupled to the density, so if the temperature is the same everywhere, so is the density. That’s homogeneity. So both of the assumptions made by the cosmological principle are corroborated by observations of the CMB.”

    The CMB dates back billions of years, to about 300,000 years after the big bang. So, while there is evidence that the universe was approximately homogeneous and isotropic at the time of the CMB, there is really no evidence that the universe has remained homogeneous and isotropic in the time between the CMB and now. In fact, the existence of astronomical structures such as the KBC void indicates that the universe at recent times is less homogeneous and isotropic than the universe at the time of the CMB.


    1. I also point you to section VIII.F of the Snowmass 2021 paper, where they indicate various dipoles in both the CMB and in various other objects, such as in the Hubble constant itself, in quasars, and in Type Ia supernovae.

      https://arxiv.org/abs/2203.06142

      Taken together, this indicates that neither the CMB nor the late-time universe is as isotropic as cosmologists have traditionally thought.


    2. For the CMB dipole in particular, cosmologists have traditionally assumed the dipole to be kinematic in nature, but there is virtually no evidence to support that assumption. In section VIII.F.7, the Snowmass 2021 paper brings up the issue of whether the dipole is kinematic or intrinsic, and says that "given current CMB data, it is difficult to rule out an intrinsic component to the CMB dipole."

      In addition, earlier in that subsection, they say “Given that the magnitude of the matter dipoles are currently discrepant with the magnitude of the CMB dipole, systematics aside, this implies that the Universe is not FLRW.” so even if the CMB dipole is found to be kinematic and consistent with FLRW at the CMB times, the resulting tension with the other dipoles at late times would imply a breakdown of the FLRW at late times.


    3. The above evidence is all independent of whether the strong equivalence principle holds or not in the universe, so even if general relativity still holds, based on current evidence cosmologists would still have to give up the assumption of FLRW and possibly do the very difficult work of keeping track of everything at all times in their cosmological models.


    4. “The CMB dates back billions of years to about 300,000 years after the big bang.”

      This obviously is not a fact but something totally outside of any possible confirmation by direct observation; it assumes the validity of a model of the universe.

      The only observable fact is that the portion of the universe that we can observe is not homogeneous; anything derived by assuming the universe’s homogeneity is contradicted by empirical data.


      1. Also, regarding the CMB, there is a question whether the CMB actually represents something primordial in the universe. Stacy McGaugh certainly assumes so. On the other hand, in 2018, the cosmologist Vaclav Vavrycuk proposed a model in which the CMB is produced by electromagnetic radiation from intergalactic dust heated by surrounding galaxies:

        https://ui.adsabs.harvard.edu/abs/2018MNRAS.478..283V/abstract

        Pavel Kroupa brought this up in the following article where he states that the FLRW Lambda CDM model is falsified at 7 sigma:

        https://darkmattercrisis.wordpress.com/2020/11/10/the-crisis-in-cosmology-is-now-catastrophic/


        1. Okay, on the bright side: it’s very good to know that cosmic rays can also change abundance percentages of light elements. However, I cannot see any reason whatsoever in his explanation of the CMB. The explanation given by Vavrycuk is way better IMO but that also needs yet to fit the power spectrum. But for both researchers, the idea that the universe had no beginning is not something I like. I’d prefer that stars and planets were made one by one or galaxies one by one, without a hot dense gas phase for the entire universe up front.


            1. It occurred to me some time ago that the carbon dating of moon rocks could be misleading, since they’re subject to cosmic ray bombardments, including protons with enough velocity to trigger fission and thus speed up the carbon decay.


    5. Yes, exactly. But people place great faith that it does, and the back of the envelope (GR) calculation says it should. So it’ll take a whole lot of KBC voids to shake faith in the cosmological principle. We’ve already had many – the CfA stickman, the Sloan Great Wall, there is a similar structure at absurdly high redshift – people go "huh! that’s weird!" without letting it threaten their cognition. These discoveries happen far enough apart that each unexpected observation is subsumed into the norm, so the next is not seen as piling on but rather just some odd one-off.
      The CfA stickman was discovered in the CfA redshift survey in 1987. Surprised the heck out of us. A mere six years later, I mentioned that reaction in a discussion group at Cambridge. (This was before I had any doubts about CDM myself.) The grad students expressed great surprise at my statement, as they had been taught that structure formation explains everything so well. They actively disbelieved my report of a lived experience. (I continue to get that a lot.) So they appealed to the elder authority in the room, Simon White, who confirmed my account.
      You could see the younger folk struggling with the internal conflict this caused. Like the chap who got turned into a newt, they got better. That is, they subsumed it into their base of knowledge without ultimately letting it threaten their belief system.


  6. One could think of the origin as something like the Big Bang-Stop. A place of symmetry for dynamic and static models of our universe, and where the closer we look to the horizon, the slower those clocks appear to be ticking.

    If the equivalence principle holds, then it doesn’t matter if you put a black hole mass beyond the horizon, or have our observable universe expanding past it.

    Yet whether dynamic or static, no single model can fully explain all observable phenomena.


  7. “Bekenstein pursued for the rest of his life” If he’d lived a life of ordinary length rather than dying prematurely, he might have. It is always a tragedy when younger scholars like him die before mentors like Milgrom from an earlier generation.


  8. If the universe is infinite and redshift is an optical effect, then the universe would be bathed in this uniform radiation, from infinite sources, with some faint patterns of the sources just over the horizon of the visible spectrum. CMBR.

    I still don’t get how mature galaxies could have formed in a few hundred million years. The implication being that mature galaxies have, at the very least, second generation stars.


  9. I am just stating the obvious, but a mature galaxy could form in a few hundred million years if those observed “years” passed more slowly than local years. That is to say, I think we may need to show a suitable skewing of the time metric we are observing in order to allow for the same galaxy growth there as here.


    1. You do realize that makes no sense whatsoever?
        Time is not some underlying fabric. It is a measure of the rate of activity, aka frequency, so the only way for it to slow would be for the activity to slow.
        Now if you were to try to argue that time was much faster, that would make a little more sense, as the need is to explain how some billions of years of activity can occur in what is now a few hundred million years. Yet evidence of that expanded time frame would have to speed everything else up, and so we would need a universe that is some billions of years older, as required for galaxies to mature.
      My prediction is the next patch will be the edge of the universe is mirrored. I think Stacy described it as Z going to infinity, as the clock goes to zero.


      1. It’s the metric which we use that is in question, not the actual rate of events in their local frame. We think there should not have been enough time for the structure to develop, therefore our measure of time is too short. We would need to adjust our rulers (years) accordingly, however this doesn’t change the actual rate of events in the early universe. I guess this would not be obvious, but it doesn’t seem that complicated.


        1. Remember our presumed measure of the timeline isn’t something whispered by the gods. It emerges from our calculations of the presumed doppler effect causing the redshift.
          We like to apply these mental shortcuts, in order to make sense of all the information, but nature still works holistically.
          Obviously when working in large numbers of people, we like to think we have the power to make nature bend to our wants and needs, but then find ourselves way out on a limb.
          Even so, to “adjust our rulers,” to fit the actual course of events, presumably that would mean making them long enough to incorporate sufficient generations of stars to produce the heavy elements necessary to qualify as a mature galaxy.


          1. No need to bring up the gods. It is not that complicated. We are looking back at the early universe. In the context of the big bang, this is looking back to where we came from so to speak. The origin is much closer to proper time than we are now (in this paradigm we are accelerated observers).
            Looking back we see that the early universe has “aged” more than expected based on our time intervals. This is consistent with time dilation.


            1. Not only don’t I believe in the BBT, but physics’ modeling of time is a mess.
              As I keep pointing out, as these mobile organisms, this sentient interface our body has with its situation functions as a sequence of perceptions, in order to navigate. Sort of like a movie camera takes a sequence of stills.
              So our experience of time is as the point of the present moving past to future, but the evident physical reality is that action and the resulting change is turning future to past. Tomorrow becomes yesterday, because the earth turns.
              So the current block time, geometric model of time, with it as a fourth dimension of space, codified as measures of duration, is a bit like epicycles: trying to model the cosmos turning east to west, when the reality is the earth turning west to east.
              There is no dimension of time, because the past is consumed by the present, to inform and drive it. Causality and conservation of energy. Cause becomes effect.
              Different clocks can run at different rates simply because they are separate actions. Think metabolism. Given that culture tries to harmonize society into a larger communal organism, using the same languages, rules and measures, it might seem like there should be some universal, Newtonian flow of time, but the fact is everything marches to the beat of its own drummer. Even if we try to synchronize society, the counterbalancing effect is the harmonization of the constituent energy breaking back down and radiating back out. Nodes and networks, organisms and ecosystems, particles and fields.
              That different events will appear in different order from different locations is no more consequential than seeing the moon as it was a moment ago, simultaneous with seeing stars as they were years ago. It is the energy that is “conserved” and thus present, not the information. The information changing is time.
              So time is an effect and measure of activity, similar to temperature, pressure, color and sound, not some extension of space. That is just a chronological mapping device.
              Frequencies and amplitudes, rates and degrees.
              Ideal gas laws correlate volume with temperature and pressure, which are as foundational to our emotions and bodily functions, as sequence is to thought, but we don’t assume them to be synonymous with space.
              Now consider the energy manifests this presence, so it goes past to future, because the patterns generated coalesce and dissolve, go future to past. Energy drives the wave, while the fluctuations rise and fall. Like consciousness goes past to future, while the perceptions giving it form and structure go future to past.
              Galaxies are also, in the most basic terms, energy radiating out, toward infinity, as structure coalesces in, toward equilibrium.
              So if you think through all that you might begin to see that time is this linear sequence of events, going from start to finish, like the earth is flat. Only to our subjective point of view.
              If you read the story of the development of the expanding universe theory, LeMaitre in particular, was determined to frame it as a “cosmic egg,” based on his religious convictions, given he was a Catholic priest.
              It has been my argument that a lot of old school ideas, such as atomism, really were incorporated in the current physical theories and we will continue to run in circles, until everything is on the table.
              Here is an interview with Carver Mead, someone with probably more hands on experience in the quantum realm than anyone, arguing it’s really waves and the particles are an artifact of the measuring devices;
              http://worrydream.com/refs/Mead%20-%20American%20Spectator%20Interview.html
              As for time and space dilation, the original physical basis was in accelerated frames, the measures of both slow equally, because combination of the motion of the frame and activity within it still can’t exceed C. I’m not sure how they managed to turn the math around to come up with this, but it involves significant amounts of mathematical faerie dust, along with a suspension of basic logic, given it still uses light speed as the metric.
              You are likely not going to think this through, but keep in mind all the patches already required to keep this theory going. Do you think more patches will eventually make a UT?


              1. I think you could probably come up with a more fundamental description of how organisms use time and perceive their environment if you really wanted to.
                If you pack too much info into any description of reality it starts to get metaphysical.


            2. Would a tree experience time as a sequence of events, or would it be cycles of expansion and consolidation? They don’t have that sentient interface, they are a stitch between sky and ground.
              We tend to think of it as a sequence of nows, like they all exist on that timeline, but the reality is the future is nearly infinite potential, the present is the tiny fraction that actually occurs and past is just the residue left, like ashes after a fire. So the plants cycles are actually a better analogy. The future is the sunlight pouring in, the present is the leaves waving about and the past is the tree rings and leaf litter left. Expand/consolidate.
              My brain works best metaphysically. More the sunlight, than the twigs on the ground.


              1. Good point, trees have their own sense of time. But the sense of time is built on time and memory. Time doesn’t just emerge after we sense our environment, it is fundamental to how we sense. The interface is the collection of stored samples from oscillators. We chose the period and direction of time before any sentient thoughts were formed.


            3. That collection of stored signals is memory, but it is in feedback with input, that’s why it’s an interface.
              Remember, the sentience goes past to future, like the energy, while the thoughts go future to past, like the fluctuations of the waves, rising and falling. So the sentience drives it, while the information directs and defines. Motor and steering.


  10. The mainstream media is currently reporting that the discovery of high redshift galaxies by JWST is “shattering the scientific understanding of age of the universe”. The newsreader for NBC news states that “well for one thing it could pretty much rewrite a whole bunch of astrophysics textbooks”.

    This is important because it indicates a growing awareness in society that the current standard model of cosmology is wrong.


    1. That’s why it is/was important to make predictions like that of Sanders. It’s not *just* that massive, high redshift galaxies are model-breakingly inexplicable in the standard model (I’m sure someone will fix that!) but *also* that it was exactly predicted by a different theory. I see people work on the first aspect all the time while carefully avoiding the second. It seems dishonest to me, like we’re denying a part of reality. When challenged, it provokes a variety of defensive responses that excuse ignoring the obvious.
      This is not a healthy way to do science, but perhaps it is an inevitable part of the bumpy path to the future.


      1. Stacy,
        If we were to assume that we are accelerated observers relative to the early universe, and that time dilation was necessary to be accounted for, could accounting for the time dilation reduce the Hubble tension?


        1. No, I don’t think so. Cosmology is inherently a relativistic subject; issues like this come built-in. The problem with the Hubble tension is more basic. We do what Hubble originally did, measure distances and redshifts of relatively nearby galaxies, and we get one answer. We fit a bunch of parameters (including H0) to the very distant CMB at z=1000, and we get another. Those things should be self-consistently described by GR, inclusive of all the usual dilation effects you hear discussed.


          1. The issue I am having is that if spacetime is found to be in general flat in LCDM, then we would only need to consider the time dilation of special relativity. This would lead us to conclude that the two Hubble Constant measures should be nearly the same. But they are not.
            An additional time dilation arises if we are accelerating away from everywhere in the early universe, or in an f(r) gravity field, but I think that would mean we should have never expected the local and nonlocal measures of H0 to look the same.


            1. AFAIK it’s not that we are accelerating relative to high redshift objects. As I understood the nature of “dark energy” the creation of extra spacetime in between these objects and us is accelerating. Otherwise beyond the observable universe we have faster than light travel.


                1. And here is a strange thought: if the big bang is everywhere on the horizon, and spacetime is accelerating away from every big bang, then wouldn’t every observer be effectively pinned in place by this expanding spacetime, causing an "invisible" acceleration? I mean the alternative is that the big bang knew it had to move away from us, which frankly is even harder to believe.


          2. Dr. McGaugh,

            "Cosmology is inherently a relativistic subject"

            Cosmology certainly should be treated as a relativistic subject but the standard model has a universal frame – which is an inherently classical concept not a relativistic one. The current situation is attributable to a universal frame, the FLRW metric, being slapped onto GR with the helpful lubricant of mathematical convenience now known as the Cosmological Principle.

            The result speaks for itself. We have a 100 year old model “Universe” that bears no resemblance to the Cosmos we observe and as a bonus that “Universe” is as physically impossible under the known laws of physics as it is absurd. It would seem reasonable at this point for the cosmologists of the academy to entertain the possibility of considering other models. You would think?


      1. Well, MOND is not the only model with a correct prediction for the impossibly big galaxies discovered by JWST. Eric Lerner has a model which similarly predicts them; it does not modify gravity but rather rejects the notion of an expanding universe in favor of a non-expanding one.

        Click to access Will_LCDM_survive_JWST.pdf

        Click to access Structure-2022-.pdf

        Click to access GOLE-Lerner.pdf

        What both have in common though is what Sabine Hossenfelder pointed out in her video, that science media isn’t really covering alternatives to the FLRW framework.

        Sabine Hossenfelder herself has worked on MOND and superfluid dark matter models with Tobias Mistele and Stacy McGaugh in the past few years, so I can see why she favours MOND, over other alternative models, as an alternative to galaxy formation in the falsified FLRW framework.


    2. …and in another corner of YouTube, Dr Becky (who doesn’t seem to much fancy having to question LCDM) says something revealing on the same topic: https://youtu.be/hmkyF1tNFc4?t=580
      At about 10:15, she says "…our simulations and models are fitted to the data we have. Now we have new data." In context, I read this as an expectation that LCDM with all its currently added bells and whistles would be capable of being fitted to *any* data. Taken like that, it feeds into Stacy’s concerns about the "unfalsifiability" of LCDM.


  11. It seems to me that the situation in regard to the standard model of cosmology is less like that of Newtonian dynamics in regard to the UT of General Relativity than the relationship of geocentrism to heliocentrism. Heliocentrism was a replacement theory; you couldn’t get to heliocentrism logically by assuming geocentrism is valid and just seeking a replacement for epicycles.

    So while I whole-heartedly agree with this sentiment:

    So, painful as it may be, one has to find a little humility to step back and take account of what we know empirically independent of the interpretive veneer of theory.

    I’m equally disappointed by this:

    As I’ve said before, I think we do know that the universe is expanding and passed through an early hot phase that bequeathed us the primordial abundances of the light elements (BBN) and the relic radiation field that we observe as the cosmic microwave background (CMB).

    There is no empirical evidence supporting the expanding universe model. There are empirical observations that the model can be fitted to, but not without a lot of existential-metaphysical baggage, which means those observations constitute, at best, inferential not empirical evidence. In that, the situation is little different than it was with Ptolemaic cosmology. The solution is likely to be similar. The foundational assumptions need to be retired.

    In fairness to Dr. McGaugh, he is laboring under a bizarre socio-economic regime that functions more like a cult of belief than an open-minded scientific community. It is not safe to openly question the cult’s core beliefs; this MOND apostasy is quite bad enough.

    However, as an old song had it, The times they are a-changin’. In the long run science corrects its mistakes. That’s one of the things that distinguishes real science from pseudo-scientific belief systems.


    1. Exactly.

      There are no sacred cows in Science; only empirical evidence reigns supreme.

      Having a copy of Reality would be the ultimate theory of everything, or a truly universal model.

      Lacking that, the best we can do is to have lots and lots of telemetric data from all aspects of reality (as engineers do when building prototypes) and some statistical, mathematical models that synthesize the always-changing telemetric data.

      But nothing can replace the continuous and growing collection of data, that is ultimately our collective perception of reality.

      Knowing (data) takes precedence over understanding (modeling).


    2. “There is no empirical evidence supporting the expanding universe model.”

      I recently found this paper published in 2018 in the Monthly Notices of the Royal Astronomical Society which states that the existing empirical evidence from galaxy sizes and from the Tolman surface brightness test actually contradicts an expanding universe:

      https://academic.oup.com/mnras/article/477/3/3185/4951333

      So you could now make the stronger case that the entire big bang paradigm is falsified.


  12. I think Stacy might make some serious progress here with what I assume is his rebranding of MOND into the more compelling MOND-UT.


  13. Stacy,
    Why couldn’t the universal acceleration constant of MOND be attributed to the radial acceleration of the observer?


    1. It shows up in the radial acceleration of an observed object relative to an obvious center, e.g., of a galaxy. One corrects out the relative motion of the observer as a matter of course.


      1. Yes, that does make sense. I meant to ask more specifically about the possibility of an observer experiencing a radial acceleration with respect to origins of the observable universe.

        Currently I am led to believe that we are comoving with respect to the expansion of the universe, so there is no need to account for our acceleration relative to the early universe as a whole. That is the notion I am really questioning here.

        Seems like if FLRW is not the right metric, then maybe every observer is accelerating, instead of only comoving, w.r.t the early universe. That would be true for an observer anywhere in space, independent of local fields or local motion.


      2. The matter of course one follows depends on the metric and the underlying assumptions. FLRW and the Cosmological Principle seem to me to be giving us only half the story (and half may be too generous, considering we are now 95% in the Dark).
        The half story we get, IMO, is due to the assertion of homogeneity from the Cosmological Principle; then, given the FLRW metric and LCDM, I think we believe that the center of mass of this observable FLAT universe is coincident with the observer.

        But if the universe were isotropic and inhomogeneous, we would need a metric other than FLRW.

        Let’s go beyond that for a minute and consider that we may presently need a principle of Cosmological Complementarity. In other words, let’s not default to saying decades of progress is "wrong", but rather "incomplete". Now maybe the UT has room for two metrics? Incidentally, BIMOND apparently does.


        1. It occurs to me that Torsion is a representation of said Cosmological Complementarity. Perhaps time will tell which of these is more generalized.


  14. The Wikipedia entry on the 4-velocity contains the statement,

    “Physical events correspond to mathematical points in time and space, the set of all of them together forming a mathematical model of physical four-dimensional spacetime. ”

    https://en.m.wikipedia.org/wiki/Four-velocity

    This is a very problematic statement.

    However one may feel about the first-order paradigm in the foundations of mathematics, one of its features is the denial of the principle of the identity of indiscernibles as a logical principle.

    The first-order paradigm would certainly admit these points as ontological atoms. However, in so far as the paradigm claims to base its semantics on Tarski’s semantic conception of truth, there is the problem that a semantic conception of truth cannot justify the necessary truth of reflexive equality statements.

    If Pegasus does not exist,

    Pegasus=Pegasus

    cannot have a truth as a truth valuation. This is obfuscated in the philosophy of formal systems because the meaningfulness of expressions is ensured through formation rules.

    The denial of the identity of indiscernibles has consequences which are not unlike the establishment of the speed of light as a constant of physics. In the case of physics, the issue becomes the conventionality of simultaneity,

    https://plato.stanford.edu/entries/spacetime-convensimul/

    https://en.m.wikipedia.org/wiki/One-way_speed_of_light

    In the denial of the identity of indiscernibles, one is confronted with Black’s symmetric universe,

    https://en.m.wikipedia.org/wiki/Max_Black

    (PDF: blacksballs.pdf)

    If one looks at the work of Leibniz, the logical application of the principle is related to Aristotle’s assertion that genera are prior to species. The geometric application, however, is the observation that “space” is comprised of different “vantage points” distinguished from one another only with respect to the “witnessed view” from each point. So, “space” is relational (a view rejected by Newton).

    The first problem with the statement above from the Wikipedia article is that it extends this relational view (grounded in empirical witnessing) to an unwitnessable dimension.

    So what?

    If I prepare a quantum experiment, the outcome of any instance of the experiment is in the past time conic of the generating component. But, the entire experiment is in my past time conic.

    The “ontic reality” of different world lines imposes “epistemic uncertainty” upon my past time conic.

    I simply cannot witness the time evolution from the event associated with the generating component of the apparatus.

    If I am correct in this view, then the time parameter of a Schrödinger equation is only interpretable relative to my peculiar reference frame.

    I do not know enough to speak for every quantum apparatus. I do know that the trigonometric functions are related to undecidability in the theory of real numbers. Using Cartesian products to augment reals for a theory of infinitesimals does nothing to resolve this. However, the statistics of Stern-Gerlach experiments relates to cosine values (hence, inner products and orthogonality).
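
    For reference, the standard textbook statement behind that last remark, stated here only as a sketch: for a spin-1/2 particle prepared spin-up along the z-axis and then measured along an axis at angle $\theta$ to z,

    $$ P(+\tfrac{\hbar}{2}) = \cos^2(\theta/2), \qquad P(-\tfrac{\hbar}{2}) = \sin^2(\theta/2), \qquad \langle S_\theta \rangle = \tfrac{\hbar}{2}\cos\theta, $$

    so the cosine enters through the inner product of the two spin directions, which is presumably the orthogonality connection being alluded to.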

    The second problem with the Wikipedia statement is related to the utility of intrinsic curvature from differential geometry for characterizing a mass shell for general relativity.

    For many years I thought about the relationship between single-point compactifications of Tychonoff spaces and the linguistic expression, “the Big Bang.” The inability of first-order logic to characterize finiteness, after all, is related to compactness.

    However, by virtue of intrinsic curvature, the implicit metric structure associated with the identity of indiscernibles means that the mathematical structure is sequentially compact. By a theorem in Kantorovich, the following statements are then equivalent:

    X is compact

    X is sequentially compact

    X is countably compact

    X is complete and for each epsilon > 0 there exists a finite epsilon-net for X in X.

    These compactness conditions then admit considerations of extensions to the space. Being Hausdorff alone does not admit extensions of properties. The theorem from Porter and Wood,

    A space X is H-closed and regular if and only if X is compact.

    is the theorem that admits extensions. That is, H-closed spaces admit extensions.

    One mathematician who strenuously argues for denying ontological import to “virtual particles” is Arnold Neumaier. The idea of virtual particles arises from Feynman diagrams. Neumaier is strict concerning this situation because virtual particles are off the mass shell.

    What I am suggesting here is that “the reality” of virtual particles in a “quantum reality” is associated with ascribing mathematical realism to the description from the Wikipedia article. These “real particles” are from the extensions to the H-closed space.

    I may be wrong about this mathematical hallucination. Let me note, however, that significant numbers of researchers in first-order set theory are reluctant to admit large cardinal axioms that grant “consistency” to mathematics without charge.

    Mac Lane’s approach to category theory admits this consistency. Grothendieck universes admit this consistency. Martin-Löf type theory is verificationist. Homotopy theory incorporates the identity of indiscernibles because contractibility ignores the distinction between open and closed sets in a topology.

    The only way to have the perspective suggested here is to respect the orthodox interpretation of Goedel incompleteness as differentiating truth from provability while dismissing the inference rules of first-order logic without advocating for the alternatives.

    Given that mathematicians cannot agree on a univocal interpretation of the word “mathematics,” what could this statement,

    “Physical events correspond to mathematical points in time and space, the set of all of them together forming a mathematical model of physical four-dimensional spacetime. ”

    possibly mean?

  15. It appears that Eric Lerner and Riccardo Scarpa are both also supporters of MOND, in addition to being anti-expanding universe. From their 2021 article titled “Will LCDM cosmology survive the James Webb Space Telescope?” they write:

    “Our main conclusion is that the JWST will fail to probe a pre-stellar Dark Age, which, if it exists at all, lies extremely far in the past. Hopefully, this will force the development of a new vision of the Universe, which of course doesn’t necessarily have to be the one put forward here, forcing a revolution in astronomy and fundamental physics. A number of indicators suggest this revolution is already happening. It is indeed well demonstrated that the dynamical properties of galaxies in the local Universe do show systematic features – the most striking possibly being the mass Discrepancy – acceleration relation ([18]) – which are better explained by modifying gravity than invoking dark matter, as proposed in the modified Newtonian dynamics (MOND, [21]; [20]). At present, however, disposing of dark matter and expansion of the universe is unthinkable, for the whole LCDM paradigm would fall apart. A new set of observations is necessary to give the final push. If the JWST will provide it as we are suggesting here, a serious discussion of the basic concept of the expansion of the Universe and the nature of the CMB will start (a discussion that is now impossible). In physics, a new explanation for the redshift will have to be found, the whole concept of a dark sector abandoned, and gravity in the low acceleration regime rewritten.”

    (PDF: Will_LCDM_survive_JWST.pdf)

    1. It’s always good to be reminded that a set of empirical evidence could have multiple “interpretations”; otherwise mainstream interpretations become indistinguishable from religion/cult.

  16. The technical papers referred to by Stacy I found to be “heavy duty,” in the sense of the first “Back to the Future” movie, where Marty is astounded by the technological achievement of Doc Brown’s DeLorean time machine. The first paper cited – Milgrom’s relativistic BIMOND theory – is particularly interesting since it covers gravitational lensing, and I’m going to try my best to understand it at a mathematical-physical level. And Stacy’s call for “wild west” theories is inspiring: like many other interested non-scientists, I have a rather non-conventional theory that perhaps could account for dark matter/energy. It conforms to the Standard Model of particle physics except for the imputation that the wavelike property of matter is a consequence of a real field that is explicitly identified, and a slight modification to the neutrino sector.

  17. Dr. McGaugh,

    I am aware that my comments are not well understood. I have studied the foundations of mathematics for 35 years with an emphasis on how “logic” breaks science. More specifically, I began by asking whether or not “logic” could be the foundation for mathematics at all.

    The reason that trigonometric functions become an issue lies with work by Alfred Tarski showing that the theory of real closed fields is decidable. One can make partial extensions with analytic functions, but the unrestricted use of trigonometric functions results in an undecidable theory.

    The typical 2-dimensional rotation matrix used in physics relies upon the identity,

    x^2 + y^2 = 1

    which can be scaled to any positive radius. It cannot be applied for a zero radius. Consequently, rotational inertia has a fundamental relationship to areas.
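
    For reference, the rotation matrix in question and the identity it relies on can be written as

    $$ R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \qquad \det R(\theta) = \cos^2\theta + \sin^2\theta = 1, $$

    so the determinant condition is exactly the unit-circle identity above; rescaling to a radius r > 0 multiplies areas by r^2, and the construction degenerates (det = 0) at zero radius.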

    Areas, of course, are also associated with interpretations of integration. And, integration expresses the relationship between force and potential energy.

    The notion of locality in physics is related to metric axioms since locality is the rejection of action at a distance. But, the standard axiomatizations for a metric express the indivisibility of a point as a zero distance.

    So, metrics appear to break our understanding of inertia.

    First-order logicians attempt to accommodate differential mathematics using Cartesian products to “stuff” ideal points around (formalist/field-theoretic) real numbers. This is the proclaimed non-standard analysis. But, the fact that such attempts also yield real closed fields means that rotational dynamics is not recovered by such means.

    Bahram Mashhoon appears to have captured this difficulty for relativistic physics by introducing what I shall call “microaccelerations” into the analysis of relativity. This introduces integral factors into his equations. Hence, it accommodates rotational accelerations.

    I certainly cannot judge his technical work or whether or not his physical arguments will be convincing to other physicists, but, the idea seems to address a fundamental problem from the classical perspective.

    His arXiv listing can be found at,

    https://arxiv.org/search/gr-qc?searchtype=author&query=Mashhoon%2C+B

    1. More generally, all concepts of continuous functions, differentiation and integration, and so on, depend on murky depths of mathematics, logic and philosophy that physicists and astronomers (for good reasons) are not interested in. All these concepts are used only as approximations to describe physical reality, and work very well in most contexts. But for more than a century it has been abundantly clear that physical reality is *not* continuous, and that eventually we will run into the problem that continuous mathematics is not up to the job of describing physical reality in sufficient detail. The fact that this has now happened is no surprise. Nor is the fact that physicists are in complete denial, since they (mostly) cannot conceive of any mathematics that is not based on differential equations.

      1. No, it’s a non sequitur. Quantized states don’t imply noncontinuous spacetime at all. The underlying quantum theories are QFTs, which are entirely continuous and based on Lie groups. The whole idea of Lie groups is that such a group is continuous and works with infinitesimal values to give a Lie algebra. Of course you can “quantize” infinitesimals and take it as an approximation, and you might be right. But continuous reality is still entirely possible.

        If you discard infinitesimals, infinity may well be next. And without infinity, who would come up with renormalization? IMO it’s a direction that discards progress already made. Somewhat like how the String Theory “landscape” discards order as the fundamental basis of reality in order to “explain” fine-tuning.

        1. Excuse me, but I think it is your argument that is a non-sequitur, not mine. You argue from theory, I argue from experiment. I discard infinitesimals, and infinity, and renormalisation. They are all mathematical conveniences, and have no basis in physical reality whatsoever.

          1. I’m not asking you to consider these continuous / infinity concepts as real, but pointing to how thinking with them pragmatically led to progress. Who knows what really is real? I can only believe, not give a definite answer.

        2. The measurement problem of QM can, without too many assumptions, be resolved as entanglement with the (comparatively macroscopic) state of the measurement apparatus. IMO that suggests that the (continuously defined) wavefunctions are the actual world, and classical mechanics just becomes the emergent behaviour of many wavefunctions. True, the energy states and most of the quantum properties are quantized. But these arise as eigenvalues of operators that define a continuous process, both in inputs and outputs.
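
          A minimal sketch of that last point (assuming Python with numpy; the two-level system and toy Hamiltonian are arbitrary illustrative choices): the state evolves continuously in time under exp(-iHt), yet a measurement of S_z can only ever return the discrete eigenvalues ±1/2.

          ```python
          import numpy as np

          # Spin-1/2 operators in units where hbar = 1 (illustrative choice)
          Sz = 0.5 * np.array([[1, 0], [0, -1]])   # discrete spectrum: +1/2, -1/2
          Sx = 0.5 * np.array([[0, 1], [1, 0]])
          H = Sx                                    # toy Hamiltonian

          # Continuous-in-time evolution U(t) = exp(-iHt) via eigendecomposition of H
          evals, evecs = np.linalg.eigh(H)
          psi0 = np.array([1.0, 0.0], dtype=complex)  # start spin-up along z

          for t in np.linspace(0, np.pi, 5):
              U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
              psi = U @ psi0
              print(f"t={t:.2f}  P(Sz=+1/2)={abs(psi[0])**2:.3f}")  # varies smoothly with t

          print("possible Sz outcomes:", np.linalg.eigvalsh(Sz))    # always just [-0.5, 0.5]
          ```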

          1. I do not know how to respond to these two apparently contradictory statements. Here you assert that the “actual world” is continuous, and four minutes later you say I do not have to consider these “continuous/infinity concepts as real”. You can’t have it both ways.

          2. Being of the opinion that something is true is not the same as asking others to believe that it is true. Your opinion is welcome regardless of how it lines up with my own beliefs. I do caution against what I see as discarding progress, but my definition of progress might not be yours either. And to go in the direction of progress, some go with a large detour.

      2. @Dr. Wilson,

        I presume that your remarks about continuity are addressed to my statements about metrics and inertia.

        As a professional mathematician you most certainly understand that continuity is a topological concept that applies equally well to both finite and infinite sets. So, you are either using a word incorrectly or you are projecting your beliefs without the ability to justify them.

        Whether or not spatial points, temporal moments, or spatiotemporal events do or do not serve as an objectual grounding is not a premise that can be known through empirical witnessing. People who wish to identify science with truths are confronted with this (apparently) unknowable situation.

        It is interesting to look at what actually happens in the development of real analysis. Riemann integration cannot be satisfactorily extended to situations relevant to uses in physics. So, the Lebesgue integral is formulated. Lebesgue integration leads to the study of measure theory. Borel measures are the result of building a measure over a topology. Lebesgue measures are the completions of Borel measures.

        Notice that the separation properties of point-set topology are obfuscated by this notion of measure completion.

        Continuing, Lebesgue integrals (taken with respect to Lebesgue measure) of functions that differ only on a set of measure zero evaluate to the same numerical value.

        So, what is done for the sake of “calculation” using “algebra”? Lebesgue integrals are “collected” into “equivalence classes”.

        Why?

        So that integrals can fall under the definition of a normed linear space.
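
        In standard notation, the step being described is (a sketch of textbook real analysis): on measurable functions, $\|f\|_1 = \int |f| \, d\mu$ is only a seminorm, since $\|f\|_1 = 0$ whenever $f = 0$ almost everywhere. Identifying functions that agree except on a set of measure zero, i.e. taking equivalence classes under $f \sim g \iff f = g$ a.e., is exactly what turns the integrable functions into the normed linear space $L^1(\mu)$.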

        There is a difference between algebraic topology and point-set topology. Because point-set topology respects the distinction of open and closed sets, there is no “contractible to a point.”

        Cantor’s theorem (regarding formalist/field-theoretic real numbers) for closed sets of vanishing diameter is a theorem about *closed* sets — not contractible regions.

        Again, one has algebraists catering to the use of calculation.

        The problem is that area cannot be linearized.

        This argument has gone on since the priority debate between Leibniz and Newton (if not longer). As a dubious celebrity psychologist asks, “How’s that working for you?”

        The “path to finiteness” through point-set topology is Kuratowski’s 14-set problem. There are other investigations of similar character, but the 14-set problem can be related to finite geometries and switching functions.

        Within the 16-set of basic Boolean functions, there are a pair of 14+2 partitions related to mathematically significant topics. Obviously, the two constant functions may serve as proxies for ‘true’ and ‘false’. So, that is one partition.

        The other comes from the fact that two of the 16 basic Boolean functions are not linearly separable. And, not so coincidentally, the two which are not linearly separable happen to be the biconditional and the exclusive disjunction.
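
        That claim is easy to check by brute force. Here is a minimal sketch (my own illustration in plain Python, under the assumption that “linearly separable” means realizable by a single threshold w1*x + w2*y + b > 0) that enumerates all 16 two-input Boolean functions and flags the ones no such threshold can realize; the two it prints are exactly the exclusive disjunction (XOR) and the biconditional (XNOR).

        ```python
        import itertools

        inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

        def separable(truth_table):
            # Brute-force search over a small integer grid of weights and bias;
            # this grid is large enough to realize every linearly separable
            # function of two 0/1 inputs.
            for w1, w2, b in itertools.product(range(-3, 4), repeat=3):
                if all((w1 * x + w2 * y + b > 0) == bool(t)
                       for (x, y), t in zip(inputs, truth_table)):
                    return True
            return False

        for bits in itertools.product([0, 1], repeat=4):
            if not separable(bits):
                print(bits)  # prints (0, 1, 1, 0) = XOR and (1, 0, 0, 1) = XNOR
        ```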

        As is well known, the biconditional underlies the equivalence classes which form the Tarski-Lindenbaum algebra.

        But, the “reasoning principle” described as “the unity of opposites” can be implemented with an exclusive disjunction.

        And, the mathematics of quantum gravity often leads to Kummer surfaces with 16 exceptional points that form a Kummer configuration. The symmetries of that configuration are described in terms of 6-sets. The smallest finite closure spaces are 6-sets. The smallest finite affine geometry is over a 6-set.

        One can populate a Rook’s graph with symbols to extend each 6-set to a 7-set. The smallest topology is a 7-set. The smallest projective geometry is over a 7-set.

        It is becoming common for physicists to look beyond linearity because it is failing them.

        Quantum theorists are looking at deformed algebras as a means to introduce nonlinearity into the uncertainty principle. As I just discussed, Mashhoon’s work introduces nonlinearity into the analyses of relativity with “microaccelerations” (my expression). And, if you read the papers by Milgrom to which we have been directed by Dr. McGaugh, Milgrom describes BIMOND as a means of introducing nonlinearity.

        The fact that BIMOND is based upon the idea of using two metrics as a ground for the metric tensor is clearly a form of “gluing” comparable with the unity of opposites. It would be more precise, however, to speak of it as an application of differential ontology made necessary because relativistic cosmology requires intrinsic curvature.

        One of the problems involved here is that the notion of “quantum spin” is incompatible with uses of vector spaces (which are ontologically preceded by groups). The SEP article on Bohmian mechanics clearly states this:

        “Rather the problem is that there is no ordinary (nonquantum) quantity which, like the spin observable, is a 3-vector and which also is such that its components in all possible directions belong to the same discrete set. The problem, in other words, is that the usual vector relationships among the various components of the spin vector are incompatible with the quantization conditions on the values of these components.”

        Section 11 is the section on spin,

        https://plato.stanford.edu/entries/qm-bohm/#Spin

        Using a Wikipedia illustration, the problem with approaches based upon group theory becomes obvious,

        https://en.m.wikipedia.org/wiki/File:Magma_to_group4.svg

        The study of divisibility (the thing accelerators investigate with high energies) goes through quasigroups rather than monoids. And, both are “more primitive” than the groups upon which you build your vector spaces.

        For the sake of terminology, call the orthodox conception of an ontology an “objectual ontology” (points, moments, and events have material existence, or are “substantival”). Contrast this with “differential ontology” (points, moments, and events need not have material existence — that is, they are fictions).

        The unity of opposites is associated with differential ontology. In a more modern guise it is associated with paraconsistent logic as can be seen in the title of the paper at the link,

        https://link.springer.com/article/10.1007/BF02379118

        Finally, for what this is worth, to the best that I can tell, all mention of “independent variables” in algebra assumes the notion of linear independence from linear algebra. The algebraization of first-order logic admits comparison with dimensionality for first-order variables. Goedel’s completeness theorem is a reduction from first-order logic to propositional logic by the use of Henkin constants. And, propositional models are comparable to Cantor spaces over two symbols. So, the fact that algebraization leads one to recognize how the vast expanse of “algebra” depends upon linear independence (and, thus, orthogonality) is not coincidental.

        The “pre-established harmony” of the mathematicist is realized as the statistical limit of Stern-Gerlach experiments. Those limits are correlated with cosines, and, cosines are correlated with orthogonality between independent dimensions.

        I may not be able to calculate with a Lagrangian or a Hamiltonian, Dr. Wilson, but, I have spent 35 years studying a Hilbert problem. And, for much of the last 8 years, I have been asking why probability and measure have any relation whatsoever to the continuum problem:

        https://en.m.wikipedia.org/wiki/Freiling%27s_axiom_of_symmetry

        I know a great deal of mathematics — both discrete and indiscrete. I also know what I do not know: whether or not points, moments, or events have a material, substantive existence.

        If you wish to actually prove your claim, I am all ears.

        1. I am afraid I am unable to decrypt your message. Technically you are correct to say that “continuity”, as defined by topologists, can be applied to finite sets. But that is not how physicists use the term, so it is completely irrelevant to this discussion. The only topology that is relevant here is the discrete topology, which is equivalent to saying there is no topology.

  18. Energy goes past to future because the information it’s generating goes future to past.
    Energy drives the wave, the fluctuations rise and fall.
    Consciousness goes past to future, because the perceptions, emotions and thoughts giving it form and structure go future to past.
    When all the theories are information based, they spiral into innumerable rabbit holes.
    It’s safe to say, the only circuit breakers to these feedback loops are funerals.

    1. Hmmm, interesting. If that is true . . . . . then I guess the big question is, “Did the rabbit hole we live in originally start out as a wormhole?”

      1. It was certainly the hope it would turn out to be one, but it seems the black hole at the center is where they all end up.
        Equilibrium and infinity.
        Black holes and black body radiation.

      2. Actually the big question is why physics can’t explain emergence, or even try to.

        Yes, explaining how, say, the political polarity of conservative and liberal might eventually emerge from quantum mechanics does seem nearly impossible, given the degree to which all the intervening stages have evolved into their own fields/rabbit holes. But consider that there are patterns running through everything; so would some field see fit to try to figure out what the more basic and universal patterns are, and whether they might be useful in disentangling some of the endless feedback loops of debate that grow up in every field, like so much kudzu? Or would that be too difficult?
        How about cycles of expansion and consolidation, like the seasons? Energy radiating out, as structure coalesces in, like galaxies. Social energies bursting forth as the motivating force of society, versus the structures of tradition, culture and civilization giving it form and substance. The appetites driving us, versus the decisions and judgements directing our actions.
        Yet obviously this doesn’t apply to quantum theory, where everything – force, mass, space, time, etc. – is quantized. All structure. The driving dynamics are neutered as further quanta.
        Then we wonder why there is this gap between this quantum realm and the macroscopic reality in which the conservative and liberal elements of society are at odds.
        Polarities rule.

        1. “why physics can’t explain emergence?”

          Nothing can “explain” emergence, exactly as the axioms of the rational numbers can’t explain the real numbers, or the axioms of Euclidean geometry can’t explain non-Euclidean geometries.

          Discreteness introduces incompleteness/irreducibility, and that irreducibility is pervasive, common. Chaitin already showed that by expanding Godel’s incompleteness theorems.

          The underlying wrong assumption in physics is to assume that complex systems, with very large numbers of discrete elements, will follow behavior that can be reduced to the properties or behavior of their discrete elements.

          Condensed matter physics is one example, and galaxies’ rotation speeds are another; the artificial introduction of dark matter is a textbook example showing that complex systems may exhibit new emergent properties (MOND dynamics), with MOND failing at the galaxy-cluster level.

          As a matter of fact, there is not a single theory that can be applied to multiple complexity levels without introducing, de facto, new irreducible properties obtained from experimental/observational data.

          Quantum Mechanics and “General” Relativity fail at higher complexity levels.

          1. You do very much make my point. That our current models are not up to the task.
            Yet would you agree that on the most basic level, galaxies are energy radiating out, as structure coalesces in?
            Then that as these biological organisms, we have a digestive system processing the energy driving us on, along with a central nervous system to sort through the structure of the patterns and information precipitating out. Basically motor and steering. With the circulation system as feedback between the two.
            Then that economics takes energy and turns it into structure, driving society on, while government functions as a form of social central nervous system, trying to organize all the various inputs and outputs, while money and banking work as a form of blood and circulation system.
            And for time, cause becomes effect. The energy goes to the future, while the patterns generated go to the past.
            As in a factory, the product goes start to finish, future to past, while the production line consumes energy and material and expels product, going past to future.
            As lives go birth to death, future to past, while life moves onto the next generation, shedding the old, past to future.
            Do you see a pattern here?
            Or is your mind too wrapped up in all the patterns it has collected, to see anything differently?

            1. No model will ever be up to the task; they will always be intrinsically limited because the axiomatic method is intrinsically limited, and everything done in physics is rooted in the axiomatic method.

              Once again: discreteness introduces incompleteness and irreducibility, and since Reality is hierarchical, with levels of complexity formed by the grouping of multiple discrete components, there is nothing we can do to avoid that but accept it.

              A theory/model will work well at one level of complexity but it will fail in the next one because new irreducible properties will appear at that new level.

              Looking for a universal model/theory is an exercise in futility.

              Reality couldn’t care less about humans’ wishful thinking.

              1. You did not address my observation about the relationship between energy and information, just stated your own views.
                Yes, reality is extremely complex. I live in it as well. Yet consider biological complexity, where it grows through cycles of genetic expansion (driven by energy resources) and consolidation (structural ordering/natural selection).
                Which cannot be reverse engineered, given all the information irrevocably lost. So, yes, we cannot describe how specificities emerge, but is there some basic dynamic driving this process?
                Can you prove there is no basic process, or simply assert it can never be explored?

              2. The way to “explore” it is by direct observation/experiment; there is no way around it.

                Trying to “reduce” everything to simple principles is not going to work, because again that approach is intrinsically limited.

                The properties of higher complexity levels can’t be “discovered” by theoretical musings, only by direct observations.

                Our theoretical preconceptions/assumptions are always limited. Nothing can replace constant observation/probing of Reality.

              3. The point is not really to explore every detail, but to propose some basic dynamic.
                You still won’t comment on my very basic observations.
                Will you argue that cause does not become effect, but that all events exist out on some “fourth dimension,” in which we can time travel through wormholes in the “fabric of spacetime,” given the proper applications of mathematical faerie dust?
                What is more “direct observation” than observing that “tomorrow becomes yesterday, because the earth turns”?
                That our state of consciousness goes from prior thoughts to succeeding ones, past to future, as these mental structures rise and fall, future to past?
                Given the elements of energy, information and time pretty much run through all these complexities of reality, why would it not be possible to relate them?
                How even complexity is emergent from the dynamic feedback loops.

Comments are closed.