Avi Loeb has a nice recent post Recalculating Academia, in which he discusses some of the issues confronting modern academia. One of the reasons I haven’t written here for a couple of months is despondency over the same problems. If you’re here reading this, you’ll likely be interested in what he has to say.
I am not eager to write at length today, but I do want to amplify some of the examples he gives with my own experience. For example, he notes that there are
theoretical physicists who avoid the guillotine of empirical tests for half a century by dedicating their career to abstract conjectures, avoiding the risk of being proven wrong while demonstrating mathematical virtuosity.

Avi Loeb
I recognize many kinds of theoretical physicists who fit this description. My first thought was string theory, which took off in the mid-80s when I was a grad student at Princeton, ground zero for that movement in the US. (The Russians indulged in this independently.) I remember a colloquium in which David Gross advocated the “theory of everything” with gratuitous fervor to a large audience of eager listeners quivering with anticipation; it had the texture of religious revelation. It was captivating and convincing, up until the point near the end when he noted that experimental tests were many orders of magnitude beyond any experiment conceivable at the time. That… wasn’t physics to me. If this was the path the field was going down, I wanted no part of it. This was one of many factors that precipitated my departure from the toxic sludge that was grad student life in the Princeton physics department.
I wish I could say I had been proven wrong. Instead, decades later, physics has nothing to show for its embrace of string theory. There have been some impressive developments in mathematics stemming from it. Mathematics, not physics. And yet, there persists a large community of theoretical physicists who wander endlessly in the barren and practically infinite parameter space of multidimensional string theory. Maybe there is something relevant to physical reality there, or maybe it hasn’t been found because there isn’t. At what point does one admit that the objective being sought just ain’t there? [Death. For many people, the answer seems to be never. They keep repeating the same fruitless endeavor until they die.]
We do have new physics, in the form of massive neutrinos, the dark matter problem, and the apparent acceleration of the expansion rate of the universe. What we don’t have is the expected evidence for supersymmetry, the crazy-bold yet comparatively humble first step on the road to string theory. If they had got even this much right, we should have seen evidence for it at the LHC, for example in the decay of the aptly named BS meson. If supersymmetric particles existed, they should provide many options for the meson to decay into, which otherwise has few options in the Standard Model of particle physics. This was a strong prediction of minimal supersymmetry, so much so that it was called the Golden Test of supersymmetry. After hearing this over and over in the ’80s and ’90s, I have not heard it again any time in this century. I’m not sure when the theorists stopped talking about this embarrassment, but I suspect it is long enough ago now that it will come as a surprise to younger scientists, even those who work in the field. Supersymmetry flunked the golden test, and it flunked it hard. Rather than abandon the theory (some did), we just stopped talking about it. There persists a large community of theorists who take supersymmetry for granted, and react with hostility if you question that Obvious Truth. They will tell you with condescension that only minimal supersymmetry is ruled out; there is an enormous parameter space still open for their imaginations to run wild, unbridled by experimental constraint. This is both true and pathetic.
Reading about the history of physics, I learned that there was a community of physicists who persisted in believing in the aether for decades after the Michelson-Morley experiment. After all, only some forms of aether were ruled out. This was true, at the time, but we don’t bother with that detail when teaching physics now. Instead, it gets streamlined to “the aether was falsified by Michelson-Morley.” This is, in retrospect, true, and we don’t bother to mention those who pathetically kept after it.
The standard candidate for dark matter, the WIMP, is a supersymmetric particle. If supersymmetry is wrong, WIMPs don’t exist. And yet, there is a large community of particle physicists who persist in building ever bigger and better experiments designed to detect WIMPs. Funny enough, they haven’t detected anything. It was a good hypothesis, 38 years ago. Now it’s just a bad habit. The better ones tacitly acknowledge this, attributing their continuing efforts to the streetlight effect: you look where you can see.
Prof. Loeb offers another pertinent example:
When I ask graduating students at their thesis exam whether the cold dark matter paradigm will be proven wrong if their computer simulations are in conflict with future data, they almost always say that any disagreement will indicate that they should add a missing ingredient to their theoretical model in order to “fix” the discrepancy.

Avi Loeb
This is indeed the attitude. So much so that no additional ingredient seems too absurd if it is what we need to save the phenomenon. Feedback is the obvious example in my own field, as that (or the synonyms “baryon physics” or “gastrophysics”) is invoked to explain away any and all discrepancies. It sounds simple, since feedback is a real effect that does happen, but this single word does a lot of complicated work under the hood. There are many distinct kinds of feedback: stellar winds, UV radiation from massive stars, supernovae when those stars explode, X-rays from compact sources like neutron stars, and relativistic jets from supermassive black holes at the centers of galactic nuclei. These are the examples of feedback that I can think of off the top of my head; there are probably more. All of these things have perceptible, real-world effects on the relevant scales, with, for example, stars blowing apart the dust and gas of their stellar cocoons after they form. This very real process has bugger all to do with what feedback is invoked to do on galactic scales. Usually, supernovae are blamed by theorists for any and all problems in dwarf galaxies, while observers tell me that stellar winds do most of the work in disrupting star forming regions. Confronted with this apparent discrepancy, the usual answer is that it doesn’t matter how the energy is input into the interstellar medium, just that it is. Yet we can see profound differences between stellar winds and supernova explosions, so this does not inspire confidence in the predictive power of theories that generically invoke feedback to explain away problems that wouldn’t be there in a healthy theory.
This started a long time ago. I had already lost patience with this unscientific attitude to the point that I dubbed it the
Spergel Principle: “It is better to postdict than to predict.”

McGaugh 1998
This continues to go on and has now done so for so long that generations of students seem to think that this is how science is supposed to be done. If asked about hypothesis testing and whether a theory can be falsified, many theorists will first look mystified, then act put out. Why would you even ask that? (One does not question the paradigm.) The minority of better ones then rally to come up with some reason to justify that yes, what they’re talking about can be falsified, so it does qualify as physics. But those goalposts can always be moved.
A good example of moving goalposts is the cusp-core problem. When I first encountered this in the mid-to-late ’90s, I tried to figure a way out of it, but failed. So I consulted one of the very best theorists, Simon White. When I asked him what he thought would constitute a falsification of cold dark matter, he said cusps: “cusps have to be there” [in the center of a dark matter halo]. Flash forward to today, when nobody would accept that as a falsification of cold dark matter: it can be fixed by feedback. Which would be fine if it were true, but that isn’t really clear. At best it provides a post facto explanation for an unpredicted phenomenon without addressing the underlying root cause: that the baryon distribution is predictive of the dynamics.
This is like putting a band-aid on a Tyrannosaurus. It’s already dead and fossilized. And if it isn’t, well, you got bigger problems.
Another disease common to theory is avoidance. A problem is first ignored, then the data are blamed for showing the wrong thing, then they are explained in a way that may or may not be satisfactory. Either way, it is treated as something that had been expected all along.
In a parallel to this gaslighting, I’ve noticed that it has become fashionable of late to describe unsatisfactory explanations as “natural.” Saying that something can be explained naturally is a powerful argument in science. The traditional meaning is that ok, we hadn’t contemplated this phenomenon before it surprised us, but if we sit down and work it out, it makes sense. The “making sense” part means that an answer falls out of a theory easily when the right question is posed. If you need to run gazillions of supercomputer CPU hours of a simulation with a bunch of knobs for feedback to get something that sorta kinda approximates reality but not really, your result does not qualify as natural. It might be right – that’s a more involved adjudication – but it doesn’t qualify as natural, and the current fad to abuse this term again does not inspire confidence that the results of such simulations might somehow be right. It just makes me suspect the theorists are fooling themselves.
I haven’t even talked about astroparticle physicists or those who engage in fantasies about the multiverse. I’ll just close by noting that Popper’s criterion for falsification was intended to distinguish between physics and metaphysics. That’s not the same as right or wrong, but physics is subject to experimental test while metaphysics is the stuff of late night bull sessions. The multiverse is manifestly metaphysical. Cool to think about, has lots of implications for philosophy and religion, but not physics. Even Gross has warned against treading down the garden path of the multiverse. (Tell me that you’re warning others not to make the same mistakes you made without admitting you made mistakes.)
There are a lot of scientists who would like to do away with Popper, or any requirement that physics be testable. These are inevitably the same people whose fancy turns to metascapes of mathematically beautiful if fruitless theories, and want to pass off their metaphysical ramblings as real physics. Don’t buy it.
33 thoughts on “Some Outsider Perspective from Insiders”
Nice and true post. Just one comment: The Michelson-Morley experiment does not rule out the aether if particles are treated also as waves; the effects cancel out because the measurement apparatus is also Doppler shifted (length contracts or extends):
Professor Kroupa your link does not work.
I take the view that the Michelson-Morley experiment was misinterpreted by Michelson and Lorentz from the start and the classical theory of the experiment is wrong. There has been more than one published paper trying to home in on such a possibility, but if you reconstruct physics from such a basic level quite a lot else has to be reinterpreted as well.
But when you tell students “to shut up and calculate” for more than two generations, you end up with a fine-tuned calculating machine, equivalent to a system of circles and epicycles with arbitrary numbers of free parameters, and with equally dubious metaphysical foundations.
In 11 days, the James Webb will be providing a lot of information. My prediction is that as these very distant galaxies are resolved more clearly, they will show heavier elements, and that even more distant galaxies will be observed, rather than some primordial haze. If that occurs, it will throw a large monkey wrench into even some of the more acceptable aspects of modern physics, since the only way the redshift is optical is if these observed quanta are actually a sampling of the wave front and not individual photons traveling billions of lightyears.
Let’s allow that your prediction turns out to be correct. In that case I predict that the expanding universe model will simply be rejiggered to accommodate the new data; the early universe stage will be pushed out beyond the range of Webb and the whole paradigm will survive another generation. As the French have it, the more things change, the more they stay the same. That’s certainly the case in theoretical physics where foundational assumptions, devoid of any empirical evidence, have persisted long beyond their shelf-life.
Breaking News: Cosmologists announce the James Webb Space Telescope has discovered the edge of the universe is mirrored.
This has created the illusion of an infinite universe. Developing.
Though I suspect that when those already-observed, very distant galaxies, currently described as very young and mostly hydrogen and helium, actually turn out to have much heavier elements, that will prove the more difficult thing to explain. Not impossible, but much fudgier. It will probably have something to do with Inflation, since that’s the patch for the other factors that would otherwise require an infinite universe.
We have Dark Matter and Dark Energy, how about Dark Space?
At the very end: “Don’t but it.” ?? Don’t buy it??
Just wanted to add that physicists were convinced supersymmetry had to be right because that would supposedly have been natural… Now, they meant something slightly different by that expression than astrophysicists, but the origin of the problem is the same: They’re metaphysical requirements, not evidence-based. As you point out, the problem is that rather than having a hard look at what went wrong, particle physicists just try to forget about it. So this mistake is bound to repeat.
The scientific term is, “drinking their own bathwater.”
Indeed, there is naturalness in particle physics, which is not the same as a natural result in the way I describe. Words fail us. But like you, I’m more concerned that we don’t repeat our mistakes over and over.
This is not a reply to your remark about naturalness but a question about the predictions of LCDM and those of MOND here https://tritonstation.com/2022/01/03/what-jwst-will-see/. Do the first data allow the beginnings of an answer?
Forgive this question coming from an ignorant person trying to satisfy his curiosity. Thank you for your attention, Norbert.
Yes, that post seems like a perfectly reasoned refutation of the closed-mindedness of the scientific academy until you realize that Loeb doesn’t hold himself to the standards he espouses. In his challenge to graduating students regarding the cold dark matter paradigm, he carefully separates the “cold” part from the underlying “dark matter” paradigm. It’s not hard to find evidence that the Professor does not consider that paradigm up for reconsideration, embarrassing as the lack of empirical evidence may be:
View at Medium.com
Mathematicism comes to the rescue, of course. No need for empirical evidence when the Truth is in the math.
Epicycles were brilliant math, as a description of our view of the cosmos. The crystalline spheres were lousy physics, as explanation.
Map versus territory.
The flip side is that unpopular-but-testable theories don’t even get tested. I’m working on a class of theories whose first easily-testable prediction was made in 1978. I proposed an experiment to test it this year, but couldn’t get beam time. The beam time went instead to an experiment to measure the electric dipole moment of the muon (as far as we know, muons are point particles and can’t possibly have an electric dipole) and an experiment to measure whether muonium falls upwards (hadronic antimatter doesn’t, so why would we expect leptonic antimatter to be any different?); almost certainly neither of them will find anything, but they’ll generate some papers to help students get jobs and tenure. The last theory paper in my field took 21 years to get published. Arxiv.org (the members-only swimming pool at the physics country club) has been blocking me from even posting preprints for over a decade. And we wonder why physics seems so constipated …
A tough call. On one hand, there are only so many physicists with so many hours in the day, and there simply aren’t resources to test every proposal. On the other hand, non-scientific priors about a particular proposal are given too much weight.
The statement that electrons are point-like is everywhere, but it is risky. In essence, it is calculated, and the calculation has an inaccuracy. If one assumes a point-like electron, then f(x) = delta(x); this is compared with the experiment and agrees well. But if one starts from the experiment, one does not get from F^2(q) back to F(q): the square root is not unique. Thus one has the following result: a point-like electron is sufficient for the experiments, but not necessary.
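The square-root ambiguity described above is easy to demonstrate numerically: two form factors that differ can have identical squares, and since the measured cross section depends only on F(q)^2, the experiment cannot distinguish them. A minimal sketch (my own illustration, with an arbitrary Gaussian form factor, not from the comment):

```python
import numpy as np

# Two candidate form factors that differ (a sign flip over part of the
# q range) but have identical squares, hence identical predictions for
# any observable proportional to F(q)**2.
q = np.linspace(0.0, 5.0, 101)
F1 = np.exp(-q**2 / 2.0)           # a smooth Gaussian form factor
F2 = np.where(q < 2.0, F1, -F1)    # same magnitude, sign flipped for q >= 2

# The squares agree exactly, so F(q) cannot be recovered uniquely:
print(np.allclose(F1**2, F2**2))   # True
print(np.allclose(F1, F2))         # False
```

The same holds for any q-dependent phase, not just a sign flip; the measurement pins down only the magnitude |F(q)|.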
String theorists have a saying, “string theory is smarter than you” – by which they mean, foremost, themselves. There’s nothing in string theory that inherently requires one to ignore MOND in favor of WIMPs, so perhaps that’s another example.
Group theorists have a saying – “groups are clever” – which means much the same thing. The reason theories of particle physics are so complicated, and still don’t do the job, is because particle physicists do not understand the cleverness of groups. The isometry group of Minkowski spacetime is incredibly clever, and has four disconnected pieces that describe four completely different kinds of particles – so what do particle physicists do first? They delete three-quarters of the group and replace it by a Lie algebra. No wonder they don’t understand why the weak force is chiral.
String theory is beautiful, unifying all forces and elementary particles. And supersymmetry is a nice idea, but it should be fit with the data (standard model).
I quite agree. The lack of physical reality in culture and thinking is depressing and terrifying. Is this the best we can do? Shameful. It helps to keep in mind that ‘intelligent’ and ‘educated’ are orthogonal axes. Some very highly educated people are also very stupid. SUSY has become an almost canonical example.
This post really resonated with me, as for some time I’ve been engrossed in the study of the Aharonov-Bohm (A-B) effect, reading numerous papers over a number of weeks. For someone like myself with probably an undergrad level of physics knowledge it’s rather mind-bending. It’s named after two researchers – Yakir Aharonov and David Bohm, who predicted the effect in 1959 (though it was predicted a decade earlier by two other researchers). In the idealized experiment electrons, in phase, traverse either side of an (infinitely) long solenoid, through which a current is flowing, rejoining at a slit on the other side. The “magnetic vector potential” from the enclosed magnetic field within the solenoid induces a phase shift between electrons passing either side of the solenoid, changing the interference pattern at a screen placed beyond the slit.
What’s weird is the magnetic field inside the solenoid is fully confined, yet (if I understand it correctly) its “potential” extends into the region where the electrons travel. From everything I’ve read this is firmly established physics (observed in laboratory experiments) and is intimately connected to gauge invariance in physical theories. To quote from “Demystifyer” (post #13) on the thread “Interpretations of the Aharonov-Bohm effect” at Physics Forums: “The gauge invariance is not a property of nature, it is a property of one convenient mathematical representation of the properties of nature.” Interestingly, looking at diagrams of the ‘ideal A-B experiment’ the direction of the magnetic vector potential corresponds to the “right hand rule of thumb”, where the thumb points in the direction of the magnetic field, while the curl of the fingers point in the direction electrons would orbit around those magnetic field lines. So it’s as if the magnetic field interior to the solenoid acts like it is also outside, where the electrons travel, despite not actually being there. This phenomenon, from what I’ve read, is essentially a quantum mechanical effect where “potentials” are central; the fields being “derivatives” of these potentials.
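To put a number on the effect described above: the phase difference between the two paths is delta_phi = (e/hbar) * Phi, where Phi is the magnetic flux enclosed between them. A small numeric sketch (my own, with illustrative solenoid parameters, not from the comment):

```python
import math

e = 1.602176634e-19      # C, elementary charge (exact SI value)
hbar = 1.054571817e-34   # J s, reduced Planck constant

def ab_phase(B, radius):
    """Aharonov-Bohm phase difference (radians) for electrons passing
    either side of an ideal solenoid of field B (tesla) and radius (m)."""
    flux = B * math.pi * radius**2   # flux enclosed between the two paths, Wb
    return e * flux / hbar

# One flux quantum h/e shifts the interference pattern by a full 2*pi,
# after which the pattern repeats:
h_over_e = 2.0 * math.pi * hbar / e          # ~4.14e-15 Wb
B = 1e-3                                      # T, a modest solenoid field
r = math.sqrt(h_over_e / (B * math.pi))       # radius enclosing one flux quantum
print(ab_phase(B, r) / (2.0 * math.pi))       # 1.0
```

This periodicity in the flux quantum is why the effect is insensitive to the gauge chosen for the vector potential: only the enclosed flux, a gauge-invariant quantity, appears in the answer.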
My original motivation for studying the A-B effect was to see if it might be alternatively explained in a toy, rather amateur, model that I’ve been working on, that among other things, posits a physical mechanism underlying MONDian behavior at astronomical scales. I’ve even entertained ‘dark’ thoughts, wondering if the ‘potential’ approach in physics might be incorrect, just as I believe dark matter is an illusion. But, as soon as I think that I come to my senses realizing that the highest IQ people on the planet have worked these things out over generations in exquisite mathematical detail. Nonetheless, I’ll still try to see if my toy model can provide a more physical explanation for this very abstract and counterintuitive effect.
“… the highest IQ people on the planet have worked these things out over generations in exquisite mathematical detail.”
The problem is that math is not physics and physics is not math. A lot of those high IQ people don’t understand that critical distinction, nor do they understand physics on its own terms, unmediated by a mathematical model.
Yes, I agree that conflating math and physics can be problematic. All attempts at extending the Standard Model from the mathematical symmetry groups in which it is currently framed: SU(3) X SU(2) X U(1) have floundered. In fact I was just browsing a thread at Physics Forums titled: “Three Generations of Fermions from octonions Clifford algebras”. Randomly picking one of the papers cited in the thread: “Three Fermion generations with two unbroken gauge symmetries from the complex sedenions”, and scanning it, I was immensely impressed by the intellectual effort that goes into these papers. To be honest I won’t even pretend to say I understand it, beyond a rudimentary level. But as I scrolled through the pages I couldn’t help but reflect on a simple idea I conceived, back in the mid-90’s, that provides a straightforward physical explanation for the existence of three particle generations, no more, no less, which can be described in a few sentences.
Frustratingly, the last 40-50 years have seen a parade of very smart people whose highly elaborate mathematical models have attempted to shoehorn the Standard Model into a grand, unified structure that accounts for spin, charge, number of generations, etc. They have burst onto the scene like supernovae, only to fade into the background when inherent problems are found with them, or they generate an infinite landscape of solutions as with the strings paradigm. To be sure, math is essential as an interface between laboratory data and theory, particularly in quantum mechanics/particle physics. Predictions derived from the math in these fields have been verified to many decimal places. Interpreting what the math means in a physical sense has preoccupied generations of physicists. I have my own ideas on this, like lots of armchair physics enthusiasts. The paper articulating these ideas is in temporary abeyance, as I wrestle with incorporating the A-B effect. And, with great summer weather I’ve been distracted with non-physics activities like riding with our local cycling club. Just in the last few rides we covered 75 miles with about 3700 feet of vertical climbing. That’s nothing for a young person, but I’m anything but young.
Wyrd, you quite agree, good. But I will not follow you in shaming the string theorists. As a mathematician, I appreciate the math they produced and researched regardless of practical applications. The wrong turn was not to develop the theory regardless of reality, the wrong turn was to predict SUSY particles saying “they MUST exist” (and ignore how MOND with much evidence removes the need for dark matter susy particles). I have faith that the SUSY theories and M-theory do have many theoretical applications.
Meanwhile, I see the string theorists like the charlatans of alchemy, who wanted to turn lead into gold. They acted very mysterious and impressive, yet could not predict anything or present a result; it has been 30 years now.
Whenever I don’t like someone, I advise them to look into string theory: “Only the smartest of the smart deal with string theory.” They are then pleased.
What puzzles me most is why anyone would be willing to bother with string theory. String theory cannot explain the double-slit experiment with its small strings, so it is not capable of anything special. End of discussion.
The assumption that an object is both a particle and a wave is a contradiction. To call this contradiction “dualism” is a crime against students. In the past, there was only Schrödinger’s cat to show that something was wrong. Today, articles appear on Wigner’s Friend. Nonsense in the 4th human generation?! Here and there, quantum mechanics delivers impressive results. That is nice for all engineers. But it starts with a contradiction, and that is a fiasco for all scientists. It is unclear why so many bow so deeply before quantum mechanics, a theory that starts with a contradiction…
Put a particle into an environment, and it creates waves. You can put a wave into an environment, but it doesn’t create particles. That tells any clear-thinking person which way you have to go to take the contradiction out of quantum mechanics. String theory does the opposite, and tries to create particles out of waves. Unfortunately, this cannot be done.
Just a mention that Indranil Banik has an article on MOND in “The Conversation”, which publishes articles for the general reader written by academics: https://theconversation.com/dark-matter-our-review-suggests-its-time-to-ditch-it-in-favour-of-a-new-theory-of-gravity-186344
If there were only Newton, nothing would necessarily need explaining. But the coexistence of Newton and MOND requires an explanation. a0 requires an explanation. Newton’s F ~ GM/r^2 requires an explanation when compared with MOND’s F ~ sqrt(GM a0)/r. Whoever wants to can think about it, and has a tidy head start on the dark matter engineers.
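The comparison above can be made concrete. In the deep-MOND regime (where the Newtonian acceleration g_N falls below a0) the acceleration goes as sqrt(G M a0)/r instead of GM/r^2, which is what produces flat rotation curves. A rough sketch with round numbers (my own illustration; the galaxy mass and radius are just placeholders):

```python
G = 6.674e-11        # m^3 kg^-1 s^-2, gravitational constant
a0 = 1.2e-10         # m s^-2, the MOND acceleration scale

def g_newton(M, r):
    """Newtonian acceleration GM/r^2."""
    return G * M / r**2

def g_mond_deep(M, r):
    """Deep-MOND limit sqrt(G M a0)/r, valid only where g_newton << a0."""
    return (G * M * a0) ** 0.5 / r

M_sun = 1.989e30     # kg
kpc = 3.086e19       # m

# A star orbiting a ~1e11 solar-mass galaxy at 20 kpc:
M_gal = 1e11 * M_sun
r = 20 * kpc
gN = g_newton(M_gal, r)      # ~3.5e-11 m/s^2, below a0
gM = g_mond_deep(M_gal, r)   # ~6.5e-11 m/s^2, roughly double the Newtonian value
print(gN, gM)
```

Note that g_mond_deep equals sqrt(g_newton * a0), so the circular speed obeys v^4 = G M a0, independent of radius: the flat rotation curve and the baryonic Tully-Fisher relation fall out directly.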
If you read his disclosure note on the article it says: “Indranil Banik is paid from a grant awarded by the STFC whose primary objective is to test MOND using wide binary stars in the Solar neighbourhood.”
This is a test on a distance scale where dark matter should have no effect. Dark matter reminds me of the search for the planet Vulcan (the 19th century equivalent of dark matter) to explain the extra precession of Mercury’s orbit.
New ideas have always had a hard time.
But the mainstream is also right.
Just because an idea is new doesn’t mean it has to be good.
The mathematical education of theoretical physicists is not as good as it should be. The worst example is the following: mathematicians start their theories with axioms. An axiom is the simplest of the simple. And because it is already the simplest, it does not need to be, and cannot be, proved or explained. “Axiom” has a well-understood and strict meaning. Meanwhile I know at least one theoretical physicist who does the following in all seriousness:
1. Take the most complex thought you have ever heard of.
2. Call this thought an axiom.
3. Suddenly you don’t need to explain this complex thought anymore.
Goggelmoggel/Humpty Dumpty sends his regards…
Mathematics and the model as the main problem

The main problem is mathematics, specifically the differential calculus. Differential calculus requires a continuous space and deals with the motion of point masses. That is what it was invented for, and that is what it is suitable for. Planets are excellent point masses, and the space around the sun is continuous: one can theoretically place a planet at any point in the solar system.

Single atoms are not like this. Single atoms are not points. If they were points, and as small as one reads everywhere, we would get different results in the double-slit experiment. We can neither measure a size nor an extent for single atoms, nor trace their orbits. But if we cannot measure such things, then space in the 10 nm region must be neither continuous nor homogeneous, and the above conditions for the use of the differential calculus are no longer fulfilled. Why do physicists then use the differential calculus? Because they have nothing else, would be the truthful answer…
Whoever doesn’t want to follow this example, here is another one. An electron has the following properties: charge, mass, spin = 1/2, velocity v < c. An electron is a stable elementary particle; it does not decay. But together with a positron, everything becomes light: 2 or 3 photons. Now you have: no charge, no mass, spin = 1, and v = c. An electron is a stable object in a normal environment, but together with a positron it is transformed into light, and thereby all its known properties change. With planetary mathematics, one will never understand this.
I have a good friend who runs the scanning tunneling microscopy group in Göttingen. Some months or years ago he told me in all seriousness: “Physics does not have the task of explaining things, only the task of describing them.” Why do physicists no longer have “why” questions in their heads?
Is there a good reason why gas and stars form disk-shaped galaxies? There are very many disk-shaped galaxies; stars and gas must have a reason to form spirals. Everything is overlaid by the dark matter controversy. If one leaves all that aside, what remains as the reason to form disk-shaped galaxies? Early galaxies are elliptical; late galaxies are spirals. What changed between “early” and “late”? Stacy, do you have an idea?
“Early” and “late”, when applied to galaxies, are not now thought to have anything to do with time. It was Hubble’s original (mistaken) idea that galaxies evolved from ellipticals into spirals.
http://icc.dur.ac.uk/~tt/Lectures/Galaxies/TeX/lec/node8.html is a simple explanation of the properties of elliptical and spiral galaxies