Structure Formation Mythology

Do not be too proud of this technological terror you’ve constructed. The ability to simulate the formation of large scale structure is insignificant next to the power of the Force.

– Darth Vader, Lord of the Sith

The now standard cosmology, ΛCDM, has a well developed cosmogony that provides a satisfactory explanation of the formation of large scale structure in the universe. It provides a good fit to both the galaxy power spectrum at low redshift and that of the cosmic microwave background (CMB) at z = 1080. This has led to a common misconception among cosmologists that this is the only way it can be.

The problem is this: the early universe was essentially homogeneous, while the current universe is not. At the time of recombination, one patch of plasma had the same temperature and density as the next patch to 1 part in 100,000. Look around at the universe now, and you see something very different: galaxies strung along a vast web characterized chiefly by empty space and enormous voids. Trouble is, you can’t get here from there.

Gravity will form structure, making the over-dense patches grow ever denser, in a classic case of the rich getting richer. But gravity is extraordinarily weak. There simply is not enough time in the ~13 Gyr age of the universe for it to grow the tiny density variations observed in the CMB into the rich structure observed today.
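To put rough numbers on this, here is a back-of-the-envelope sketch using the textbook result that linear overdensities in a matter-dominated universe grow in proportion to the scale factor, δ ∝ a = 1/(1+z):

```python
# Sketch: how far can gravity alone grow the CMB fluctuations?
# Assumes the standard linear-theory result delta ~ a for a
# matter-dominated universe; the numbers are illustrative.
z_rec = 1080        # redshift of recombination
delta_rec = 1e-5    # density contrast seen in the CMB (1 part in 100,000)

growth = 1 + z_rec  # growth in scale factor from recombination to today
delta_today = delta_rec * growth

print(f"delta today ~ {delta_today:.0e}")  # ~1e-2
# Collapsed structure (galaxies, voids) needs delta ~ 1, so gravity
# acting on the baryons alone falls short by a factor of ~100.
```

That missing factor of ~100 is the gap that something has to close.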

We need something to goose the process. This is where non-baryonic cold dark matter (CDM) comes in. It outweighs the normal matter, and does not interact with the photons of the CMB. This latter part is critical, as the baryons are strongly coupled to the photons, which don’t let them clump up enough early on. The CDM can. So it starts to form structure early, which the baryons subsequently trace. Since structure formed, CDM must exist.

This is a sound line of reasoning. It convinced many of us, including myself, that there had to be some form of non-baryonic mass made of some particle outside the standard model of particle physics. The other key fact was that the gravitating mass density was inferred to outweigh the amount of baryons indicated by Big Bang Nucleosynthesis (Ωm ≫ Ωb).

Does anyone spot the problem with this line of thinking?

It took me a long time to realize what it was. Both the structure formation argument and the apparent fact that Ωm ≫ Ωb implicitly assume that gravity is normal. All we need to know to approach either problem is what Newton and Einstein taught us. Once we make that assumption, we are absolutely locked into the line of reasoning that leads us to CDM.

I worry that CDM is a modern æther. Given our present understanding of physics, it has to exist. In the nineteenth century, so too did æther. Had to. Only problem was, it didn’t.

If, for a moment, we let go of our implicit assumption, then we may realize that what structure formation needs is an extra push (or pull, to make overdensities collapse faster). That extra push may come from CDM, or it may come from an increase in the strength of the effective force law. Rather than being absolute proof of the existence of CDM, the rapid formation of structure might also be another indication that we need to tweak the force law.

I have previously outlined how structure might form in a modified force law like MOND. Early efforts do not provide as good a fit to the power spectrum as ΛCDM. But they provide a much better approximation than did the predecessor of ΛCDM, SCDM.

Indeed, there have been some striking predictive successes. As we probe to ever higher redshift, we see time and again more structure than had been anticipated by ΛCDM. Galaxies form early in MOND, so this is quite natural. So too does the cosmic web, which I predict to be more developed in MOND at redshifts of 3 and even 5. By low redshift, MOND does a much better job of emptying out the voids than does ΛCDM. Ultimately, I expect we may get a test from 21 cm reverberation mapping in the dark ages, where I predict we may find evidence of strong baryonic oscillations. (These predictions were made, and published in refereed journals, in the previous millennium.)

I would not claim that MOND provides a satisfactory description of large scale structure. The subject requires a lot more work. Structure formation in MOND is highly non-linear, a tougher problem than the standard perturbation theory that suffices in ΛCDM. Yet we have lavished tens of thousands of person-years of effort on ΛCDM, and virtually no effort on the harder problem posed by MOND. Having failed to make an effort does not suffice as evidence.

And then there were six

With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.

– John von Neumann

The simple and elegant cosmology encapsulated by the search for two numbers has been replaced by ΛCDM. This is neither simple nor elegant. In addition to the Hubble constant and density parameter, we now also require distinct density parameters for baryonic mass, non-baryonic cold dark matter, and dark energy. There is an implicit (seventh) parameter for the density of neutrinos.

Now we also count parameters describing the power spectrum (σ8 and the spectral index n) among the cosmological parameters. These did not use to be considered on the same level as the Big Two. They aren’t: they concern structure formation within the world model, not the nature of the world model itself. But I guess they seem more important once the Big Numbers are settled.

Here is a quick list of what we believed, then and now:

 

Parameter       SCDM       ΛCDM
H0 (km/s/Mpc)   50         70
Ωm              1.0        0.3
Ωbh²            0.0125     0.02225
ΩΛ              0          0.7
σ8              0.5        0.8
n               1.0        0.96

 

There are a number of “lesser” parameters, like the optical depth to reionization. Plus, the index n can run, one can invoke scale-dependent non-linear biasing (a rolling fudge factor for σ8), and people talk seriously about the time evolution of the dark energy equation of state.

From the late ’80s to the early ’00s, all of these parameters (excepting only n) changed by much more than their formal uncertainties or theoretical expectations. Even big bang nucleosynthesis – by far the most robustly constrained – suffered a doubling in the mass density of baryons. This should be embarrassing, but most cosmologists tout it as a great success while quietly sweeping the lithium problem under the carpet.

The only thing that hasn’t really changed is our belief in Cold Dark Matter. That’s not because it is more robust. It is because it is much harder to detect, let alone measure.

Two Numbers

Cosmology used to be called the hunt for two numbers. It was simple and elegant. Nowadays we need at least six. It is neither simple nor elegant. So how did we get here?

The two Big Numbers are, or at least up until the early ’90s were, the Hubble constant H0 and the density parameter Ω. These told us Everything. Or so we thought.

The Hubble constant is the expansion rate of the universe. Not only does it tell us how fast the universe is expanding, it sets the size scale through the Hubble distance-velocity relation. Moreover, its inverse is the Hubble time – essentially the age of the universe. A Useful and Important Number. To seek to measure it was a noble endeavor into which much toil and treasure were invested. Getting this right was what the Hubble Space Telescope was built for.

The density parameter measures the amount of stuff in the universe. Until relatively recently, it was used exclusively to refer to the mass density – the amount of gravitating stuff normalized to the critical density. The critical density is the over/under point where there is enough gravity to counteract the expansion of the universe. If Ω < 1, there isn’t enough, and the universe will expand forever. If Ω > 1, there’s more than enough, and the universe will eventually stop expanding and collapse. It controls the fate of the universe.
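For concreteness, the critical density follows from the standard definition ρcrit = 3H0²/8πG. A quick numerical sketch, assuming a representative H0 = 70 km/s/Mpc:

```python
# Sketch: the critical density, rho_crit = 3 H0^2 / (8 pi G).
# H0 = 70 km/s/Mpc is an assumed, representative value.
import math

G = 6.674e-11          # gravitational constant [m^3 kg^-1 s^-2]
Mpc = 3.0857e22        # meters per megaparsec

H0 = 70 * 1e3 / Mpc    # 70 km/s/Mpc expressed in 1/s
rho_crit = 3 * H0**2 / (8 * math.pi * G)

print(f"rho_crit ~ {rho_crit:.1e} kg/m^3")         # ~9e-27 kg/m^3
print(f"~ {rho_crit / 1.67e-27:.0f} H atoms/m^3")  # ~6 hydrogen atoms per m^3
```

A universe balanced on the knife edge of collapse holds only about half a dozen hydrogen atoms per cubic meter.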

Just two numbers controlled the size, age, and ultimate fate of the universe. The hunt was on.

Of course, the hunt had been on for a long time, ever since Hubble discovered that the universe was expanding. For the first fifty years the measured value largely shrank, then settled into a double-valued rut between two entrenched camps. Sandage and collaborators found H0 = 50 km/s/Mpc while de Vaucouleurs found a value closer to 100 km/s/Mpc.

The exact age of the universe depends a little on Ω as well as the Hubble constant. If the universe is empty, there is no gravity to retard its expansion. The age of such a “coasting” universe is just the inverse of the Hubble constant – about 10 Gyr (10 billion years) for H0 = 100 and 20 Gyr for H0 = 50. If instead the universe has the critical density Ω = 1, the age is just 2/3 of the coasting value.
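The arithmetic is simple enough to sketch explicitly for the two limiting cases just described:

```python
# Sketch: ages of the two limiting Friedmann models discussed above.
# Empty (coasting): t = 1/H0.  Critical (Omega = 1): t = (2/3)/H0.
Mpc_km = 3.0857e19        # km per megaparsec
sec_per_Gyr = 3.156e16    # seconds per gigayear

def hubble_time_gyr(H0):
    """Inverse Hubble constant (H0 in km/s/Mpc), expressed in Gyr."""
    return Mpc_km / H0 / sec_per_Gyr

for H0 in (50, 100):
    t_empty = hubble_time_gyr(H0)    # coasting age, 1/H0
    t_crit = 2.0 / 3.0 * t_empty     # Einstein-de Sitter age, (2/3)/H0
    print(f"H0 = {H0:3d}: empty {t_empty:4.1f} Gyr, critical {t_crit:4.1f} Gyr")

# H0 =  50: empty 19.6 Gyr, critical 13.0 Gyr
# H0 = 100: empty  9.8 Gyr, critical  6.5 Gyr
```

That 13.0 Gyr entry is the ceiling that matters below: an Ω = 1 universe with Sandage’s H0 = 50 can be no older than that.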

The difference between the empty and critical ages is not huge by cosmic standards, but it nevertheless played an important role in guiding our thinking. Stellar evolution places a constraint on the ages of the oldest stars. These are all around a Hubble time old. That’s good – it looks like the first stars formed near the beginning of the universe. But we can’t have stars that are older than the universe they live in.

In the ’80s, a commonly quoted age for the oldest stars was about 18 Gyr. That’s too old for de Vaucouleurs’s H0 = 100 – even if the universe is completely empty. Worse, Ω = 1 is the only natural scale in cosmology; it seemed to many like the most likely case – a case bolstered by the advent of Inflation. In that case, the universe could be at most 13 Gyr old, even adopting Sandage’s H0 = 50. It was easy to imagine that the ages of the oldest stars were off by that much (indeed, the modern number is closer to 12 Gyr), but not by a lot more: the ages under 10 Gyr implied by H0 = 100 were right out.

Hence we fell into a double trap. First, there was confirmation bias: the ages of stars led to a clear preference for who must be right about the Hubble constant. Then Inflation made a compelling (but entirely theoretical) case that Ω had to be exactly 1 – entirely in mass. (There was no cosmological constant in those days. You were stupid to even consider that.) This put further pressure on the age problem. A paradigm emerged with Ω = 1 and H0 = 50.

There was a very strong current of opinion in the 80s that this had to be the case. Inflation demanded Ω = 1, in which case H0 = 50 was the only sensible possibility. You were stupid to think otherwise.

That was the attitude into which I was indoctrinated. I wouldn’t blame any particular person for this indoctrination; it was more of a communal group-think. But that is absolutely the attitude that reigned supreme in the physics departments of MIT and Princeton in the mid-80s.

I switched grad schools, having decided I wanted data. Actual observational data; hands on telescopes. When I arrived at the University of Michigan in 1987, I found a very different culture among the astronomers there. It was more open minded. Based on measurements that were current at the time, H0 was maybe 80 or so.

At first I rejected this heresy as obviously insane. But the approach was much more empirical. It would be wrong to say that it was uninformed by theoretical considerations. But it was also informed by a long tradition of things that must be so turning out to be just plain wrong.

Between 1987 and 1995, the values of the Big Numbers changed by amounts that had been inconceivable. None of the things that must be so turned out to be correct. And yet now, two decades later, we are back to the new old status quo, where all the parameters are Known and Cannot Conceivably Change.

Feels like I’ve been here before.

Rethinking the Dark Matter Paradigm

I travel to Cambridge, MA tomorrow to participate in the workshop Rethinking the Dark Matter Paradigm (I had nothing to do with the choice of title). I went to college at MIT in the ’80s, so it is a bit back to the future for me in space as well as time. There is a lot to rethink, or nothing at all, depending on who you ask. I’m curious to see if any of us are willing to think beyond “I was right all along!”

One of the compelling notions that emerged in the ’80s was non-baryonic dark matter. Baryons are the massive particles (protons & neutrons) of which normal stuff is made. It was well established by that time that the light elements were produced in the early universe by Big Bang Nucleosynthesis (BBN). It became clear in the ’80s that the mass density of normal stuff allowed by BBN did not add up to the mass needed to explain a whole host of astronomical observations, in both cosmology and galaxy dynamics. In short, Einstein’s General Relativity plus the baryons we could see did not suffice to explain the universe.

There were two obvious paths forward. Modify Einstein’s theory, or invoke unseen non-baryonic matter. The latter course seemed by far the more plausible. No one had a compelling reason to challenge Einstein’s highly successful theory. On the other hand, there were plenty of reasons in particle physics to imagine new particles outside the standard model, particularly in the hypothesized supersymmetric sector.

It was quickly realized that large scale structure would only grow as observed if this new stuff were composed of slow-moving, non-relativistic particles – a condition summarized as dynamically “cold.” Hence Cold Dark Matter (CDM) was born. Weakly Interacting Massive Particles (WIMPs) from supersymmetry were a good candidate to be the CDM.

Thus began the marriage of astronomy and particle physics, two fields divided by a common interest in dark matter and cosmology. The heated embrace of the honeymoon has long since worn off, to the point that some of us are ready to rethink the whole paradigm.

This is no small step. Though I’ve come to doubt the existence of CDM, I still feel very comfortable with it.  First love, and all. More importantly, it has been the one essential item in cosmology that has remained unchanged through the turbulent ’90s and on to today. But that is a longer story that will take many posts to tell.

For now, we’ll go see how much rethinking we’re willing to do.