There is a tendency when teaching science to oversimplify its history for the sake of getting on with the science. Knowing how a subject came to be isn’t strictly necessary to learn it. But to do science requires a proper understanding of the process by which it came to be.
The story taught to cosmology students seems to have become: we didn’t believe in the cosmological constant (Λ), then in 1998 the Type Ia supernovae (SN) monitoring campaigns detected accelerated expansion, then all of a sudden we did believe in Λ. The actual history was, of course, rather more involved – to the point where this oversimplification verges on disingenuous. There were many observational indications of Λ that were essential in paving the way.
Modern cosmology starts in the early 20th century with the recognition that the universe should be expanding or contracting – a theoretical inevitability of General Relativity that Einstein initially tried to dodge by inventing the cosmological constant – and is in fact expanding, as observationally established by Hubble and Slipher and many others since. The Big Bang was largely considered settled truth after the discovery of the existence of the cosmic microwave background (CMB) in 1964.
The CMB held a puzzle, as it was quickly shown to be too smooth. The early universe was both isotropic and homogeneous. Too homogeneous. We couldn’t detect the density variations that could grow into galaxies and other immense structures. Though such density variations are now well measured as temperature fluctuations that are statistically well described by the acoustic power spectrum, the starting point was that these fluctuations were a disappointing no-show. We should have been able to see them much sooner, unless something really weird was going on…
That something weird was non-baryonic cold dark matter (CDM). For structure to grow, it needed the helping hand of the gravity of some unseen substance. Normal matter did not suffice. The most elegant cosmology, the Einstein-de Sitter universe, had a mass density Ωm = 1. But the measured abundances of the light elements were only consistent with the calculations of big bang nucleosynthesis if normal matter amounted to only 5% of Ωm = 1. This, plus the need to grow structure, led to the weird but seemingly unavoidable inference that the universe must be full of invisible dark matter. This dark matter needed to be some slow moving, massive particle that neither interacts with light nor resides within the menagerie of particles present in the Standard Model of Particle Physics.
CDM and early universe Inflation were established in the 1980s. Inflation gave a mechanism that drove the mass density to exactly one (elegant!), and CDM gave us hope for enough mass to get to that value. Together, they gave us the Standard CDM (SCDM) paradigm with Ωm = 1.000 and H0 = 50 km/s/Mpc.
It is hard to overstate the fervor with which the SCDM paradigm was believed. Inflation required that the mass density be exactly one; Ωm < 1 was inconceivable. For an Einstein-de Sitter universe to be old enough to contain the oldest stars, the Hubble constant had to be the lower of the two (50 or 100) commonly discussed at that time. That meant that H0 > 50 was Right Out. We didn’t even discuss Λ. Λ was Unmentionable. Unclean.
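The age argument above is simple arithmetic: an Einstein-de Sitter universe has an age of exactly (2/3)/H0, so the oldest stars set a hard ceiling on the Hubble constant. A minimal sketch (the unit conversions are standard; the choice of 50 vs 100 follows the debate described in the text):

```python
# Age of an Einstein-de Sitter (Omega_m = 1) universe: t0 = (2/3) / H0.
# Stars older than ~13 Gyr only fit if H0 is on the low side of the old debate.

MPC_KM = 3.0857e19   # kilometers per megaparsec
GYR_S = 3.156e16     # seconds per gigayear

def hubble_time_gyr(h0_kms_mpc):
    """The Hubble time 1/H0 in Gyr, for H0 given in km/s/Mpc."""
    return MPC_KM / h0_kms_mpc / GYR_S

for h0 in (50, 100):
    t_eds = (2 / 3) * hubble_time_gyr(h0)
    print(f"H0 = {h0:3d}: EdS age = {t_eds:.1f} Gyr")
# H0 =  50: EdS age = 13.0 Gyr  -- barely old enough for the oldest stars
# H0 = 100: EdS age = 6.5 Gyr   -- hopelessly too young
```

This is why, within SCDM, H0 > 50 really was Right Out: doubling H0 halves the age of the universe.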
SCDM was Known, Khaleesi.
Λ had attained unmentionable status in part because of its origin as Einstein’s greatest blunder, and in part through its association with the debunked Steady State model. But serious mention of it crept back into the literature by 1990. The first time I personally heard Λ mentioned as a serious scientific possibility was by Yoshii at a conference in 1993. Yoshii based his argument on a classic cosmological test, N(m) – the number of galaxies as a function of how faint they appeared. The deeper you look, the more you see, in a way that depends on the intrinsic luminosity of galaxies, and how they fill space. Look deep enough, and you begin to trace the geometry of the cosmos.
At this time, one of the serious problems confronting the field was the faint blue galaxies problem. There were so many faint galaxies on the sky, it was incredibly difficult to explain them all. Yoshii made a simple argument. To get so many galaxies, we needed a big volume. The only way to do that in the context of the Robertson-Walker metric that describes the geometry of the universe is if we have a large cosmological constant, Λ. He was arguing for ΛCDM five years before the SN results.
Yoshii was shouted down. NO! Galaxies evolve! We don’t need no stinking Λ! In retrospect, Yoshii & Peterson (1995) looks like a good detection of Λ. Perhaps Yoshii & Peterson also deserve a Nobel prize?
Indeed, there were many hints that Λ (or at least low Ωm) was needed, e.g., the baryon catastrophe in clusters, the power spectrum of IRAS galaxies, the early appearance of bound structures, the statistics of gravitational lenses, and so on. Certainly by the mid-90s it was clear that we were not going to make it to Ωm = 1. Inflation was threatened, since it requires Ωm = 1 – or at least a flat geometry: Ωm + ΩΛ = 1.
SCDM was in crisis.
A very influential 1995 paper by Ostriker & Steinhardt did a lot to launch ΛCDM. I was impressed by the breadth of data Ostriker & Steinhardt discussed, all of which demanded low Ωm. I thought the case for Λ was less compelling, as it hinged on the age problem in a way that might also have been solved, at that time, by simply having an open universe (low Ωm with no Λ). This would ruin Inflation, but I wasn’t bothered by that. I expect they were. Regardless, they definitely made the case for ΛCDM three years before the supernovae results. Their arguments were accepted by almost everyone who was paying attention, including myself. I heard Ostriker give a talk around this time during which he was asked “what cosmology are you assuming?” to which he replied “the right one.” Called the “concordance” cosmology by Ostriker & Steinhardt, ΛCDM had already achieved the status of most-favored cosmology by the mid-90s.
Ostriker & Steinhardt neglected to mention an important prediction of Λ: not only should the universe expand, but that expansion rate should accelerate! In 1995, that sounded completely absurd. People had looked for such an effect, and claimed not to see it. So I wrote a brief note pointing out the predicted acceleration of the expansion rate. I meant it in a bad way: how crazy would it be if the expansion of the universe was accelerating?! This was an obvious and inevitable consequence of ΛCDM that was largely being swept under the rug at that time.
I mean[t], surely we could live with Ωm < 1 but no Λ. Can’t we all just get along? Not really, as it turned out. I remember Mike Turner pushing the SN people very hard in Aspen in 1997 to Admit Λ. He had an obvious bias: as an Inflationary cosmologist, he had spent the previous decade castigating observers for repeatedly finding Ωm < 1. That’s too little mass, you fools! Inflation demands Ωm = 1.000! Look harder!
By 1997, Turner had, like many cosmologists, finally wrapped his head around the fact that we weren’t going to find enough mass for Ωm = 1. This was a huge problem for Inflation. The only possible solution, albeit an ugly one, was if Λ made up the difference. So there he was at Aspen, pressuring the people who observed supernovae to Admit Λ. One, in particular, was Richard Ellis, a great and accomplished astronomer who had led the charge in shouting down Yoshii. They didn’t yet have enough data to Admit Λ. Not.Yet.
By 1998, there were many more high redshift SNIa. Enough to see Λ. This time, after the long series of results only partially described above, we were intellectually prepared to accept it – unlike in 1993. Had the SN experiments been conducted five years earlier, and obtained exactly the same result, they would not have been awarded the Nobel prize. They would instead have been dismissed as a trick of astrophysics: the universe evolves, metallicity was lower at earlier times, making supernovae then different from now, so they could not be used as standard candles. This sounds silly now, as we’ve figured out how to calibrate for intrinsic variations in the luminosities of Type Ia SN, but that is absolutely how we would have reacted in 1993, and no amount of improvements in the method would have convinced us. This is exactly what we did with faint galaxy counts: galaxies evolve; you can’t hope to understand that well enough to constrain cosmology. Do you ever hear them cited as evidence for Λ?
Great as the supernovae experiments to measure the metric genuinely were, they were not a discovery so much as a confirmation of what cosmologists had already decided to believe. There was no singular discovery that changed the way we all thought. There was a steady drip, drip, drip of results pointing towards Λ all through the ’90s – the age problem in which the oldest stars appeared to be older than the universe in which they reside, the early appearance of massive clusters and galaxies, the power spectrum of galaxies from redshift surveys that preceded Sloan, the statistics of gravitational lenses, and the repeated measurement of 1/4 < Ωm < 1/3 in a large variety of independent ways – just to name a few. By the mid-90’s, SCDM was dead. We just refused to bury it until we could accept ΛCDM as a replacement. That was what the Type Ia SN results really provided: a fresh and dramatic reason to accept the accelerated expansion that we’d already come to terms with privately but had kept hidden in the closet.
Note that the acoustic power spectrum of temperature fluctuations in the cosmic microwave background (as opposed to the mere existence of the highly uniform CMB) plays no role in this history. That’s because temperature fluctuations hadn’t yet been measured beyond their rudimentary detection by COBE. COBE demonstrated that temperature fluctuations did indeed exist (finally!) as they must, but precious little beyond that. Eventually, after the settling of much dust, COBE was recognized as one of many reasons why Ωm ≠ 1, but it was neither the most clear nor most convincing reason at that time. Now, in the 21st century, the acoustic power spectrum provides a great way to constrain what all the parameters of ΛCDM have to be, but it was a bit player in its development. The water there was carried by traditional observational cosmology using general purpose optical telescopes in a great variety of different ways, combined with a deep astrophysical understanding of how stars, galaxies, quasars and the whole menagerie of objects found in the sky work. All the vast knowledge incorporated in textbooks like those by Harrison, by Peebles, and by Peacock – knowledge that often seems to be lacking in scientists trained in the post-WMAP era.
Despite being a late arrival, the CMB power spectrum measured in 2000 by Boomerang and 2003 by WMAP did one important new thing to corroborate the ΛCDM picture. The supernovae data didn’t detect accelerated expansion so much as exclude the deceleration we had nominally expected. The data were also roughly consistent with a coasting universe (neither accelerating nor decelerating); the case for acceleration only became clear when we assumed that the geometry of the universe was flat (Ωm+ΩΛ = 1). That didn’t have to work out, so it was a great success of the paradigm when the location of the first peak of the power spectrum appeared in exactly the right place for a flat FLRW geometry.
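The role flatness plays here can be made explicit with the deceleration parameter, q0 = Ωm/2 − ΩΛ (for a universe of matter plus Λ). A coasting universe has q0 = 0; imposing Ωm + ΩΛ = 1 converts a measured low Ωm directly into acceleration. A minimal sketch (the Ωm values are illustrative, chosen to bracket the cases discussed in the text):

```python
# Deceleration parameter q0 = Omega_m/2 - Omega_Lambda; q0 < 0 means the
# expansion accelerates. Assuming flatness (Omega_m + Omega_Lambda = 1)
# turns the measured low Omega_m into a firmly negative q0.

def q0_flat(omega_m):
    """q0 for a flat matter + Lambda universe: Omega_Lambda = 1 - Omega_m."""
    return omega_m / 2 - (1 - omega_m)

for om in (1.0, 0.5, 0.3):
    print(f"Omega_m = {om}: q0 = {q0_flat(om):+.2f}")
# Omega_m = 1.0: q0 = +0.50  (EdS: decelerating)
# Omega_m = 0.5: q0 = -0.25
# Omega_m = 0.3: q0 = -0.55  (concordance: accelerating)
```

This is why the first-peak location mattered so much: it supplied the flatness assumption under which the supernova constraint becomes an unambiguous detection of acceleration rather than a mere exclusion of strong deceleration.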
The consistency of these data has given ΛCDM an air of invincibility among cosmologists. But a modern reconstruction of the Ostriker & Steinhardt diagram leaves zero room remaining – hence the tension between H0 = 73 measured directly and H0 = 67 from multiparameter CMB fits.
In cosmology, we are accustomed to having to find our way through apparently conflicting data. The difference between an expansion rate of 67 and 73 seems trivial given that the field was long riven – in living memory – by the dispute between 50 and 100. This gives rise to the expectation that the current difference is just a matter of some subtle systematic error somewhere. That may well be correct. But it is also conceivable that FLRW is inadequate to describe the universe, and we have been driven to the objectively bizarre parameters of ΛCDM because it happens to be the best approximation that can be obtained to what is really going on when we insist on approximating it with FLRW.
Though a logical possibility, that last sentence will likely drive many cosmologists to reach for their torches and pitchforks. Before killing the messenger, we should remember that we once endowed SCDM with the same absolute certainty we now attribute to ΛCDM. I was there, 3,000 internet years ago, when SCDM failed. There is nothing so sacred in ΛCDM that it can’t suffer the same fate, as has every single cosmology ever devised by humanity.
Today, we still lack definitive knowledge of either dark matter or dark energy. These add up to 95% of the mass-energy of the universe according to ΛCDM. These dark materials must exist.
It is Known, Khaleesi.
108 thoughts on “A personal recollection of how we learned to stop worrying and love the Lambda”
Indeed, 1/H0 is very close to the age of the universe. I do find it a curious coincidence that we are so close to the coasting limit. LCDM decelerates then accelerates, and will continue to accelerate forever into the future, but we happen to live just when these happen to balance out.
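The coincidence noted above can be checked numerically: for a flat FLRW universe the dimensionless age is t0·H0 = ∫₀¹ da / √(Ωm/a + ΩΛ·a²), and for Ωm ≈ 0.3 this comes out close to the coasting value of exactly 1. A minimal sketch using a crude midpoint integration (the step count is arbitrary):

```python
# Dimensionless age t0*H0 of a flat Lambda-CDM universe:
#   t0*H0 = integral_0^1 da / sqrt(Omega_m/a + Omega_L*a^2)
# The coincidence: early deceleration and late acceleration nearly cancel,
# leaving the age close to the coasting value 1/H0.

def age_in_hubble_times(omega_m, steps=100_000):
    """Midpoint-rule integration; flat geometry (Omega_L = 1 - Omega_m) assumed."""
    omega_l = 1.0 - omega_m
    da = 1.0 / steps
    total = 0.0
    for i in range(steps):
        a = (i + 0.5) * da  # integrand vanishes as a -> 0, so midpoints are safe
        total += da / (omega_m / a + omega_l * a * a) ** 0.5
    return total

print(f"EdS  (Omega_m=1.0): t0*H0 = {age_in_hubble_times(1.0):.3f}")  # ~ 0.667
print(f"LCDM (Omega_m=0.3): t0*H0 = {age_in_hubble_times(0.3):.3f}")  # ~ 0.964
```

So the ΛCDM age misses the coasting 1/H0 by only a few percent, while an Einstein-de Sitter universe falls short by a third – which is the quantitative content of the "curious coincidence."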
Do you really believe it is a coincidence? Or is it a clue that there is something deeper than LCDM going on? After all, wouldn’t a huge value of H0 for very small t be just what is needed to explain the apparent inflationary expansion of the universe just after the Big Bang?
There is no reason to doubt the possibility of coincidence. Of course, it may only APPEAR to be a coincidence; we might not understand it fully. In fact, we likely DON’T understand it fully.
We should not just declare it a “coincidence” and move on, but at the end of the day, it could well be. Why not?
I agree entirely. Coincidences happen, and denying the possibility of coincidence leads to conspiracy theories. That’s why I asked the question. Is it a coincidence, or is it not? I don’t think we have a definitive answer, and I retain an open mind.
To some extent, what we observe is dictated by the anthropic cosmological principle. We know that the Sun is about 4.6 billion years old and the oldest stars are just over 13 billion years old. Those stars have very low metallicity and it takes many generations of supernovae to seed the interstellar gas with enough metals to form stars with levels of metallicity like the Sun (and hence reasonable-sized rocky planets). If Ωm were = 1, then running the expansion of the universe backwards would give a Big Bang date about 9 billion years ago (2/3 of the age for the coasting value). So just us being here and able to observe stars as old as 13 billion years tells us that Ωm << 1. If you look back at the Sandage/de Vaucouleurs argument over H0, part of Sandage's justification for 50 km/s/Mpc was to allow the universe to be old enough to contain the oldest stars. Stacy's blog post from three years ago covers this well:
This, of course, does not account for the apparent flatness of the universe, but it illustrates that some coincidences may be more likely than they may at first appear.
But then the whole expansion thing could just be an artefact of the model, no?
Who knows? On Mondays, Wednesdays and Fridays I believe in the Big Bang, and on Tuesdays, Thursdays and Saturdays I believe in a steady-state universe. The whole week, I know I’ve got it wrong. But a lot of so-called `facts’ are certainly artefacts of models. My expertise is not in analysing the data, but in analysing the algebra and the logic of the theoretical models. Every model I have analysed so far is self-contradictory, so I don’t see how it is possible to trust any of their predictions in any circumstances. The fact that some models do produce some good predictions quite a lot of the time is something I put down partly to coincidence(!), and partly to the fact that no-one would consider a model that wasn’t at least consistent with most of the things that have already been seen (or would they?!).
Nice account of the history of these concepts. I’m no astrophysicist and I struggled to understand some of this, but it at least gave me insights into how the science works.
I see that Pesce et al. have published new H0 measurements (ApJL 891 L1) based on a reanalysis of megamasers that give H0 = 73.9 ± 3.0 km/s/Mpc (preprint https://arxiv.org/abs/2001.09213). What I particularly like about these megamaser measurements is the lack of dependence on the distance ladder. Also, the agreement between these and the SNIa and quasar lensing measurements of H0 suggests that there are no significant systematic errors in the distance ladder.
Yep, saw that. Hope to have something to add to it ourselves soon.
As an outsider to the cosmology field, I am afraid to tell the bad news, as I have spent my spare time in retirement defending and arguing for the GOOD news in over 1k blog comments of max 1k bytes – until accidentally finding Triton Station and its common-sense discussions. So here I go:
1) Bad news for cosmologists (but good news for other applications of modern physics foundations): ALL cosmology theories have been dead wrong after 1995 break-through of Tuomo Suntola’s structural systems concept of Dynamic Universe (DU).
2) Good news: Reason for the death of past cosmology in DU was killing the traditional theories of relativity and quantum mechanics in the global or cosmic scale and dimensions. This required removal of c=constant postulate replacing it with C4 speed of contraction or expansion coupled to the 4th METRIC dimension R4 of Riemann 4-sphere – the starting point of GR. Einstein’s biggest mistake was NOT Lambda -but ‘time’ in the space-time frame definition. And perhaps even bigger mistake of basing the field equations on forces with inverse square distance law vs. Suntola’s integrated forces or 0-balance motion and gravitational energies with inverse distance law.
DU basic energy balance equation of closed 3-surface of 4-sphere in starting point of homogeneous mass M distribution couples R4 to C4 along the barycenter direction of R4 with absolute scalar time T4 and simultaneity. A fifth grader can derive and solve C4 as f(1/sqrt(R4)) for continual balancing of constant total mass M in 3-D space volume as f(R4^3). Thus deceleration of C4 by a factor k=2 expands R4 by k^2=4 and T4 by k^3=8. Thus, deceleration speed C4 (=c within 1 ppm) has been 2 times present value at z=3 or R4=13.8/4=3.45 B ly and T4=2/3 13.8/8= 1.15 B years since T4=0.
The nonlinear DU Hubble flow F4 is closed form nonlinear function F4=1/T4 at decreased ticking rate of atomic clock that fooled Einstein and especially all cosmology attempts based on GR: The TRUE value C4 (and c in space) is decelerating today by -35.5 km/s/Mpc=-1.15 10^-18/s. BUT the frequency of atomic clock F4 is ALSO slowing – making the APPARENT or locally observable c in space = constant!
The traditional Hubble constant/flow is typically coupled to the EXPANSION DISTANCE dR4/R4=71 km/s/Mpc = 2.3 10^-18/s at small z-shift values of Mpc. Even the sign of H0 is confused in cosmology claiming large values today at z=0 to indicate ACCELERATION although C4 physical model in DU shows that dC4/C4=-1/2 dR4/R4! My past blogs at various physics sites have detailed the derivations of H0 and its effective expansion step H0_eff=1/2H0 that also explains some basic mistakes in theory and observations of LIGO/Virgo Gravitational Waves.