Casertano et al. have used Gaia to provide a small but important update in the debate over the value of the Hubble Constant. The ESA Gaia mission is measuring parallaxes for billions of stars. These are fundamental data that will advance astronomy in many ways, no doubt settling long-standing problems but also raising new ones – or complicating existing ones.

Traditional measurements of H0 are built on the distance ladder, in which distances to nearby objects are used to bootstrap outwards to more distant ones. This works, but it is also an invitation to the propagation of error: a mistake in the first step affects all the others. This long-standing concern informs the assumption that the tension between H0 = 67 km/s/Mpc from Planck and H0 = 73 km/s/Mpc from local measurements will be resolved by some systematic error – presumably in the calibration of the distance ladder.
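To make that propagation concrete, here is a toy sketch (all numbers invented for illustration) of how a fractional miscalibration in the ladder's first rung carries straight through to the inferred value of H0:

```python
# Toy illustration of error propagation in the distance ladder (invented numbers).
# Distances multiply up the ladder, so a fractional offset in the anchor shifts
# every rung – and hence the inferred H0 = v/d – by the same fraction.
anchor_bias = 1.02   # hypothetical 2% overestimate in the nearby (parallax) rung
true_H0 = 70.0       # km/s/Mpc, invented for illustration

inferred_H0 = true_H0 / anchor_bias   # every distance comes out 2% too long, so H0 comes out ~2% low
print(f"{inferred_H0:.1f} km/s/Mpc")  # ~68.6
```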

Well, not so far. Gaia has now measured parallaxes for enough Cepheids in our own Milky Way to test the calibration used to measure the distances to external galaxies via Cepheids. This was one of the shaky steps where things seemed most likely to go wrong. But no – the scales are consistent at the 0.3% level. For now, the direct measurement of the expansion rate remains H0 = 73 km/s/Mpc.
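As a rough illustration of what Gaia contributes here, the sketch below (a purely hypothetical Cepheid, with no treatment of the parallax zero-point offset or the selection biases a real analysis must handle) shows how a parallax turns into an absolute magnitude, which is what anchors the Cepheid period–luminosity calibration:

```python
import math

def distance_pc(parallax_mas):
    """Distance in parsecs from a parallax in milliarcseconds (naive 1/parallax)."""
    return 1000.0 / parallax_mas

def absolute_magnitude(apparent_mag, parallax_mas):
    """Absolute magnitude via the distance modulus m - M = 5 log10(d / 10 pc)."""
    d = distance_pc(parallax_mas)
    return apparent_mag - 5.0 * math.log10(d / 10.0)

# Hypothetical Galactic Cepheid: m = 6.0 mag, parallax = 2.0 mas (i.e. d = 500 pc)
print(absolute_magnitude(6.0, 2.0))   # about -2.5 mag
```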

13 thoughts on “Cepheids & Gaia: No Systematic in the Hubble Constant”

  1. So, what would happen to the Standard Model of Cosmology if one assumed that the Planck measurement is wrong and we used 73 km/s/Mpc as the value for H0? What problem(s) would this cause?


  2. In thinking about the Hubble Constant, I began to consider the measurement of redshift and what it actually represents.

    It appears to consist of two elements:
    1. The redshift of light measures the integrated metric expansion of space as light travels from the source to an Earth-based detector. This is the continuous expansion of the “metric frame” in which light moves, gradually stretching its wavelength over time. As far as I understand it, most of the observed redshift arises from this cause.
    2. There are also small redshift or blueshift changes that ride on the main redshift signal, due to something like conventional “Doppler shifts”; this must be true, otherwise we would not see galactic rotation curves, or the Doppler shifts in the cosmic microwave background radiation due to the Earth’s orbit around the Sun, plus the Sun’s orbit around the Galaxy, etc. (See the sketch below for how these two contributions combine.)
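    A minimal numerical sketch of that combination (values invented): redshifts compose multiplicatively in (1 + z), so a small peculiar-velocity Doppler shift rides on top of the expansion redshift rather than adding to it linearly.

    ```python
    # How the two listed contributions combine (invented values).
    z_cosmological = 0.10     # hypothetical metric-expansion redshift (cause 1)
    v_peculiar = 300.0        # km/s, hypothetical line-of-sight peculiar velocity (cause 2)
    c = 299792.458            # km/s

    z_doppler = v_peculiar / c                                # non-relativistic Doppler shift
    z_observed = (1 + z_cosmological) * (1 + z_doppler) - 1   # multiplicative composition
    print(f"{z_observed:.5f}")   # ~0.10110, slightly above the pure expansion value
    ```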

    Despite needing General Relativity to account for cause 1, it is cause 2 that causes the most confusion for me and perhaps for others. Conventional Doppler shifts just require a relative velocity difference between the light source and the detector. The speed of light itself does not change; only the wavelength and frequency of the light change, in inverse proportion.

    However, a “Doppler shift” across expanding space is not so simple. If you needed to include the relative velocity between source and detector at the time the light signal was emitted (something we can’t measure directly), plus the metric-expansion redshift due to cause 1, it seems to me that you would always end up with the same redshift measurement for all stellar objects in the sky. That is clearly wrong.

    Therefore it seems that the pre-existing difference in relative velocity between two simultaneous and distant “cosmological frames of rest” (e.g. our local CMBR rest frame) at the moment light is emitted from a source does not form part of the eventual “Doppler shift” element of the redshift measurement.

    This may mean that acceleration is the important explanatory attribute of cause 2, not relative velocity, as with a conventional Doppler shift within the same local cosmological frame of rest.

    On a related point: researchers at the University of Oxford have begun to question the accelerated expansion of the universe:
    http://www.ox.ac.uk/news/science-blog/universe-expanding-accelerating-rate-%E2%80%93-or-it

    In defence of the accelerated expansion, I have found this paper on the preprint archive:

    https://arxiv.org/abs/1611.00999

    See Figure 2 of this paper – scale factor vs. cosmological time (with t = 1 being the present time) – which is supposedly conclusive proof of an accelerating universe.

    However, I think that in the early universe we may be measuring the titanic struggle between the uniform expansion of space and the accelerated gravitational collapse of material on various distance and time scales, with the expansion of space eventually winning out. If this is true, then the “noise” in the data will only get larger the further back in time we view (still with no data points under the straight-line “Milne curve” given in Figure 2). In that case the CMBR is effectively an early data point in Figure 2, with redshift “noise” distributed over all distance scales. This conclusion is conjectural and incompatible with the LCDM curve in Figure 2 and the associated standard-model interpretation.


  3. Needless to say, I made some silly, obvious mistakes in my comments above.
    Rather than try again and make more mistakes: if you are sceptical about dark energy and worried about the tension among the various Hubble constant determinations, a relevant paper is

    https://arxiv.org/abs/1607.08797

    “Concordance cosmology without dark energy”

    I think it links up with what I was unsuccessfully trying to say.


  4. Hi, there is now a very simple way to calculate Hubble’s Constant, by inputting to an equation the numerical value of Pi and the speed of light (C) from Maxwell’s equations. NO space probe measurements (with their inevitable small measuring / interpretation errors) are now required. Hubble’s Constant is ‘fixed’ at 70.98047 PRECISELY. This maths method removes the errors / tolerances that are always a part of attempting to measure something as ‘elusive’ as Hubble’s Constant.
    The equation to perform this can be found in ‘The Principle of Astrogeometry’ on Amazon Kindle Books, David.


  5. There is a new paper in Nature Astronomy “Cosmological constraints from the Hubble diagram of quasars at high redshifts” by G. Risaliti & E. Lusso (also at https://arxiv.org/abs/1811.02590). The abstract reads:

    “The concordance model (Λ cold dark matter (ΛCDM) model, where Λ is the cosmological constant) reproduces the main current cosmological observations assuming the validity of general relativity at all scales and epochs and the presence of CDM and of Λ, equivalent to dark energy with a constant density in space and time. However, the ΛCDM model is poorly tested in the redshift interval between the farthest observed type Ia supernovae and the cosmic microwave background. We present measurements of the expansion rate of the Universe based on a Hubble diagram of quasars. Quasars are the most luminous persistent sources in the Universe, observed up to redshifts of z ≈ 7.5. We estimate their distances following a method developed by our group, based on the X-ray and ultraviolet emission of the quasars. The distance modulus/redshift relation of quasars at z < 1.4 is in agreement with that of supernovae and with the concordance model. However, a deviation from the ΛCDM model emerges at higher redshift, with a statistical significance of ~4σ. If an evolution of the dark energy equation of state is allowed, the data suggest dark energy density increasing with time."

    So it looks like more problems for the ΛCDM model.
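    For readers who want to see the comparison curve, here is a minimal sketch of the standard distance-modulus–redshift relation in flat ΛCDM (illustrative parameters H0 = 70 km/s/Mpc and Ωm = 0.3, radiation neglected); deviations from a curve like this at high redshift are what the quoted abstract reports.

    ```python
    # Distance modulus vs. redshift in flat LCDM (textbook relations, illustrative parameters).
    import numpy as np
    from scipy.integrate import quad

    C = 299792.458   # speed of light, km/s

    def luminosity_distance(z, H0=70.0, Om=0.3):
        """Luminosity distance in Mpc for flat LCDM, radiation neglected."""
        integrand = lambda zz: 1.0 / np.sqrt(Om * (1 + zz)**3 + (1.0 - Om))
        integral, _ = quad(integrand, 0.0, z)
        return (1 + z) * (C / H0) * integral

    def distance_modulus(z):
        """mu = 5 log10(d_L / 10 pc), with d_L converted from Mpc to pc."""
        return 5.0 * np.log10(luminosity_distance(z) * 1.0e6 / 10.0)

    for z in (0.5, 1.4, 5.0, 7.5):
        print(f"z = {z:4.1f}: mu = {distance_modulus(z):.2f} mag")
    ```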


  6. Stacy, yes I agree. I was trying to decide which of your postings it was most appropriate to put this comment under, as it is related to using quasars as standard candles.


  7. You need to consider 2 × a megaparsec × C, divided by Pi to the power of 21, as described in ‘The Principle of Astrogeometry’, which shows Hubble’s Constant is 70.98047 kilometres per second per megaparsec.
    Many initially make the basic error of not using the correct ratio value of a parsec, which is 3.26 light years. C is in kilometres per second. This Hubble equation has been tested by several mathematicians, and now needs to be included in the Wikipedia chart as the precise Hubble reference value. Without this, the Hubble Constant can never be pinned to its exact value by measurement, which always produces differing results depending on the measuring method used. This will never be settled without the Astrogeometry equation. With kind regards, David Hine
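    For anyone who wants to check the arithmetic as stated, the sketch below simply evaluates the recipe with standard values for the constants; reproducing the quoted number is a statement about the arithmetic only, not about any physical derivation.

    ```python
    # Evaluate the recipe as stated: 2 x (a megaparsec, in light years) x C / pi^21.
    import math

    C = 299792.458            # speed of light, km/s
    LY_PER_PARSEC = 3.26156   # light years per parsec (the comment rounds this to 3.26)
    LY_PER_MPC = LY_PER_PARSEC * 1.0e6

    H0_claimed = 2.0 * LY_PER_MPC * C / math.pi**21
    print(f"{H0_claimed:.5f}")   # ~70.98 (using 3.26 exactly gives ~70.94)
    ```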


  8. An interesting recent development is that a long-term project to estimate the Hubble constant via gravitational lensing recently reported H0 = 76.8 +/- 2.6 km/s/Mpc: https://arxiv.org/abs/1907.02533. That is consistent with other attempts to directly measure the expansion rate of the universe, but in considerable tension with the best multiparameter fit to the CMB as observed by Planck.
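    A quick back-of-the-envelope check of that tension, using the lensing numbers quoted above and taking roughly 0.5 km/s/Mpc as the Planck uncertainty (an assumed, representative value):

    ```python
    # Rough significance of the discrepancy, treating the two errors as independent Gaussians.
    h0_lensing, err_lensing = 76.8, 2.6   # km/s/Mpc, quoted above
    h0_planck, err_planck = 67.4, 0.5     # km/s/Mpc, assumed Planck fit and uncertainty

    tension_sigma = (h0_lensing - h0_planck) / (err_lensing**2 + err_planck**2) ** 0.5
    print(f"{tension_sigma:.1f} sigma")   # about 3.6 sigma
    ```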


      1. This is an interesting story. First, let me reiterate the danger of confirmation bias. There is a large segment of the cosmology community that wants to believe Planck above all else. Therefore, I expected some local measurements of H0 to migrate towards the Planck value, just as we saw with measurements of the primordial deuterium abundance after the first CMB measurements were made nearly 20 years ago. They had it right until they were told they got it wrong, so then they got it “new” right.

        Confirmation bias is a subtle thing. It’s not like we go out and say “I’m gonna confirm this result.” But if you know what you believe in your heart to be the “right” value, well, the numbers somehow seem to manage to follow the heart. Here, it may cut both ways.

        When I first saw the Freedman et al TRGB claim, this set off the concern I expressed above: the cynic in me muttered “and so the migration begins.” By which I mean local measurements of H0 gradually falling into line with what Planck tells them Must be so.

        TRGB is a great method, but I didn’t see how they were getting 70 out of it. TRGB is what calibrates most of the Tully-Fisher galaxies. If I use that in my own data, I get 75. (25 years ago, I got 72, so the calibration has been pretty stable.) I haven’t bothered writing this up because (1) I have enough to contend with as it is, and (2) 75 is exactly what Tully gets himself using the same method (https://arxiv.org/abs/1605.01765).

        That’s not to say Freedman et al got it wrong. I don’t know that. I have not examined their claim in detail, and have no plans to get in the middle of this particular fight. I just find it odd that they get a different number from their own Cepheid work (why should we believe this one over the previous one?), or from the TRGB work of others (e.g., Tully), especially when Riess had already shown the relative (not necessarily absolute) consistency of various local measurement techniques.

        But then Riess reanalyzes their data to get… the same answer he already got. Which may well be correct. But that’d also be where the confirmation bias flows… much as when Efstathiou (a leader of the Planck team) reanalyzed Riess’s data and found… exactly what Planck was finding.

        The article in Quanta is well written and reveals some interesting details, both scientific and sociological. Let me highlight one quote that seems off to me: “The Planck team predicts that the universe should expand at a rate of 67.4 km/s/Mpc.” The word “predicts” is emphasized in the article with an underline. I don’t blame the reporter for this, as he was likely being told this by some Impressive Cosmologist(s). But this isn’t a prediction at all. The use of the word “predict” here is a horrid abuse of its conventional meaning in the English language. Planck *fits* H0 as just one of many parameters. It is an indirect measurement – unlike the local measurements of distance and redshift, which directly measure the two axes of Hubble’s most famous diagram.

        In order to infer H0 from the Planck data, one not only has to fit simultaneously for all sorts of other parameters, one must also assume a model, in this case, LCDM. Part of the psychology that leads to calling Planck’s fit a “prediction” is an unquestioning faith in the underlying model. A less misleading statement would be that Planck requires H0 = 67.4 (+/- 1 or so) IF LCDM is the correct cosmology, and all other assumptions connecting their analysis to local observations hold.
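        A toy sketch of that model dependence (this is not the Planck likelihood; the sound horizon, recombination redshift, and parameter values below are illustrative assumptions): within flat LCDM the CMB pins down the angular acoustic scale, theta = r_s / D(z*), so the H0 one reads off depends on what else the fit assumes.

        ```python
        # Toy model: the same acoustic angle can be fit by different (H0, Omega_m) pairs,
        # which is why the H0 quoted from the CMB is conditional on the assumed cosmology.
        import numpy as np
        from scipy.integrate import quad
        from scipy.optimize import brentq

        C = 299792.458    # speed of light, km/s
        R_S = 147.0       # assumed comoving sound horizon at recombination, Mpc
        Z_STAR = 1090.0   # assumed recombination redshift

        def comoving_distance(z, H0, Om):
            """Comoving distance in Mpc for flat LCDM, radiation neglected (toy model)."""
            integrand = lambda zz: 1.0 / np.sqrt(Om * (1 + zz)**3 + (1.0 - Om))
            integral, _ = quad(integrand, 0.0, z)
            return (C / H0) * integral

        def theta_star(H0, Om):
            """Angular acoustic scale (comoving sound horizon over comoving distance, flat case)."""
            return R_S / comoving_distance(Z_STAR, H0, Om)

        # Fix a target angle using one fiducial model...
        theta_target = theta_star(67.4, 0.315)

        # ...then ask what H0 reproduces that SAME angle if a different Omega_m is assumed.
        for Om in (0.26, 0.315, 0.37):
            H0_fit = brentq(lambda H0: theta_star(H0, Om) - theta_target, 40.0, 120.0)
            print(f"Omega_m = {Om:.3f}  ->  H0 = {H0_fit:.1f} km/s/Mpc for the same acoustic angle")
        ```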

        That local measurements consistently find H0 > 70 is what leads people to call this a “tension” or even a “crisis”. If we believe the expansion rate measured locally genuinely differs from that required by the global fit to the CMB as observed by Planck, then that means cosmology is broken in some subtle yet profound way. Which is to say, LCDM is wrong – or at least, not adequate to explain all the data.

        No kidding.

