Hubble Constant: Gravitational-lensing measurements push Hubble-constant discrepancy past 5σ
• 2020: Little is known about what dark matter and dark energy, the dominant components of the universe, really are. But the standard model of Big Bang cosmology, known as ΛCDM (Lambda Cold Dark Matter), incorporates how they outwardly behave. Dark energy, the model presumes, takes the form of a cosmological constant Λ, a constant energy density per unit volume of vacuum. Dark matter, meanwhile, is nonrelativistic (or cold; the CDM stands for “cold dark matter”), and it interacts with itself and with ordinary matter only via gravity and possibly the weak force. 1)
With just a handful of free parameters, ΛCDM is appealing in its simplicity, and it generally agrees well with observations of the universe. But an exception is emerging in the Hubble constant H0, the universe’s present rate of expansion.
For ΛCDM to predict a value for H0, its free parameters must be constrained—for example, by a map of the cosmic microwave background (CMB), a picture of the spatial structure of the early universe. From 2009 to 2013, the Planck observatory measured the CMB with great resolution and precision; its map, combined with ΛCDM, yields an H0 of 67.4 ± 0.5 km/s/Mpc (kilometers per second per megaparsec). 2) The structure of the early universe can also be inferred from the distribution of galaxies today (see the article by Will Percival, Physics Today, December 2017, page 32); that approach gives the same prediction for H0, albeit with wider error bars.
But H0 can also be calculated directly from the distances to various astronomical objects and the velocities at which they’re apparently receding from Earth. (See the article by Mario Livio and Adam Riess, Physics Today, October 2013, page 41.) And direct measurements disagree with the ΛCDM value. The SH0ES (Supernova, H0, for the Equation of State of Dark Energy) collaboration has been homing in on an H0 measurement using so-called standard candles: type Ia supernovae and Cepheid variable stars, whose luminosities are known. The team’s latest H0 value, 74.0 ± 1.4 km/s/Mpc, differs from the ΛCDM value by 4.4 standard deviations. 3)
A difference of that magnitude, although unlikely to arise by chance alone, could still be due to some unappreciated systematic uncertainty in SH0ES’s methodology. That explanation, however, is starting to look much less likely in light of new work from the H0LiCOW (H0 Lenses in COSMOGRAIL’s Wellspring) collaboration, led by Sherry Suyu, which uses gravitationally lensed quasars to independently measure H0. In 2017 the collaboration published a first result based on three lensed quasars (see also Physics Today, April 2017, page 24). The current work, with key contributions by Kenneth Wong and Geoff Chen, extends the analysis to six quasars. 4) The result, 73.3 +1.7/−1.8 km/s/Mpc, agrees well with the SH0ES value. Combining the SH0ES and H0LiCOW measurements gives an H0 of 73.8 ± 1.1 km/s/Mpc, which is 5.3σ different from the ΛCDM prediction.
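The quoted combination and its significance follow from standard inverse-variance weighting. A minimal sketch, which approximately reproduces the quoted numbers (H0LiCOW's +1.7/−1.8 error bar is symmetrized to ±1.75 here, an assumption made for simplicity):

```python
import math

# Local measurements (km/s/Mpc): SH0ES and H0LiCOW, with H0LiCOW's
# asymmetric error bar symmetrized to +-1.75 for this illustration.
measurements = [(74.0, 1.4), (73.3, 1.75)]

# Inverse-variance weighted mean and its uncertainty.
weights = [1.0 / sigma**2 for _, sigma in measurements]
h0_local = sum(w * h for w, (h, _) in zip(weights, measurements)) / sum(weights)
sigma_local = 1.0 / math.sqrt(sum(weights))

# Tension with the Planck + LambdaCDM prediction, in standard deviations.
h0_cmb, sigma_cmb = 67.4, 0.5
tension = (h0_local - h0_cmb) / math.hypot(sigma_local, sigma_cmb)
```

The weighted mean lands near 73.8 ± 1.1 km/s/Mpc and the tension near 5.3σ, matching the figures in the text to within rounding.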
The theoretical basis for H0LiCOW’s method, called time-delay cosmography, dates back to a 1964 paper by Norwegian astrophysicist Sjur Refsdal. 5) But not until decades later did telescopes have the capability to implement it. When a quasar or other distant, luminous object lies directly in line with a massive foreground galaxy, its light can be so strongly bent that it appears to Earth-based observers as multiple images. Because the light in each image traverses a path of a different length and a different gravitational potential, as shown in Figure 1, any fluctuation in the quasar’s intensity shows up in the lensed images at different times.
Refsdal’s insight was that measuring those time differences (which are on the order of weeks) and the images’ angular deflections (on the order of arcseconds) provides crucial information about the absolute distances to the quasar and the foreground galaxy. The measurement doesn’t directly yield Dd (the distance from Earth to the galaxy), Ds (the distance from Earth to the quasar), or Dds (the distance from the galaxy to the quasar), but it does constrain their combination, which is enough information to calculate H0 from the objects’ known redshifts.
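In a flat ΛCDM background, the constrained combination is the so-called time-delay distance, D_Δt = (1 + z_d) · D_d D_s / D_ds, and every distance in it scales as 1/H0. A sketch with illustrative redshifts and Ωm = 0.3 (not H0LiCOW's full lens model):

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def E(z, omega_m=0.3, omega_lambda=0.7):
    # Dimensionless Hubble parameter for a flat Lambda-CDM universe.
    return math.sqrt(omega_m * (1 + z) ** 3 + omega_lambda)

def comoving_distance(z, H0, n=2000):
    # (c/H0) * integral of dz'/E(z') from 0 to z, trapezoid rule; Mpc.
    h = z / n
    s = 0.5 * (1 / E(0) + 1 / E(z)) + sum(1 / E(i * h) for i in range(1, n))
    return (C_KM_S / H0) * h * s

def time_delay_distance(z_lens, z_source, H0):
    # D_dt = (1 + z_d) * D_d * D_s / D_ds, angular diameter distances.
    dc_d = comoving_distance(z_lens, H0)
    dc_s = comoving_distance(z_source, H0)
    D_d = dc_d / (1 + z_lens)
    D_s = dc_s / (1 + z_source)
    D_ds = (dc_s - dc_d) / (1 + z_source)  # valid in a flat universe
    return (1 + z_lens) * D_d * D_s / D_ds
```

Because every term scales as 1/H0, halving H0 exactly doubles the predicted D_Δt; a time-delay distance measured from the lens therefore pins down H0 once the redshifts are known.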
Earth-based telescopes suffice to resolve the lensed images and monitor their time delays. For years, the COSMOGRAIL (Cosmological Monitoring of Gravitational Lenses) collaboration has employed 1- to 2-m telescopes around the world to keep an eye on dozens of confirmed lensed quasars; some of the recorded light curves are shown in Figure 2.
Measuring the time delays is just one piece of the puzzle. Another crucial ingredient in the H0 calculation is the lensing galaxy’s mass distribution, which is needed to calculate the deflection angles (which can’t be directly measured, because the quasar’s true position on the sky is unseen) and the gravitational effects on the light travel time for each image. The mass distribution isn’t observable, but it can be modeled from the precise positions and shapes of the lensed images. An effective model requires high-resolution images from the Hubble Space Telescope.
It also requires some good judgement. The choice of when to stop adjusting the mass-distribution model requires a subjective assessment of how well the model agrees with the observed data. The H0LiCOW researchers worried that their decisions might be influenced, even subconsciously, by the value of H0 they were hoping to get. So they developed a technique of blind data analysis, whereby they could work on the model and test it against the data without ever seeing the distance or H0 values it would yield. They agreed beforehand that once they settled on a mass distribution that looked good, there was no going back: They’d publish whatever results it yielded with no further modifications.
“Because our analysis was blind, we could have gotten any result,” says Wong. “So it was a bit surprising to find that we were within 1σ of SH0ES.”
Off the distance ladder
SH0ES, meanwhile, has been tackling its own challenges in precisely determining H0. Type Ia supernovae, which are luminous enough to be seen at great distances, are extremely effective tools for measuring relative cosmic distances: Their peak luminosities are nearly all the same, so supernovae that appear dimmer must be farther away. From the slight curve in the relationship between their distances and velocities (inferred from their redshifts) came the Nobel-winning discovery that the expansion of the universe is accelerating (see Physics Today, December 2011, page 14). That determination, however, was made without knowing the absolute distance to any of the supernovae under study, so it didn’t yield a precise value of the present-day expansion rate H0.
To convert the relative distances into absolute ones, astronomers use a hierarchy of measurements called the cosmic distance ladder (see the article by Daniel Holz, Scott Hughes, and Bernard Schutz, Physics Today, December 2018, page 34). The distances to nearby objects, within a thousand parsecs or so, can be accurately measured using the geometric method of parallax. But supernovae of any type are rare events, and there hasn’t been one close enough to Earth for many hundreds of years. Cepheid variable stars can bridge that gap. They’re both numerous enough to be well represented near Earth and bright enough to be visible at the same distances as the nearest supernovae. As discovered by Henrietta Leavitt a century ago, Cepheids’ luminosities are related to their pulsation periods, so their relative distances can be inferred from their apparent brightness.
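The Cepheid step can be sketched numerically via the distance modulus. The Leavitt-law coefficients below are illustrative V-band values, not the SH0ES calibration:

```python
import math

def cepheid_distance_pc(period_days, apparent_mag):
    """Distance to a Cepheid from its pulsation period and apparent
    magnitude, via an illustrative V-band Leavitt law (coefficients
    are representative values, not a team's actual calibration)."""
    # Period-luminosity relation: absolute magnitude from period.
    abs_mag = -2.43 * (math.log10(period_days) - 1.0) - 4.05
    # Distance modulus mu = m - M, then d = 10^(mu/5 + 1) parsecs.
    mu = apparent_mag - abs_mag
    return 10 ** (mu / 5.0 + 1.0)
```

For example, a 10-day Cepheid observed at apparent magnitude 11.0 would sit at roughly 10 kpc under this relation.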
SH0ES has been shoring up the links between parallax, Cepheids, and supernovae, and other groups have checked and rechecked them. But there remained the possibility that some aspect of the underlying physics—of supernova evolution, Cepheid pulsation, or the telescopes used to observe them—wasn’t understood as well as astronomers thought it was.
It’s important, therefore, that H0LiCOW and SH0ES get the same answer from independent methods. SH0ES’s measurement has nothing to do with gravitational lensing or modeling of galaxy mass distributions, and H0LiCOW’s has nothing to do with the mechanisms of Cepheids or supernovae. If SH0ES’s result is marred by a systematic error, H0LiCOW’s analysis would have to coincidentally include a different error of almost exactly the same magnitude and sign.
In high-energy physics, a signal with statistical significance of 5σ is the threshold for claiming discovery of a new particle or effect. (See, for example, Physics Today, September 2012, page 12, and August 2019, page 14.) The statistical meaning of a 5σ result is the same in all contexts: Assuming a Gaussian distribution of measurement fluctuations, there’s a 1 in 3.5 million chance that the result could arise by statistical fluctuations alone, in the absence of any underlying effect.
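The one-in-3.5-million figure is the one-sided Gaussian tail probability beyond 5σ, which is easy to verify:

```python
import math

def one_sided_p_value(n_sigma):
    # Probability of a Gaussian fluctuation at least n_sigma
    # above the mean (one-sided upper tail).
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

p5 = one_sided_p_value(5.0)  # about 2.9e-7, i.e. roughly 1 in 3.5 million
```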
But cosmologists so far have been reluctant to declare that the tension in H0 measurements must be a sign of physics beyond the ΛCDM model, in part because it’s not at all clear what that physics would be. There aren’t many ways the ΛCDM model could be modified that would both close the H0 gap and maintain the model’s agreement with all other measurements. Some of the possibilities theorists are exploring include dark radiation (relativistic dark particles, such as sterile neutrinos, whose wavelengths get stretched as the universe expands), non-Newtonian modifications to gravity, or a dark energy that’s not constant. But there’s no specific evidence, yet, of any of them, and a complete theoretical picture remains elusive.
The H0LiCOW researchers are working on adding more quasars to their analysis, with the goal of reducing their measurement uncertainty below 1%, or 0.7 km/s/Mpc. If their H0 value remains unchanged, such a measurement would be 5σ different from the ΛCDM on its own, independent of SH0ES or any other result.
Hubble Constant Measurements
• May 19, 2022: Spanning from 2003 to 2021, this collection of images from the NASA/ESA Hubble Space Telescope features galaxies that are all hosts to both Cepheid variables and supernovae. These two celestial phenomena are both crucial tools used by astronomers to determine astronomical distance, and have been used to refine our measurement of Hubble’s constant, the expansion rate of the Universe. 6)
- Establishing the distance of a celestial body is an enormous challenge for astronomers; it can be difficult to distinguish between objects that are dim and relatively close to the Earth and those which are bright and distant. To help overcome this challenge, astronomers have developed what is known as the cosmic distance ladder, a series of distance-determining methods, organised by the relative distances that they can measure. Two important steps in this ladder are Cepheid variables and supernovae: Cepheid variables because the period with which they pulsate can be used to calculate their distance; and supernovae because every type Ia supernova explosion reaches the same known luminosity, meaning that its brightness as viewed from Earth can be used to derive its distance. All the galaxies presented in this collection host Cepheid variables and have had at least one type Ia supernova explosion occur in them within the last 40 years. One of the galaxies, NGC 2525, even contained a supernova that was caught in real time in a remarkable timelapse.
- Even before it was launched, one of Hubble’s main science goals was to observe Cepheid variables and supernovae. These observations can help measure the expansion rate of the Universe, a value which astronomers call the Hubble constant. Generations of astronomers have refined this value over almost 30 years using data from more than 1000 hours of Hubble time. Most recently, a team of astronomers called SH0ES used observations of all the supernovae seen by Hubble in the last 40 years — including those in the galaxies pictured here — to determine the value of the Hubble constant as 73.04 ± 1.04 km/s/Mpc.
- “This is what the Hubble Space Telescope was built to do. You are getting the standard measure for the Universe from the gold standard of telescopes,” said Nobel Laureate Adam Riess of Johns Hopkins University in Baltimore, Maryland, who leads the SH0ES Team. “This is Hubble’s magnum opus.”
- Interestingly, the expansion rate determined from observational data from telescopes is significantly different from the value predicted by our current standard cosmological model of the Universe. The richness of the Hubble data means that this is vanishingly unlikely to have happened by a chance selection of misleading observations.
- The wide collection of Cepheid variable and supernovae-hosting galaxies observed by Hubble were picked out in six different proposals for observing time with the telescope. Whilst these proposals were part of Hubble’s decade-long quest to precisely measure the expansion rate of the Universe, the observations also produced a spate of beautiful galactic portraits, such as those of NGC 5643, NGC 7329, NGC 105 and NGC 3254. Still others have previously been featured in Hubble Pictures of the Week and other releases, including NGC 691, NGC 1559, NGC 2525, NGC 2608 and NGC 3147.
• May 19, 2022: Three Decades of Space Telescope Observations Converge on a Precise Value for the Hubble Constant. 7)
- Science history will record that the search for the expansion rate of the universe was the great Holy Grail of 20th century cosmology. Without observational evidence for space expanding, contracting, or standing still, we wouldn't have a clue as to whether the universe was coming or going. What's more, we wouldn't have a clue about its age either – or whether the universe was eternal.
- The first act of this revelation came when, a century ago, American astronomer Edwin Hubble discovered myriad galaxies outside of our home galaxy, the Milky Way. And, the galaxies weren't standing still. Hubble found that the farther a galaxy is, the faster it appears to be moving away from us. This could be interpreted as the uniform expansion of space. Hubble even said that he studied the galaxies simply as "markers of space." However he was never fully convinced of the idea of a uniformly expanding universe. He suspected his measurements might be evidence of something stranger going on in the universe.
- In the decades since Hubble, astronomers have toiled to nail down the expansion rate that would yield a true age for the universe. This required building a cosmic distance ladder assembled from objects whose intrinsic brightness astronomers know with reasonable confidence. The brightest, and therefore farthest detectable, milepost markers are Type Ia supernovae.
- When the Hubble Space Telescope was launched in 1990 the universe's expansion rate was so uncertain that its age might only be 8 billion years or as great as 20 billion years.
- After 30 years of meticulous work using the Hubble telescope's extraordinary observing power, numerous teams of astronomers have narrowed the expansion rate to a precision of just over 1%. This can be used to predict that the universe will double in size in 10 billion years.
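The doubling figure is a back-of-envelope consequence of exponential expansion at a constant rate, a ∝ exp(H0 t), for which the doubling time is ln 2 / H0. A quick check (treating H0 ≈ 73 km/s/Mpc as constant, which real dark-energy-dominated expansion only approximately justifies):

```python
import math

H0 = 73.0                   # measured expansion rate, km/s/Mpc
KM_PER_MPC = 3.0857e19      # kilometers in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

H0_per_s = H0 / KM_PER_MPC  # expansion rate in 1/s
# Doubling time for exponential growth at constant rate H0.
doubling_gyr = math.log(2) / H0_per_s / SECONDS_PER_GYR
# doubling_gyr comes out near 9.3, i.e. roughly 10 billion years
```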
- The measurement is about eight times more precise than Hubble's expected capability. But it's become more than just refining a number for cosmologists. In the interim came the discovery of dark energy, the mystery pushing the universe apart. To compound things further, the present expansion rate differs from the rate expected from observations of the universe as it appeared shortly after the big bang.
- You might think this would frustrate astronomers, but instead it opens the door to discovering new physics and to confronting unanticipated questions about the underlying workings of the universe. And, finally, it reminds us that we have a lot more to learn among the stars. See the dazzling Hubble collection of supernova host galaxies in Figure 3.
- Completing a nearly 30-year marathon, NASA's Hubble Space Telescope has calibrated more than 40 "milepost markers" of space and time to help scientists precisely measure the expansion rate of the universe — a quest with a plot twist.
- Pursuit of the universe's expansion rate began in the 1920s with measurements by astronomers Edwin P. Hubble and Georges Lemaître. In 1998, this led to the discovery of "dark energy," a mysterious repulsive force accelerating the universe's expansion. In recent years, thanks to data from Hubble and other telescopes, astronomers found another twist: a discrepancy between the expansion rate as measured in the local universe compared to independent observations from right after the big bang, which predict a different expansion value.
- The cause of this discrepancy remains a mystery. But Hubble data, encompassing a variety of cosmic objects that serve as distance markers, support the idea that something weird is going on, possibly involving brand new physics.
- "You are getting the most precise measure of the expansion rate for the universe from the gold standard of telescopes and cosmic mile markers," said Nobel Laureate Adam Riess of the Space Telescope Science Institute (STScI) and the Johns Hopkins University in Baltimore, Maryland.
- Riess leads a scientific collaboration investigating the universe's expansion rate called SHOES, which stands for Supernova, H0, for the Equation of State of Dark Energy. "This is what the Hubble Space Telescope was built to do, using the best techniques we know to do it. This is likely Hubble's magnum opus, because it would take another 30 years of Hubble's life to even double this sample size," Riess said.
- Riess's team's paper, to be published in the Special Focus issue of The Astrophysical Journal, reports on completing the biggest and likely last major update on the Hubble constant. The new results more than double the prior sample of cosmic distance markers. His team also reanalyzed all of the prior data, with the whole dataset now including over 1,000 Hubble orbits. 8)
- When NASA conceived of a large space telescope in the 1970s, one of the primary justifications for the expense and extraordinary technical effort was to be able to resolve Cepheids, stars that brighten and dim periodically, seen inside our Milky Way and external galaxies. Cepheids have long been the gold standard of cosmic mile markers since their utility was discovered by astronomer Henrietta Swan Leavitt in 1912. To calculate much greater distances, astronomers use exploding stars called Type Ia supernovae.
- Combined, these objects built a "cosmic distance ladder" across the universe and are essential to measuring the expansion rate of the universe, called the Hubble constant after Edwin Hubble. That value is critical to estimating the age of the universe and provides a basic test of our understanding of the universe.
- Starting right after Hubble's launch in 1990, the first set of observations of Cepheid stars to refine the Hubble constant was undertaken by two teams: the HST Key Project, led by Wendy Freedman, Robert Kennicutt, Jeremy Mould, and Marc Aaronson; and another by Allan Sandage and collaborators. Both used Cepheids as milepost markers to refine the distance measurements to nearby galaxies. By the early 2000s the teams declared "mission accomplished" by reaching an accuracy of 10 percent for the Hubble constant: 72 plus or minus 8 km/s per megaparsec.
- In 2005 and again in 2009, the addition of powerful new cameras onboard the Hubble telescope launched "Generation 2" of the Hubble constant research as teams set out to refine the value to an accuracy of just one percent. This was inaugurated by the SHOES program. Several teams of astronomers using Hubble, including SHOES, have converged on a Hubble constant value of 73 plus or minus 1 kilometer per second per megaparsec. While other approaches have been used to investigate the Hubble constant question, different teams have come up with values close to the same number.
- The SHOES team includes long-time leaders Dr. Wenlong Yuan of Johns Hopkins University, Dr. Lucas Macri of Texas A&M University, Dr. Stefano Casertano of STScI and Dr. Dan Scolnic of Duke University. The project was designed to bracket the universe by matching the precision of the Hubble constant inferred from studying the cosmic microwave background radiation leftover from the dawn of the universe.
- "The Hubble constant is a very special number. It can be used to thread a needle from the past to the present for an end-to-end test of our understanding of the universe. This took a phenomenal amount of detailed work," said Dr. Licia Verde, a cosmologist at ICREA and the ICC-University of Barcelona, speaking about the SHOES team's work.
- The team measured 42 of the supernova milepost markers with Hubble. Because they are seen exploding at a rate of about one per year, Hubble has, for all practical purposes, logged as many supernovae as possible for measuring the universe's expansion. Riess said, "We have a complete sample of all the supernovae accessible to the Hubble telescope seen in the last 40 years." Like the lyrics from the song "Kansas City," from the Broadway musical Oklahoma, Hubble has "gone about as fur as it c'n go!"
- Weird Physics?
- The expansion rate of the universe was predicted to be slower than what Hubble actually sees. By combining the Standard Cosmological Model of the Universe and measurements by the European Space Agency's Planck mission (which observed the relic cosmic microwave background from 13.8 billion years ago), astronomers predict a lower value for the Hubble constant: 67.5 plus or minus 0.5 kilometers per second per megaparsec, compared to the SHOES team's estimate of 73.
- Given the large Hubble sample size, said Riess, there is only a one-in-a-million chance that astronomers are wrong due to an unlucky draw, a common threshold for taking a problem seriously in physics. This finding is untangling what was becoming a nice and tidy picture of the universe's dynamical evolution. Astronomers are at a loss for an explanation of the disconnect between the expansion rate of the local universe versus the primeval universe, but the answer might involve additional physics of the universe.
- Such confounding findings have made life more exciting for cosmologists like Riess. Thirty years ago they started out to measure the Hubble constant to benchmark the universe, but now it has become something even more interesting. "Actually, I don't care what the expansion value is specifically, but I like to use it to learn about the universe," Riess added.
- NASA's new Webb Space Telescope will build on Hubble's work by showing these cosmic milepost markers at greater distances or sharper resolution than Hubble can achieve.
- The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Maryland, conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy in Washington, D.C.
• January 20, 2022: For the first time, scientists believe they have detected a merger of two black holes with eccentric orbits. According to a paper published in Nature Astronomy by researchers from Rochester Institute of Technology’s (RIT) Center for Computational Relativity and Gravitation (CCRG) and the University of Florida, this can help explain how some of the black hole mergers detected by LIGO Scientific Collaboration and the Virgo Collaboration are much heavier than previously thought possible. 9) 10)
- Eccentric orbits are a sign that black holes could be repeatedly gobbling up others during chance encounters in areas densely populated with black holes such as galactic nuclei. The scientists studied the most massive gravitational wave binary observed to date, GW190521, to determine if the merger had eccentric orbits.
- “The estimated masses of the black holes are more than 70 times the size of our sun each, placing them well above the estimated maximum mass predicted currently by stellar evolution theory,” said Carlos Lousto, a professor in the School of Mathematical Sciences and a member of the CCRG. “This makes an interesting case to study as a second generation binary black hole system and opens up to new possibilities of formation scenarios of black holes in dense star clusters.”
- A team of RIT researchers including Lousto, Research Associate James Healy, Jacob Lange ’20 Ph.D. (astrophysical sciences and technology), Professor and CCRG Director Manuela Campanelli, Associate Professor Richard O’Shaughnessy, and collaborators from the University of Florida formed to give a fresh look at the data to see if the black holes had highly eccentric orbits before they merged. They found the merger is best explained by a high-eccentricity, precessing model. To achieve this, the team performed hundreds of new full numerical simulations in local and national lab supercomputers, taking nearly a year to complete.
- “This represents a major advancement in our understanding of how black holes merge,” said Campanelli. “Through our sophisticated supercomputer simulations and the wealth of new data provided by LIGO and Virgo’s rapidly advancing detectors, we are making new discoveries about the universe at astonishing rates.”
- An extension of this analysis by the same RIT and UFL team used a possible electromagnetic counterpart observed by the Zwicky Transient Facility to independently compute the cosmological Hubble constant, treating GW190521 as an eccentric binary black hole merger. They found excellent agreement with the expected values and recently published the work in the Astrophysical Journal. 11)
• June 11, 2020: A new set of precision distance measurements made with an international collection of radio telescopes have greatly increased the likelihood that theorists need to revise the “standard model” that describes the fundamental nature of the Universe. 12)
The new distance measurements allowed astronomers to refine their calculation of the Hubble Constant, the expansion rate of the Universe, a value important for testing the theoretical model describing the composition and evolution of the Universe. The problem is that the new measurements exacerbate a discrepancy between previously measured values of the Hubble Constant and the value predicted by the model when applied to measurements of the cosmic microwave background made by the Planck satellite.
“We find that galaxies are nearer than predicted by the standard model of cosmology, corroborating a problem identified in other types of distance measurements. There has been debate over whether this problem lies in the model itself or in the measurements used to test it. Our work uses a distance measurement technique completely independent of all others, and we reinforce the disparity between measured and predicted values. It is likely that the basic cosmological model involved in the predictions is the problem,” said James Braatz, of the National Radio Astronomy Observatory (NRAO).
Braatz leads the Megamaser Cosmology Project, an international effort to measure the Hubble Constant by finding galaxies with specific properties that lend themselves to yielding precise geometric distances. The project has used the National Science Foundation’s Very Long Baseline Array (VLBA), Karl G. Jansky Very Large Array (VLA), and Robert C. Byrd Green Bank Telescope (GBT), along with the Effelsberg telescope in Germany. The team reported their latest results in the Astrophysical Journal Letters. 13)
Edwin Hubble, after whom the orbiting Hubble Space Telescope is named, first calculated the expansion rate of the universe (the Hubble Constant) in 1929 by measuring the distances to galaxies and their recession speeds. The more distant a galaxy is, the greater its recession speed from Earth. Today, the Hubble Constant remains a fundamental property of observational cosmology and a focus of many modern studies.
Measuring recession speeds of galaxies is relatively straightforward. Determining cosmic distances, however, has been a difficult task for astronomers. For objects in our own Milky Way Galaxy, astronomers can get distances by measuring the apparent shift in the object’s position when viewed from opposite sides of Earth’s orbit around the Sun, an effect called parallax. The first such measurement of a star’s parallax distance came in 1838.
Beyond our own Galaxy, parallaxes are too small to measure, so astronomers have relied on objects called “standard candles,” so named because their intrinsic brightness is presumed to be known. The distance to an object of known brightness can be calculated based on how dim the object appears from Earth. These standard candles include a class of stars called Cepheid variables and a specific type of stellar explosion called a Type Ia supernova.
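The inverse-square reasoning can be made concrete: for two standard candles of equal intrinsic luminosity, the distance ratio is the square root of the inverse flux ratio.

```python
import math

def distance_ratio(flux_near, flux_far):
    """Relative distance of two standard candles with the same
    intrinsic luminosity, from measured fluxes: F = L / (4 pi d^2),
    so d_far / d_near = sqrt(F_near / F_far)."""
    return math.sqrt(flux_near / flux_far)
```

A supernova appearing 100 times dimmer than an otherwise identical one is therefore 10 times farther away.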
Another method of estimating the expansion rate involves observing distant quasars whose light is bent by the gravitational effect of a foreground galaxy into multiple images. When the quasar varies in brightness, the change appears in the different images at different times. Measuring this time difference, along with calculations of the geometry of the light-bending, yields an estimate of the expansion rate.
Determinations of the Hubble Constant based on the standard candles and the gravitationally lensed quasars have produced figures of 73–74 km/s/Mpc (kilometers per second per megaparsec, the units favored by astronomers).
However, predictions of the Hubble Constant from the standard cosmological model when applied to measurements of the cosmic microwave background (CMB) — the leftover radiation from the Big Bang — produce a value of 67.4, a significant and troubling difference. This difference, which astronomers say is beyond the experimental errors in the observations, has serious implications for the standard model.
The model is called Lambda Cold Dark Matter (ΛCDM), where “Lambda” refers to Einstein’s cosmological constant and is a representation of dark energy. The model divides the composition of the Universe mainly between ordinary matter, dark matter, and dark energy, and describes how the Universe has evolved since the Big Bang.
The Megamaser Cosmology Project focuses on galaxies with disks of water-bearing molecular gas orbiting supermassive black holes at the galaxies’ centers. If the orbiting disk is seen nearly edge-on from Earth, bright spots of radio emission, called masers — radio analogs to visible-light lasers — can be used to determine both the physical size of the disk and its angular extent, and therefore, through geometry, its distance. The project’s team uses the worldwide collection of radio telescopes to make the precision measurements required for this technique.
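The geometric chain can be sketched directly: drifts in the maser lines give the centripetal acceleration a, so Keplerian motion fixes the disk radius as R = v²/a, and dividing by the VLBI-measured angular radius gives the distance. The inputs below are invented, self-consistent numbers, not measurements of any real galaxy:

```python
KM_PER_PC = 3.0857e13  # kilometers in one parsec

def maser_distance_mpc(v_rot_km_s, accel_km_s2, theta_rad):
    """Geometric distance to an edge-on megamaser disk.

    v_rot: maser rotation speed; accel: centripetal acceleration from
    maser line drifts; theta: angular radius of the disk from VLBI.
    """
    radius_km = v_rot_km_s ** 2 / accel_km_s2  # Keplerian: R = v^2 / a
    radius_pc = radius_km / KM_PER_PC
    return radius_pc / theta_rad / 1e6  # small angle: D = R / theta, in Mpc

# Invented, self-consistent inputs: a 0.1 pc disk seen at 8 Mpc.
d_mpc = maser_distance_mpc(900.0, 2.625e-7, 1.25e-8)
# A hypothetical recession velocity then yields H0 = v / D directly,
# with no rungs of the distance ladder involved.
h0 = 591.0 / d_mpc  # roughly 73.9 km/s/Mpc for these invented inputs
```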
In their latest work, the team refined their distance measurements to four galaxies, at distances ranging from 168 million light-years to 431 million light-years. Combined with previous distance measurements of two other galaxies, their calculations produced a value for the Hubble Constant of 73.9 kilometers per second per megaparsec.
“Testing the standard model of cosmology is a really challenging problem that requires the best-ever measurements of the Hubble Constant. The discrepancy between the predicted and measured values of the Hubble Constant points to one of the most fundamental problems in all of physics, so we would like to have multiple, independent measurements that corroborate the problem and test the model. Our method is geometric, and completely independent of all others, and it reinforces the discrepancy,” said Dom Pesce, a researcher at the Center for Astrophysics | Harvard and Smithsonian, and lead author on the latest paper.
“The maser method of measuring the expansion rate of the universe is elegant, and, unlike the others, based on geometry. By measuring extremely precise positions and dynamics of maser spots in the accretion disk surrounding a distant black hole, we can determine the distance to the host galaxies and then the expansion rate. Our result from this unique technique strengthens the case for a key problem in observational cosmology.” said Mark Reid of the Center for Astrophysics | Harvard and Smithsonian, and a member of the Megamaser Cosmology Project team.
“Our measurement of the Hubble Constant is very close to other recent measurements, and statistically very different from the predictions based on the CMB and the standard cosmological model. All indications are that the standard model needs revision,” said James Braatz of the National Radio Astronomy Observatory (NRAO), who leads the Megamaser Cosmology Project.
Astronomers have proposed various ways to adjust the model to resolve the discrepancy. Some proposals change assumptions about the nature of dark energy, moving away from Einstein’s cosmological constant. Others invoke fundamental changes in particle physics, such as altering the number or types of neutrinos or introducing interactions among them. There are still more exotic possibilities, and at the moment scientists have no clear evidence for discriminating among them.
“This is a classic case of the interplay between observation and theory. The Lambda CDM model has worked quite well for years, but now observations clearly are pointing to a problem that needs to be solved, and it appears the problem lies with the model,” Pesce said.
The National Radio Astronomy Observatory is a facility of the NSF (National Science Foundation), operated under cooperative agreement by Associated Universities, Inc.
1) Johanna L. Miller, ”Gravitational-lensing measurements push Hubble-constant discrepancy past 5σ," Physics Today, Vol. 73, No. 3, 2020, https://doi.org/10.1063/PT.3.4424, URL: https://physicstoday.scitation.org/doi/pdf/10.1063/PT.3.4424, Published by the American Institute of Physics
2) Planck Collaboration: N. Aghanim, Y. Akrami, M. Ashdown, J. Aumont, C. Baccigalupi, M. Ballardini, A. J. Banday, R. B. Barreiro, N. Bartolo, S. Basak, R. Battye, K. Benabed, J.-P. Bernard, M. Bersanelli, P. Bielewicz, J. J. Bock, J. R. Bond, J. Borrill, F. R. Bouchet, F. Boulanger, M. Bucher, C. Burigana, R. C. Butler, E. Calabrese, J.-F. Cardoso, J. Carron, A. Challinor, H. C. Chiang, J. Chluba, L. P. L. Colombo, C. Combet, D. Contreras, B. P. Crill, F. Cuttaia, P. de Bernardis, G. de Zotti, J. Delabrouille, J.-M. Delouis, E. Di Valentino, et al., ”Planck 2018 results. VI. Cosmological parameters,” Astronomy & Astrophysics manuscript, September 24, 2019, URL: https://arxiv.org/pdf/1807.06209.pdf
3) Adam G. Riess, Stefano Casertano, Wenlong Yuan, Lucas M. Macri, and Dan Scolnic, ”Large Magellanic Cloud Cepheid Standards Provide a 1% Foundation for the Determination of the Hubble Constant and Stronger Evidence for Physics beyond ΛCDM,” The Astrophysical Journal, Volume 876, Number 1, Published: 7 May 2019, URL: https://iopscience.iop.org/article/10.3847/1538-4357/ab1422/pdf
4) V. Bonvin, F. Courbin, S. H. Suyu, P. J. Marshall, C. E. Rusu, D. Sluse, M. Tewes, K. C. Wong, T. Collett, C. D. Fassnacht, T. Treu, M. W. Auger, S. Hilbert, L. V. E. Koopmans, G. Meylan, N. Rumbaugh, A. Sonnenfeld, C. Spiniello, ”H0LiCOW – V. New COSMOGRAIL time delays of HE 0435-1223: H0 to 3.8 per cent precision from strong lensing in a flat ΛCDM model,” Monthly Notices of the Royal Astronomical Society, Volume 465, Issue 4, March 2017, Pages 4914–4930, https://doi.org/10.1093/mnras/stw3006
5) Sjur Refsdal, ”On the Possibility of Determining Hubble's Parameter and the Masses of Galaxies from the Gravitational Lens Effect,” MNRAS (Monthly Notices of the Royal Astronomical Society), Volume 128, Issue 4, Published: 01 September 1964, Pages 307–310, https://doi.org/10.1093/mnras/128.4.307
6) ”A dazzling Hubble collection of supernova host galaxies,” ESA Science & Exploration, 19 May 2022, URL: https://www.esa.int/ESA_Multimedia/Images/2022/05/A_dazzling_Hubble_collection_of_supernova_host_galaxies
7) ”Hubble Reaches New Milestone in Mystery of Universe's Expansion Rate,” NASA Hubblesite News, 19 May 2022, Release ID: 2022-005, URL: https://hubblesite.org/contents/news-releases/2022/news-2022-005.html
8) Adam G. Riess, Wenlong Yuan, Lucas M. Macri, Dan Scolnic, Dillon Brout, Stefano Casertano, David O. Jones, Yukei Murakami, Gagandeep S. Anand, Louise Breuval, Thomas G. Brink, Alexei V. Filippenko, Samantha Hoffmann, Saurabh W. Jha, W. D’Arcy Kenworthy, John Mackenty, Benjamin E. Stahl, and Weikang Zheng, ”A Comprehensive Measurement of the Local Value of the Hubble Constant with 1 km s-1 Mpc-1 Uncertainty from the Hubble Space Telescope and the SH0ES Team,” Draft version, January 12, 2022
9) ”RIT scientists confirm a highly eccentric black hole merger for the first time,” RIT News, 20 January 2022, URL: https://www.rit.edu/news/rit-scientists-confirm-highly-eccentric-black-hole-merger-first-time
10) V. Gayathri, J. Healy, J. Lange, B. O'Brien, M. Szczepańczyk, I. Bartos, M. Campanelli, S. Klimenko, C. O. Lousto, and R. O'Shaughnessy, ”Eccentricity estimate for black hole mergers with numerical relativity simulations,” Nature Astronomy, Published: 20 January 2022, https://doi.org/10.1038/s41550-021-01568-w
11) V. Gayathri, J. Healy, J. Lange, B. O'Brien, M. Szczepanczyk, I. Bartos, M. Campanelli, S. Klimenko, C. O. Lousto, and R. O'Shaughnessy, ”Measuring the Hubble Constant with GW190521 as an Eccentric Black Hole Merger and Its Potential Electromagnetic Counterpart,” The Astrophysical Journal Letters, Volume 908, Number 2, Published: 23 February 2021, https://doi.org/10.3847/2041-8213/abe388, URL: https://iopscience.iop.org/article/10.3847/2041-8213/abe388/pdf
12) ”New Distance Measurements Bolster Challenge to Basic Model of Universe,” NRAO News, 11 June 2020, URL: https://public.nrao.edu/news/challenge-model-of-universe/
13) D. W. Pesce, J. A. Braatz, M. J. Reid, A. G. Riess, D. Scolnic, J. J. Condon, F. Gao, C. Henkel, C. M. V. Impellizzeri, C. Y. Kuo, and K. Y. Lo, ”The Megamaser Cosmology Project. XIII. Combined Hubble Constant Constraints,” The Astrophysical Journal Letters, Volume 891, Number 1, Published: 26 February 2020, https://doi.org/10.3847/2041-8213/ab75f0
The information compiled and edited in this article was provided by Herbert J. Kramer from his documentation of: ”Observation of the Earth and Its Environment: Survey of Missions and Sensors” (Springer Verlag) as well as many other sources after the publication of the 4th edition in 2002. - Comments and corrections to this article are always welcome for further updates (email@example.com).