Back in the 1980s, astronomers were caught up in a debate so huge, you could drive a universe through it. The point of contention was a number called the Hubble constant, which describes the rate at which the cosmos is expanding and, by extension, how much time has passed since the Big Bang: the slower the expansion rate, the older the universe.
On one side was Allan Sandage, the towering successor to Edwin Hubble at Mount Wilson Observatory, who calculated that the universe was roughly 20 billion years old. On the other side was a group of apostates whose observations showed it was only about half that old. Tempers ran hot.
The Hubble constant is measured in arcane units (kilometers per second per megaparsec), so for brevity, the researchers would use just the number itself. “We measure the Hubble constant to be 100,” one of the upstarts would announce in a talk. “It’s 50,” Sandage would roar in response, ridiculing his colleagues for their flawed measurements. For me, a young reporter covering the field at the time, it was a revelatory display of scientific passion. The entire history of the universe was at stake!
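The link between those dueling numbers and the dueling ages is simple arithmetic: to a first approximation (ignoring how gravity and dark energy change the expansion rate over cosmic history), the age of the universe is just the reciprocal of the Hubble constant. Here is a rough back-of-the-envelope sketch of that calculation in Python; the unit conversions are standard astronomical values, not figures from this article:

```python
# Back-of-the-envelope "Hubble time": age ~ 1 / H0. This ignores the fact
# that the expansion rate changes over time (matter slows it down, dark
# energy speeds it up), so it is only a first approximation.

KM_PER_MPC = 3.0857e19      # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7  # seconds in one year

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Approximate age of the universe, in billions of years,
    for a Hubble constant given in km/s/Mpc."""
    h0_per_second = h0_km_s_mpc / KM_PER_MPC  # convert H0 to 1/s
    age_seconds = 1.0 / h0_per_second         # Hubble time in seconds
    return age_seconds / SECONDS_PER_YEAR / 1e9

for h0 in (50, 100, 70):
    print(f"H0 = {h0:>3} km/s/Mpc  ->  ~{hubble_time_gyr(h0):.1f} billion years")

# H0 =  50  ->  ~19.6 billion years (Sandage's ~20-billion-year universe)
# H0 = 100  ->  ~ 9.8 billion years (the upstarts' half-as-old universe)
# H0 =  70  ->  ~14.0 billion years (close to today's accepted 13.8)
```

The last line hints at where the debate ended up: a Hubble constant near 70, which in a full cosmological model (where the corrections roughly cancel) yields an age close to the now-accepted 13.8 billion years.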
Three decades later, a revived debate on the Hubble constant has our understanding of the universe hanging in the balance all over again. At least the old shouting matches are long settled. Vastly improved data from the Hubble Space Telescope and the Planck satellite showed that Sandage and his rivals were both wrong; the true age of the universe lies in between, at 13.8 billion years. With this new clarity, though, a new conflict has popped into view.