A “constant” that has wandered from about 500 to 50, then fractured into 63, 68, and 73, has no business being treated like settled cosmic law.

Modern cosmology uses the phrase Hubble constant with the tone usually reserved for bedrock physics. It sounds like the kind of number that belongs beside the speed of light. But the historical record is a humiliation, not a triumph of stability.

Edwin Hubble’s original estimate was around 500 km/s/Mpc. That was not a small miss: it implied a universe younger than the Earth. The value then fell dramatically, to around 180, then around 75, then even 50–55. Today, the public is often told the real debate is only between the high 60s and low 70s. But that framing is too convenient. Very recent work already points to around 63, while other methods still point to around 68 and around 73. That is not one constant. That is a family argument dressed up as certainty.

And every time the number comes down, something quietly happens that should make every serious thinker uneasy:

The universe gets older.

That is the most devastating part of the story. Under the Lambda-CDM framework, a lower Hubble constant implies a larger inferred cosmic age. So the standard model’s age of the universe has not been discovered once and then defended. It has been repeatedly renegotiated as the so-called constant keeps being corrected downward. First the universe was absurdly young. Then it became older. Then older still. And if the downward drift continues, it will become older again.
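
The arithmetic behind that renegotiation is simple. To first order, the inferred age of the universe scales as the Hubble time, t_H = 1/H0 (the exact Lambda-CDM age differs by a factor close to one, but the inverse relation holds). A minimal sketch of what each historical value implies:

```python
# Hubble time t_H = 1/H0 as a first-order proxy for cosmic age.
# (The true Lambda-CDM age differs by an O(1) factor near 1, but the
# inverse relation -- lower H0, older universe -- is what matters here.)

KM_PER_MPC = 3.0857e19   # kilometres in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one gigayear

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Return the Hubble time 1/H0 in gigayears, for H0 in km/s/Mpc."""
    seconds = KM_PER_MPC / h0_km_s_mpc
    return seconds / SEC_PER_GYR

for h0 in (500, 73, 68, 63):
    print(f"H0 = {h0:3d} km/s/Mpc  ->  t_H ~ {hubble_time_gyr(h0):5.1f} Gyr")
```

Hubble’s original 500 km/s/Mpc gives roughly 2 billion years, comfortably younger than the 4.5-billion-year Earth; today’s contested 73, 68, and 63 each imply a different cosmic age, spanning roughly 13.4 to 15.5 billion years.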

That is not how a true constant behaves. That is how a model behaves when one of its central numerical supports is more fragile than its defenders admit.

BFUT offers a very different interpretation. It does not need to deny that many galaxies are redshifted. It does not need to deny that more distant galaxies often show stronger recession signatures. It only needs to deny the lazy leap from “pattern observed” to “space itself is expanding and this number is sacred.”

BFUT proposes gravitational sorting.

Imagine galaxies over immense timescales occupying many trajectories. Some collide. Some merge. Some are deflected. Some disappear as separate visible systems. The long-term survivors are increasingly biased toward galaxies on non-intersecting trajectories. From any local vantage point, the surviving set tends to look as though most galaxies are receding. Faster survivors naturally end up farther away. A Hubble-like relation can emerge without requiring expanding space.

Once you see that, the history of H0 looks different. Its instability is no longer a technical nuisance around a perfect law. It becomes exactly what one might expect if the number is not fundamental but emergent. Different methods probe different populations, different scales, and different selection effects. Disagreement is not noise around truth. It is evidence that the quantity may have been misclassified from the beginning.

This is why BFUT’s prediction matters. If H0 is not a true universal constant but a statistical property of gravitationally sorted populations, then independent methods should continue to yield different values. And if local measurements improve, they may continue trending downward toward the deeper underlying relation.

In other words, the crisis is not that the Hubble constant is “under tension.”

The crisis is that it may never have been a constant at all.

Download the research paper: https://doi.org/10.5281/zenodo.19149786
Download the simulation code: https://zenodo.org/records/19124510
Watch the simulation work: https://vijayshankarsharma.com/