‘The Unknown Universe’: From Stuart Clark’s Fascinating Guide to Modern Cosmology


Modern cosmology has a civic duty to provide a guide for those of us who aren’t in control of its instruments or math. The ideal guide would be sensitive to observations both standard and radically new, and it would factor in the personalities of the scientists. It would be a well-written work of the humanities and the sciences. And it would be knowledgeable — both skeptical and open-minded — about the ongoing paradigm shift in our understanding of deep space (of black holes, gravity, and even a possible multiverse). Such a guide, I’m sure, would look exactly like Stuart Clark’s The Unknown Universe. The introduction to Clark’s pathbreaking book is excerpted below. — Jonathon Sturgeon, Literary Editor


From the Introduction to The Unknown Universe: A New Exploration of Time, Space, and Modern Cosmology

The Universe we live in today is a hierarchy of shining structures. Stars are gravitationally bound to each other in rotating collections known as galaxies. Galaxies are gravitationally bound to each other in collections known as clusters, and the clusters are strung through space in filaments that make up the cosmic web. All of this magnificence grew from the minute density variations present in the microwave background.

These variations are therefore the essential starting point for computer programs, called models, that mimic the evolution of the Universe. Crudely, the trick is to take the microwave pattern and see if our understanding of physics can transform it into the cosmic web of today’s Universe.

The models themselves are mathematical recipes that take the laws of physics as their foundation and then add the ‘ingredients’ of the Universe. For cosmology, gravity is the essential law of physics. There are three other forces of nature, but these play only minor roles in shaping the Universe overall.

The model’s ingredients are given by six parameters. The first two are measured from the mottling in the microwave background. Parameter one is their amplitude: in other words, how great the variations in the gas density are across the Universe. The second is to do with the volume of space in which these variations occur. Some are small-volume fluctuations; others are much larger. This parameter measures the difference in amplitude between the smallest and largest volumes.
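
(For readers who want the mathematical shorthand: in the technical literature these first two parameters are usually written as the amplitude and the spectral index of the primordial power spectrum. That notation is not used in the book itself; the sketch below is offered only as an illustration.)

\[
  \mathcal{P}(k) = A_s \left( \frac{k}{k_0} \right)^{n_s - 1}
\]
% A_s : the amplitude of the variations (parameter one)
% n_s : the spectral index, which captures how the amplitude differs
%       between the smallest and largest fluctuations (parameter two)
% k   : the wavenumber, roughly the inverse of a fluctuation's size;
%       k_0 is a fixed reference scale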

Then we come to the contents of the Universe. A central theme of this book will be the path trodden by cosmologists in their attempts to define the average density of matter and energy in the Universe. It has proved to be anything but easy. To make their models work with any semblance of success, they have been forced to assume that the ordinary atoms making up the stars, planets and life constitute no more than 4 percent of the total contents of the Universe. The other 96 percent of everything is in forms of matter and energy that are unknown to us. Worse than this, the calculations show that they are almost beyond our ability to detect directly. Cosmologists call the unknown stuff dark matter and dark energy, and they infer its existence by measuring the movement of galaxies.

The majority of galaxies appear to be rotating too quickly, or moving away from us in space at an ever-accelerating rate. Hence, the cosmologists believe that they need dark matter to spin the galaxies faster, and dark energy to push them away from us more quickly. These three ingredients – atoms, dark matter and dark energy – can be summarized in just two parameters because they are all dependent on one another. If you know the proportion of any two, the third can be simply deduced.
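
(To see why two numbers suffice, in symbols: the standard model assumes a spatially flat Universe, so the three density fractions must add up to the whole. A minimal sketch under that assumption, ignoring tiny contributions such as radiation, and not taken from the book itself:)

\[
  \Omega_{\text{atoms}} + \Omega_{\text{dark matter}} + \Omega_{\text{dark energy}} = 1
\]
% For illustration: if atoms make up about 5 percent and dark matter about
% 27 percent, dark energy must supply the remaining 68 percent or so.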

The fifth parameter of the standard model of cosmology is related to when the first stars formed. This point in cosmic history lies beyond the reach of even our best telescopes as yet. It was a catastrophic event in which almost every hydrogen atom in the Universe was ripped apart because the newly formed stars released huge amounts of destructive ultraviolet light. It occurred after the release of the cosmic microwave background and determines how easy it is for the microwaves to travel uninterrupted through space.

The sixth and final parameter is the Universe’s expansion rate. This is known as Hubble’s constant, after the American astronomer Edwin Hubble, who published definitive evidence of the expansion of the Universe in 1929.

In a perfect scenario, cosmologists would measure each of these parameters using completely independent means, plug them into the model, and out would come an answer that perfectly matched the distribution of galaxies in today’s Universe. In reality, it is not that easy. Some of the parameters can be measured; others have to be estimated.

Then there are the assumptions, such as the existence of dark matter and dark energy, and the little mathematical fudges that have to be put into the model to turn it into a calculation that can be solved. If one of these is wrong, then the model itself is wrong and what we thought we knew about the Universe would evaporate before our eyes.

Having said that, confidence in the standard model took a huge leap forward thanks to the work of a NASA spacecraft called WMAP, the Wilkinson Microwave Anisotropy Probe. It was a forerunner to Planck and launched in June 2001. The word ‘anisotropy’ is the technical term for the density variations in the Universe, and for nine years WMAP repeatedly observed these. It hugely improved the accuracy of the standard model’s first two parameters, and as a result improved the accuracy of the model by a factor of more than 68,000.

On the face of it, there seemed to be little doubt that the standard model must be substantially correct, and cosmologists began to trumpet their victory. The WMAP website lists ten achievements that follow from the use of WMAP data and the standard model. From the age of the Universe to the percentage of ordinary atoms, cosmology was said to have entered an era of ‘precision’. What the website’s list of achievements did not mention were the data that the standard model struggled to explain.

WMAP had seen a hint that the mottling in one part of the sky was deeper than the standard model allowed. It was dubbed ‘the cold spot’ because the anisotropies can be translated into temperatures, but the detection was so slight that some thought it could have been a bit of instrumental noise.

So a key question was: had Planck seen it too?

There were also more general concerns about the ingredients of the standard model: namely dark matter and dark energy. After decades of theoretical work and experimentation, no one has been able to conclusively detect a single piece of dark matter. As we will discuss in Chapter 7, the hints we have from the various detectors around the world are both confusing and contradictory.

Dark energy is even more mysterious. There is no natural candidate that springs from any physics we currently understand. Some of our current hypotheses, such as particle physics supersymmetry, were designed specifically to exclude such an energy. So, perhaps dark matter and dark energy are not real. Perhaps they are phantoms conjured into being by a deeper misunderstanding of the Universe. If so, the standard model will have to be replaced.

Yet none of these concerns were voiced by NASA astrophysicist and Nobel laureate John Mather. On the eve of the ESA press conference, he was quoted by the BBC as saying: ‘I’m hoping there’s something surprising there for them. If they just say, “Well, other people were right” – that’s not exciting; the last decimal places are never very interesting. What we want is some new phenomenon.’

Mather had won the 2006 Nobel Prize in Physics for his work on the microwave background radiation using a NASA spacecraft called COBE, the COsmic Background Explorer. A year later, Time magazine listed him as one of the 100 most influential people in the world. Now, he was in charge of the biggest space mission in the world today, the NASA-led James Webb Space Telescope, with its eye-watering $8 billion price tag. However you looked at it, his opinion carried real weight.

It was a public reflection of an undercurrent I had encountered several times. A number of cosmologists had given me ‘off-the-record’ comments that Planck was a waste of money because WMAP had effectively allowed cosmologists to extract all the really useful information from the microwave background. The implication was clear: more precision would simply confirm what WMAP had already found.

The irony of Mather’s statement was in his dismissal of the ‘last decimal places’. He had shared the Nobel Prize with cosmologist George Smoot for their discovery of the cosmic blueprint, as revealed by the temperature anisotropies in the microwave background. Those anisotropies had been found in the last decimal places it was possible to extract from the data they had been using. The temperature of the gas in the Universe back then was around 3000 °C, whereas the blueprint is encoded in variations that are on average just 20 millionths of a degree from place to place. Yet, from this imperceptible temperature variation had sprung the galaxies, each of which now contained between hundreds of thousands and hundreds of billions of individual stars.

Far from being irrelevant, the last decimal places to which you can measure often contain the most interest, because there you see the hints of what you don’t understand – all those tricky details that remain to be explained. The last decimal places are the reason scientists always want bigger, better, more precise technology.

More and more detailed observations are the bedrock of true science. They tell us what the Universe is actually like, not what a theoretician calculates it should be like on average. And in twenty-four hours, the world would know.

Nerves were on edge when the ESA press conference began. Those who could not attend in person watched via a live stream on the Internet. Twitter was abuzz.

To signal just how important the event was to the agency, the director general of ESA, Jean-Jacques Dordain, spoke first. In sombre tones and broken English, he said that Planck had revealed an ‘almost perfect’ universe. But what did he mean by ‘almost perfect’?

He left it to Professor George Efstathiou, of the University of Cambridge, UK, to explain. One of the world’s foremost cosmologists, Efstathiou once held the same position at Oxford as Edmond Halley, the famous seventeenth-century astronomer.

At the beginning of the press conference, Efstathiou looked tense. His lips were pressed together into a thin line, his shoulders were hunched. When he started talking, the tension disappeared; he seemed at ease and fluent, speaking precisely, almost downbeat. He announced without fanfare that the screen now showed the most precise map of the microwave background that had ever been obtained. It was a gold mine of information, he said, even though ‘it may look a little like a dirty rugby ball or a piece of modern art’.

No one laughed and he ploughed on, assuring the audience that there were cosmologists who would have ‘hacked our computers or maybe even given up their children to get hold of a copy of this map’. Still no one laughed.

He said that the Planck map was incredibly exciting, but instead of saying why, he then gave a lecture on basic cosmology. Almost half an hour into the press conference, nothing new had been said. When he presented the conclusions, they amounted to little more than small tweaks to what was already known: there was about 5 percent ordinary matter instead of 4 percent; the proportion of dark matter to dark energy was a little different; and the Universe was 80 million years older than we thought, making it 13.8 billion years old rather than 13.7 billion. The overall conclusion, he said, was that the standard model of cosmology is an extremely good match to the Planck data.

Watching from my office at home, I was poised at the keyboard to write up the results for Across the Universe, my astronomy blog hosted on the Guardian newspaper’s website, and I was starting to feel anxious. I received an email from a friend, a senior UK science editor, saying, ‘If this is all they are going to say, this is a nightmare.’

Indeed, John Mather’s worst fears were coming true before our eyes.

Then it all changed. Efstathiou said, ‘But there are some issues, and this is why we have described the science results as an almost perfect Universe.’

He began to stumble over his words; he looked down while he was speaking. He reiterated how good the standard model was at fitting the data, and added that he could have simply stopped there and said ‘cosmology is finished’. But rather hesitantly he pushed himself to say, ‘But because we’ve got such good fit to the data [overall], we should examine more critically what doesn’t seem to fit. We have to look at what hasn’t fitted because that is where there may be evidence of new physics.’

At last, the game was afoot. Here were the ‘new phenomena’ that Mather (and the rest of us!) wanted. We were about to step into the unknown.

Efstathiou explained that, on the largest scale of the Universe, the temperature fluctuations were smaller than expected and that such behaviour was impossible in the standard model of cosmology. Also, the average temperature fluctuations on one side of the sky were larger than on the other; again, that was forbidden by the standard model. Finally, as the accompanying press release* confirmed, though Efstathiou did not mention it, the WMAP ‘cold spot’ had been seen again, confirming its existence.

The quality of the detection removed any doubts about the reality of these anomalies. They were all real features of the primordial universe – and they were impossible to understand with standard thinking. There was no tweak that the Planck team had tried that could explain where these features were coming from. The message, according to Efstathiou, was that the Planck data showed ‘cosmology is not finished’.

In February 2015, Chuck Bennett, professor of physics and astronomy at Johns Hopkins University, and colleagues conducted a thorough comparison of the cosmological model derived from WMAP with that from Planck. Worryingly, they found that the two solutions were not consistent with each other: each described a different Universe. Clearly something is amiss somewhere. The two might not have been exactly correct, but they should have been consistent. The error is now under investigation: either one of the data sets has been calibrated incorrectly or the standard model is wrong.

But how can we make progress when the Planck image is just about the very best we can obtain of the microwave anisotropies, our primary source of information about the early Universe?

For all our achievements, do we yet live in an unknown Universe waiting to be explored and understood?

Frankly, Douglas Adams could not have written it any better. It was the world’s ‘42’ moment for real. Most cosmologists thought that the answer to Life, the Universe and Everything (by which I mean the origin of the Universe) would become clear from the Planck data, yet now no one really knew what to make of it.

The majority think that all these little snags are merely the final details to be clarified, a little bit of scientific ‘i’-dotting and ‘t’-crossing, but a growing number think that they are signs that we are completely wrong about the Universe.

It is into those uncharted realms that this book will journey. The search for answers will take us into the most mysterious places in the Universe; it will take us into the hearts of black holes, the moment of the Big Bang, and to a confrontation with the very nature of reality itself.

And it all starts on England’s Great North Road, between London and Cambridge, in the latter decades of the seventeenth century.

This selection is excerpted with permission from Stuart Clark’s The Unknown Universe. Reprinted by arrangement with Pegasus Books. All rights reserved.