
The life of Professor Peter Higgs

PIONEERING SCIENTIST

A SCIENTIST whose prediction of the so-called God Particle brought him worldwide fame has died at the age of 94.

A statement from Edinburgh University, with which the scientist had a strong connection throughout his career, said that Professor Peter Higgs died “peacefully” at his home following a short illness.

The physicist, who had lived in the Scottish capital for more than 50 years, won the highly prestigious Nobel Prize in 2013 after he predicted the existence of an elusive fundamental particle – which would become known as the Higgs boson.

In 1964, he was one of the scientists who first proposed the existence of the sub-atomic particle that gives mass to other particles – and so to planets, stars, and life.

Paying tribute, Professor Sir Peter Mathieson, principal and vice-chancellor of the University of Edinburgh, said: “Peter Higgs was a remarkable individual – a truly gifted scientist whose vision and imagination have enriched our knowledge of the world that surrounds us. His pioneering work has motivated thousands of scientists, and his legacy will continue to inspire many more for generations to come.”

Fellow physicist Professor Brian Cox posted on social media: “I was fortunate enough to meet him several times, and beyond being a famous physicist – I think to his embarrassment at times – he was always charming and modest. His name will be remembered as long as we do physics in the form of the Higgs boson.”

Alan Barr, professor of physics at the University of Oxford, said: “From the mind of Professor Higgs came ideas which have had a profound impact on our understanding of the universe, of matter, and mass.”

He added: “He was also a true gentleman, humble and polite, always giving due credit to others, and gently encouraging future generations of scientists and scholars.”

Sir Ian Blatchford, chief executive of the Science Museum Group, said Professor Higgs was a “brilliant scientist who helped us to understand the fundamental building blocks of our universe”.

Born in Newcastle in 1929, Professor Higgs was the son of a BBC sound engineer.

His family later moved to Bristol where he attended Cotham Grammar School before going on to read theoretical physics at King’s College London. A five-decade-long career began when he graduated with a first-class honours degree in 1950.

Professor Higgs held research fellowships in Edinburgh and London before becoming a lecturer in mathematical physics at the University of Edinburgh in 1960. He wrote his ground-breaking paper after developing the theory while walking in the hills around the city.

In 1980, he became a professor of theoretical physics in Edinburgh, a post he held for 16 years before retiring and assuming the title of emeritus professor.

Outside academia, Professor Higgs married an American linguist, Jody Williamson. The couple had two sons, Christopher and Jonathan, before their divorce in the early 1970s.

The existence of the Higgs boson was confirmed in 2012 using the Large Hadron Collider (LHC) by a team at CERN, the European nuclear research facility in Geneva. Professor Higgs was awarded the Nobel Prize along with the Belgian physicist Francois Englert for their work on the theory of the particle. At the time he said: “It’s very nice to be right sometimes.”

Despite the accolades he received – including more than ten honorary degrees – he said he felt uncomfortable being likened to other Nobel winners such as Albert Einstein.

Professor Higgs turned down the offer of a knighthood from Tony Blair in November 1999 as he did not want any title.

He lived in a small flat in Edinburgh, had no television, and used public transport. In later years, he told of his unease with the attention his achievement garnered, saying he was often bombarded by requests for selfies and could not walk the streets of Edinburgh without being stopped by fans.

The world is indebted to Professor Peter Higgs. May he rest in peace.


Book Review: Our Moon

LITERARY REVIEW

ACCORDING to astronaut Buzz Aldrin, the Moon is mostly composed of greyish dust that smells of extinguished pyrotechnics and makes your eyes water.

Hardly an attractive environment, but we shouldn’t be fooled by first impressions. Were it not for the Moon, there would be no planet Earth as we know it – or, more to the point, we wouldn’t exist.

In exact and poetic prose, science writer and journalist Rebecca Boyle explains how, billions of years ago – the timescale is still approximate – the Moon was formed from the same cosmic debris that made our world. Through its gravitational pull, the Moon was responsible for drawing early fish-like creatures out of the Earth’s oceans and on to the shore.

It was from these that every creeping, crawling thing that inhabits our planet, including ourselves, developed. It is enough to give most of us nightmares.

The Moon is also Earth’s timekeeper. It continues to give us not only our days, but our months, seasons, and years. You may have thought that the Sun was in charge but, as the author explains, it is the pull of the Moon’s gravity on the Earth that holds our planet in place.

Without the Moon stabilising our tilt, at 23.4 degrees, we would wobble wildly and erratically, dramatically affecting our seasons and climate. In such a scenario, our planet would move from no tilt (meaning no seasons) to a large tilt (extreme weather and even ice ages). It is thanks to the Moon that the Earth remains a place that is more or less habitable – at least for now.

Prehistoric people weren’t aware of what went on in outer space, but they had worked out that the lunar cycle – the length of time it takes for the Moon to circle the Earth – governed not only their days but the seasons, too.

One of the most exciting passages in Rebecca Boyle’s book concerns the fairly recent discovery of 10,000-year-old pits dug near Crathes Castle, Aberdeenshire, in Scotland.

They are a sort of inverted or upside-down version of Stonehenge (but 5,000 years older), a Mesolithic lunar timepiece that allowed hunter-gatherers to work out which week in any year the salmon would be leaping in the River Dee, or when red deer might trot over the horizon.

And that is not forgetting its influence on the regular arrival of new Mesolithic babies to be nurtured into the next generation of hunter-gatherers. Though more research needs to be done, it also appears that in communities in Northern Scotland, where there was little natural daylight, women tended to begin their menstrual cycles at the Full Moon.

This meant that they were most fertile at the New Moon, that dark time of the month when early man was less likely to be out hunting and gathering, and more likely to be at home making Stone Age love.

For those interested in testing this phenomenon, it just so happens that yesterday was a New Moon. Even now, in our age of electric light pollution, there is some evidence to suggest that women are still more likely to begin their monthly cycle at the Full Moon.

Boyle also investigates the old story about the links between the full moon and madness – the so-called “lunatic” effect. It turns out there is something in it: a 1990s survey reported that 81 per cent of mental health practitioners have observed a direct correlation between odd behaviour and certain times of the month.

At the very least, many of us find it hard to sleep when there is a full moon, which may well result in the kind of risky behaviour – driving too fast, drinking too much, yelling at annoying strangers – that lands many of us in A & E.

There is also emerging evidence that aneurysms are more likely to rupture at either the Full or New Moon, because it is at these points in the Moon’s 29-day cycle that the Sun, Earth, and Moon are most closely aligned, and the combined gravitational pull is at its strongest.

Given the extraordinary power that the Moon has on our everyday experience here on Earth, it is no wonder that earlier civilisations treated it not as a “withered, sun-seared peach pit”, to quote one early Apollo astronaut who orbited without landing, but as nothing less than a full-blown deity.

Particularly fascinating is the tale of Enheduanna, the Bronze Age high priestess who used hymns to the Moon god to help bind the city-states of Sumer into the world’s first empire.

There have been many books written about the Moon, but Rebecca Boyle’s feels especially timely. As the geo-political balance of our world shifts, the “space race” is being re-run with new players including Japan and India. This time around, however, the aim is not so much patriotic flag planting on the lunar surface, but economic advantage.

The Moon’s soil contains oxygen, silicon, aluminium, and iron, all of which can be refined into valuable things such as fuel, building materials and, ironically, solar panels.

Whichever nation manages to extract and exploit these resources first will hold the balance of power in what is shaping up to be the next Cold War.

Our Moon: A Human History by Rebecca Boyle is published by Sceptre, 336pp


Blood test that can detect Alzheimer’s 15 years before onset

ALZHEIMER’S DISEASE

A SIMPLE blood test can detect Alzheimer’s disease up to 15 years before symptoms begin, a major trial has found. It paves the way for a national screening programme.

The trial found that the test was as accurate as the current gold standard for diagnosing the condition.

For the first time, doctors were able to say whether a person had a high, medium, or low chance of having the disease – ruling out the need for further invasive procedures.

Experts have said it would “revolutionise” diagnosis, making Alzheimer’s as easy to test for and detect as other routine health conditions such as high cholesterol.

Patients could expect results within days of visiting their GP, rather than the years it currently takes to get a diagnosis. This could have huge implications for future treatments, removing the barriers for a diagnosis – such as long waits for spinal taps or brain scans – and speeding up trials.

It could also pave the way for screening over-50s once more effective treatments become available.

Made by the diagnostics company ALZpath, the test was found during the eight-year trials to be 97 per cent accurate at detecting traces of the “tau” protein, which is linked to developing Alzheimer’s disease. These proteins start to build up in the brain 10 to 15 years before symptoms start showing.

Researchers in Sweden found that high levels of the “tau” protein in the blood test corresponded to high levels of Alzheimer’s markers seen in expensive diagnostic brain scans and painful lumbar punctures.

In the tests, which involved 786 people, the more of this leaked “tau” brain protein there was in the blood, the more likely – or more advanced – the Alzheimer’s disease. Growing evidence suggests biomarker changes like these can be detected in the blood years before other signs of the disease appear in the brain.

It means if scientists can find a way to stop these protein levels from rising, they could effectively halt Alzheimer’s in its tracks.

With breakthrough treatments such as donanemab and lecanemab on the horizon, experts say it is vital to have quick and reliable diagnoses. Professor David Curtis, of University College London’s Genetics Institute, said this was “one half of the solution” while we await effective treatments.

He added: “This potentially could have huge implications. Everybody over 50 could be routinely screened every few years, in much the same way as they are now screened for high cholesterol.”

Around 900,000 people in the UK live with dementia – with Alzheimer’s the most common form. As the population ages, numbers are expected to rise to 1.6 million by 2040, making a cheap screening tool vital to getting to grips with the challenge.

Alzheimer’s Research UK analysis found 74,261 people died from dementia in 2022 compared with 69,178 a year earlier, making it the country’s biggest killer. While previous blood tests have shown promise, these findings have caused particular excitement given the high accuracy levels, large study size, and because the test already exists commercially.

It is also the first time a blood test has been found to be at least as good as a painful lumbar puncture or spinal tap for detecting elevated levels of the tau protein, according to the research team at the University of Gothenburg, Sweden.

Lumbar punctures involve drawing fluid from around the patient’s spinal cord. The inexpensive tests – priced at around £150 – could also be used to monitor a patient’s condition, allowing more tailored trials or treatment in future.

Dr Richard Oakley, of the Alzheimer’s Society, cautioned that more research would be needed, but said: “This study is a huge welcome step in the right direction as it shows that blood tests can be just as accurate as more invasive and expensive tests.

“It suggests results from these tests could be clear enough to not require follow-up investigations for some people living with Alzheimer’s disease, which could speed up diagnosis.”

The tests would need regulatory approval before widespread use. But they could form part of NHS trials starting imminently, which aim to roll out blood tests for Alzheimer’s within the next five years.

The scientists’ findings were first published in JAMA Neurology.
