Astronomer Royal Lord Martin Rees argues for the need for rigorous, scientific minds in our complex, rapidly changing world
By Martin Rees
I worry that many who would enjoy a scientific career are put off by a narrow and outdated conception of what’s involved. The word ‘scientist’ still conjures up an unworldly image of an Einstein lookalike (male and elderly) or else a youthful geek. There’s still too little racial and gender diversity among scientists.
And of course that was even truer in the past, when science was only open to the privileged few. But there’s huge variety in the intellectual and social styles of work the sciences involve.
They require speculative theorists, lone experimenters, ecologists gathering data in the field and quasi-industrial teams working on giant particle accelerators or big space projects.
The Royal Society was founded in 1660 by Christopher Wren, Robert Hooke, Samuel Pepys and other ‘ingenious and curious gentlemen’ (as they described themselves). Their motto was to accept nothing on authority. They did experiments, and they peered through newly invented telescopes and microscopes; they dissected weird animals. One experiment involved the transfusion of blood from a sheep to a man (who survived).
There was an intermingling between the sciences – especially, in the 18th century, the fruits of exploration by James Cook, Joseph Banks and others – and the creativity of poets such as Samuel Taylor Coleridge and Percy Bysshe Shelley. There was no split between ‘two cultures’, but instead boisterous interactions between scientists, literati and explorers.
This burgeoning interest led, in the early 19th century, to the foundation of other ‘learned societies’. Some were specialised – such as the Linnean Society and the Royal Geographical Society – but one of them, the Royal Institution, founded in 1799, genuinely rivalled the Royal Society. A hyper-talented but roguish adventurer, Count Rumford, donated sufficient funds to provide a fine building in Albemarle Street in central London.
It was fortunate in the calibre of its first two directors, Humphry Davy and Michael Faraday; both were outstanding scientists, but they also promoted ‘outreach’, mainly via weekly ‘discourses’ – prototypes of the current ‘popular science’ scene.
Of course, in the early 1800s, some technologies and crafts were already professionalised, having incrementally advanced over several earlier centuries: cathedrals, ships, clocks and bridges were built with a sophistication and elegance that still amazes us today. Steam engines were improved without formal input from the subject of ‘thermodynamics’. Far more intellectual effort was deployed on these ‘useful arts’ by hugely skilled craftsmen than on science for its own sake. Even the most capable medical practitioners – dependent on the dexterity of surgeons and the potions of apothecaries – could achieve little in an era before the invention of anaesthetics or the emergence of the ‘germ theory’ of disease.
Not until the mid-19th century did ‘curiosity-driven’ science become an established profession – one whose spin-offs have shaped the modern world, and whose growth led to the emergence of university labs and research institutes. Scientific understanding has vastly increased; the frontiers are more extended and harder to reach. This has led to greater specialisation – a ‘Balkanisation’ of the map of learning.
Scientists don’t fall into a single mould. Newton’s mental powers seem to have been really ‘off-scale’, and his concentration was as exceptional as his intellect; when asked how he cracked such deep problems, he said: ‘By thinking on them continually.’ He was solitary and reclusive when young; vain and vindictive in his later years. Darwin, by contrast, was an agreeable and sympathetic personality, and modest in his self-assessment. ‘I have a fair share of invention,’ he wrote in his autobiography, ‘and of common sense or judgment, such as every fairly successful lawyer or doctor must have, but not, I believe, in any higher degree.’
Darwin reminds us that the thought processes of most scientists are not intrinsically different from those of other professionals – nor indeed from those of a detective assessing the evidence at a crime scene. It’s simplistic to refer to ‘the scientific method’: the methodology varies widely depending on the topic; different topics demand diverse styles of thinking, and each attracts different personality types.
Some scientists see themselves as intellectuals, others as technocrats.
Science is a ‘work in progress’. Some theories are supported by overwhelming evidence; others are provisional and tentative. But however confident we may be in a theory, we should keep our minds ajar to the possibility that some intellectual revolution will offer a drastically different perspective.
Scientists tend to be severe critics of other people’s work. Those who overturn consensus to contribute something unexpected and original garner the greatest esteem. But scientists should be equally critical of their own work: they must not become too enamoured of ‘pet’ theories, or be influenced by wishful thinking.
For many of us, that’s a challenge.
Anyone who has invested years of their life in a project inevitably becomes strongly committed to its importance, to the extent that it’s a traumatic wrench if the whole effort comes to nought.
Ideas that are initially tentative sometimes firm up after intense scrutiny and criticism – for instance, the link between smoking and lung cancer, or between HIV and AIDS. Such scrutiny and criticism are also how seductive theories are destroyed by harsh facts.
Noisy controversy on a scientific topic doesn’t mean the arguments are evenly balanced. Yet broad and open debate is undoubtedly the best route to clarity. That’s how science advances. Benign developments in communications mean that more people, worldwide, can participate in science – in particular, the best scientific journalists and bloggers are plugged into an extensive network and can help to calibrate controversies in climate science, and claims about (for instance) cold fusion or the dangers of vaccination.
The path towards a consensual scientific understanding can be winding, with many blind alleys explored along the way. Occasionally a maverick is vindicated. We all enjoy seeing this happen – but such instances are rarer than the popular press would have us believe. Sometimes a prior consensus is overturned, but most advances transcend and generalise the concepts that went before, rather than contradicting them. Einstein didn’t ‘overthrow’ Newton, for instance. He transcended Newton, offering a new perspective with broader scope and deeper insights into space, time and gravity.
What about ideas ‘beyond the fringe’? As an astronomer, I haven’t found it fruitful to engage in dialogue with astrologers or creationists. We don’t share the same methods, nor play by the same evidence-based rules. We shouldn’t let a craving for certainty – for the easy answers that science can seldom provide – drive us towards the illusory comfort and reassurance that these pseudo-sciences appear to offer.
If you ask a scientist what they’re working on, they will rarely say that they are ‘trying to understand the universe’ or ‘curing cancer’. Their normal response is something very narrow and specific. They realise that the big questions are important, but they must be tackled in a step-by-step way.
Only cranks or geniuses try to solve the big questions in one go; the rest of us tackle a problem that’s bite-size and hope to make incremental progress that way.
Scholars in the humanities can fail to appreciate the genuine ‘creativity’ and imagination that the practice of science involves. But it can’t be denied that there are differences.
An artist’s work may not last, but it’s individual and distinctive. In contrast, even modest scientific advances are durable – adding to the accumulation of knowledge – even though their originators may be forgotten. If A didn’t discover something, in general B soon would; indeed there are many cases of near-simultaneous discovery.
That’s why scientists regard ‘priority’ as crucial: this is not just a feature of careerism in an overcrowded field. The great Newton wrote down an insight in code, to defend his priority without revealing his work to his rival Leibniz.
Here there is indeed a contrast with the arts. As the great immunologist Peter Medawar remarked in an essay, when Wagner diverted his energies for ten years in the middle of the Ring cycle, to compose Meistersinger and Tristan, the composer wasn’t worried that someone would scoop him on Götterdämmerung.
Society now confronts difficult questions such as: Who should access the ‘readout’ of our personal genetic code? How will lengthening lifespans affect society? Should we build nuclear power stations – or wind farms – to keep the lights on without triggering dangerous climate change? Should we plant GM crops? Should the law allow ‘designer babies’ or cognition-enhancing drugs? Can we control AI?
These issues cannot be addressed without deploying scientific expertise. Moreover, no scientist should be indifferent to the fruits of their ideas – their creations. They should try to foster benign spin-offs, commercial or otherwise. They should resist, so far as they can, dubious or threatening applications of their work, and alert the public and politicians to perceived dangers.
Science is a global culture, able to cross barriers of nationality and faith, and many of the social and environmental challenges confronting us must be tackled globally.
Yet if public discussion is to rise above mere sloganeering, everyone needs enough of a ‘feel’ for science to avoid becoming bamboozled by propaganda and bad statistics. The need for proper debate will become more acute in the future, as the pressures on the environment and from misdirected technology grow more diverse and threatening.