Reports of the National Center for Science Education (RNCSE)
Science, Delusion and the Appetite for Wonder
You could give Aristotle a tutorial. And you could thrill him to the core of his being. Aristotle was an encyclopedic polymath, an all-time intellect. Yet not only can you know more than he did about the world; you can also have a deeper understanding of how everything works. Such is the privilege of living after Newton, Darwin, Einstein, Planck, Watson, Crick and their colleagues.
I'm not saying you're more intelligent than Aristotle, or wiser. For all I know, Aristotle's the cleverest person who ever lived. That's not the point. The point is only that science is cumulative, and we live later.
Aristotle had a lot to say about astronomy, biology and physics. But his views sound weirdly naive today. Not so, however, as soon as we move away from science. Aristotle could walk straight into a modern seminar on ethics, theology, political or moral philosophy, and contribute. But let him walk into a modern science class and he'd be a lost soul. Not because of the jargon, but because science advances, cumulatively.
Here's a small sample of the things you could tell Aristotle, or any other Greek philosopher. And surprise and enthrall them, not just with the facts themselves but with how they hang together so elegantly.
The earth is not the center of the universe. It orbits the sun — which is just another star. There is no music of the spheres, but the chemical elements, from which all matter is made, arrange themselves cyclically, in something like octaves. There are not four elements but about 100. Earth, air, fire and water are not among them.
Living species are not isolated types with unchanging essences. Instead, over a time scale too long for humans to imagine, they split and diverge into new species, which then go on diverging further and further. For the first half of geological time our ancestors were bacteria. Most creatures still are bacteria, and each one of our trillions of cells is a colony of bacteria. Aristotle was a distant cousin to a squid, a closer cousin to a monkey, a closer cousin still to an ape (strictly speaking, Aristotle was an ape, an African ape, a closer cousin to a chimpanzee than a chimp is to an orangutan).
The brain is not for cooling the blood. It's what you use to do your logic and your metaphysics. It's a three-dimensional maze of a million million nerve cells, each one drawn out like a wire to carry pulsed messages. If you laid all your brain cells end to end, they'd stretch round the world 25 times. There are about 4 million million connections in the tiny brain of a chaffinch, proportionately more in ours.
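The arithmetic behind that wiring claim can be checked in a few lines. This is a back-of-envelope sketch, not from the lecture itself; it assumes Earth's circumference of roughly 40,000 km and reads "a million million" as 10^12 cells.

```python
# Back-of-envelope check of the brain-wiring claim: if a million million
# nerve cells, laid end to end, stretch round the world 25 times, how long
# is each cell's "wire" on average?
# Assumptions (mine, not the text's): Earth's circumference ~40,000 km.

earth_circumference_km = 40_000
times_round = 25
n_cells = 1e12                     # "a million million"

total_wire_km = earth_circumference_km * times_round    # total wiring, km
avg_per_cell_mm = total_wire_km * 1e6 / n_cells         # km -> mm, per cell

print(f"Total wiring: {total_wire_km:,} km")
print(f"Implied average length per cell: {avg_per_cell_mm:.1f} mm")
```

A millimetre or so of wire per cell on average, which is at least the right order of magnitude for neurons.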
Now, if you're anything like me, you'll have mixed feelings about that recitation. On the one hand, pride in what Aristotle's species now knows and didn't then. On the other hand an uneasy feeling of, "Isn't it all a bit complacent? What about our descendants, what will they be able to tell us?"
Yes, for sure, the process of accumulation doesn't stop with us. Two thousand years hence, ordinary people who have read a couple of books will be in a position to give a tutorial to today's Aristotles: to Francis Crick, say, or Stephen Hawking. So does this mean that our view of the universe will turn out to be just as wrong?
Let's keep a sense of proportion about this! Yes, there's much that we still don't know. But surely our belief that the earth is round and not flat, and that it orbits the sun, will never be superseded. That alone is enough to confound those, endowed with a little philosophical learning, who deny the very possibility of objective truth: those so-called relativists who see no reason to prefer scientific views over aboriginal myths about the world.
Our belief that we share ancestors with chimpanzees, and more distant ancestors with monkeys, will never be superseded although details of timing may change. Many of our ideas, on the other hand, are still best seen as theories or models whose predictions, so far, have survived the test. Physicists disagree over whether they are condemned forever to dig for deeper mysteries, or whether physics itself will come to an end in a final 'theory of everything', a nirvana of knowledge. Meanwhile, there is so much that we don't yet understand, we should loudly proclaim those things that we do, so as to focus attention on problems that we should be working on.
Far from being over-confident, many scientists believe that science advances only by disproof of its hypotheses. Konrad Lorenz said he hoped to disprove at least one of his own hypotheses every day before breakfast. That was absurd, especially coming from the grand old man of the science of ethology, but it is true that scientists, more than others, impress their peers by admitting their mistakes.
A formative influence on my undergraduate self was the response of a respected elder statesman of the Oxford Zoology Department when an American visitor had just publicly disproved his favorite theory. The old man strode to the front of the lecture hall, shook the American warmly by the hand and declared in ringing, emotional tones: "My dear fellow, I wish to thank you. I have been wrong these fifteen years." And we clapped our hands red. Can you imagine a Government Minister being cheered in the House of Commons for a similar admission? "Resign, Resign" is a much more likely response!
Yet there is hostility towards science. And not just from the green ink underlining brigade, but from published novelists and newspaper columnists. Newspaper columns are notoriously ephemeral, but their drip-drip, week-after-week or day-after-day repetition gives them influence and power, and we have to notice them. A peculiar feature of the British press is the regularity with which some of its leading columnists return to attack science — and not always from a vantage point of knowledge. A few weeks ago, Bernard Levin's effusion in The Times was entitled "God, me and Dr. Dawkins" and it had the subtitle: "Scientists don't know and nor do I — but at least I know I don't know".
It is no mean task to plumb the full depths of what Mr. Bernard Levin does not know, but here's an illustration of the gusto with which he boasts of it.
Despite their access to copious research funds, today's scientists have yet to prove that a quark is worth a bag of beans. The quarks are coming! The quarks are coming! Run for your lives . . .! Yes, I know I shouldn't jeer at science, noble science, which, after all, gave us mobile telephones, collapsible umbrellas and multi-striped toothpaste, but science really does ask for it . . . Now I must be serious. Can you eat quarks? Can you spread them on your bed when the cold weather comes?
It doesn't deserve a reply, but the distinguished Cambridge scientist Sir Alan Cottrell wrote a brief letter to the editor: "Sir: Mr. Bernard Levin asks 'Can you eat quarks?' I estimate that he eats 500,000,000,000,000,000,000 quarks a day."
It has become almost a cliché to remark that nobody boasts of ignorance of literature, but it is socially acceptable to boast of ignorance of science and proudly claim incompetence in mathematics. In Britain, that is. I believe the same is not true of our more successful economic competitors, Germany, the United States and Japan.
People certainly blame science for nuclear weapons and similar horrors. It's been said before but needs to be said again: if you want to do evil, science provides the most powerful weapons to do evil; but equally, if you want to do good, science puts into your hands the most powerful tools to do so. The trick is to want the right things, then science will provide you with the most effective methods of achieving them.
An equally common accusation is that science goes beyond its remit. It's accused of a grasping take-over bid for territory that properly belongs to other disciplines such as theology. On the other hand — you can't win! — listen to the novelist Fay Weldon's hymn of hate against 'the scientists' in The Daily Telegraph.
Don't expect us to like you. You promised us too much and failed to deliver. You never even tried to answer the questions we all asked when we were six. Where did Aunt Maud go when she died? Where was she before she was born? . . . And who cares about half a second after the Big Bang; what about half a second before? And what about crop circles?
More than some of my colleagues, I am perfectly happy to give a simple and direct answer to both those Aunt Maud questions. But I'd certainly be called arrogant and presumptuous, going beyond the limits of science.
Then there's the view that science is dull and plodding, with rows of pens in its top pocket. Here's another newspaper columnist, A A Gill, writing on science this year in The Sunday Times.
Science is constrained by experiment results and the tedious, plodding stepping stones of empiricism . . . What appears on television just is more exciting than what goes on in the back of it . . . That's art, luvvie: theater, magic, fairy dust, imagination, lights, music, applause, my public. There are stars and there are stars, darling. Some are dull, repetitive squiggles on paper, and some are fabulous, witty, thought-provoking, incredibly popular . . .
The 'dull, repetitive squiggles' is a reference to the discovery of pulsars in 1967, by Jocelyn Bell and Anthony Hewish. Jocelyn Bell Burnell had recounted on television the spine-tingling moment when, a young woman on the threshold of a career, she first knew she was in the presence of something hitherto unheard-of in the universe. Not something new under the sun, but a whole new kind of sun, one that rotates so fast that, instead of taking 24 hours like our planet, it takes a quarter of a second. Darling, how too plodding, how madly empirical my dear!
Could science just be too difficult for some people, and therefore seem threatening? Oddly enough, I wouldn't dare to make such a suggestion, but I am happy to quote a distinguished literary scholar, John Carey, the present Merton Professor of English at Oxford:
The annual hordes competing for places on arts courses in British universities, and the trickle of science applicants, testify to the abandonment of science among the young. Though most academics are wary of saying it straight out, the general consensus seems to be that arts courses are popular because they are easier, and that most arts students would simply not be up to the intellectual demands of a science course.
My own view is that the sciences can be intellectually demanding, but so can classics, so can history, so can philosophy. On the other hand, nobody should have trouble understanding things like the circulation of the blood and the heart's role in pumping it round. Carey quoted Donne's lines to a class of 30 undergraduates in their final year reading English at Oxford:
"Know'st thou how blood, which to the heart doth flow,
Doth from one ventricle to another go?"
Carey asked them how, as a matter of fact, the blood does flow. None of the thirty could answer, and one tentatively guessed that it might be 'by osmosis'. The truth — that the blood is pumped from ventricle to ventricle through at least 50 miles of intricately dissected capillary vessels throughout the body — should fascinate any true literary scholar. And unlike, say, quantum theory or relativity, it isn't hard to understand. So I tender a more charitable view than Professor Carey. I wonder whether some of these young people might have been positively turned off science.
Last month I had a letter from a television viewer who poignantly began: "I am a clarinet teacher whose only memory of science at school was a long period of studying the Bunsen burner." Now, you can enjoy the Mozart concerto without being able to play the clarinet. You can be a discerning and informed concert critic without being able to play a note. Of course music would come to a halt if nobody learned to play it. But if everybody left school thinking you had to play an instrument before you could appreciate music, think how impoverished many lives would be.
Couldn't we treat science in the same way? Yes, we must have Bunsen burners and dissecting needles for those drawn to advanced scientific practice. But perhaps the rest of us could have separate classes in science appreciation, the wonder of science, scientific ways of thinking, and the history of scientific ideas, rather than laboratory experience.
It's here that I'd seek rapprochement with another apparent foe of science, Simon Jenkins, former editor of The Times and a much more formidable adversary than the other journalists I've quoted, because he has some knowledge of what he is talking about. He resents compulsory science education and he holds the idiosyncratic view that it isn't useful. But he is thoroughly sound on the uplifting qualities of science. In a recorded conversation with me, he said:
I can think of very few science books I've read that I've called useful. What they've been is wonderful. They've actually made me feel that the world around me is a much fuller . . . much more awesome place than I ever realized it was . . . I think that science has got a wonderful story to tell. But it isn't useful. It's not useful like a course in business studies or law is useful, or even a course in politics and economics.
Far from science not being useful, my worry is that it is so useful as to overshadow and distract from its inspirational and cultural value. Usually even its sternest critics concede the usefulness of science, while completely missing the wonder. Science is often said to undermine our humanity, or destroy the mystery on which poetry is thought to thrive. Keats berated Newton for destroying the poetry of the rainbow.
Philosophy will clip an Angel's wings,
Keats was, of course, a very young man.
Blake, too, lamented:
For Bacon and Newton, sheath'd in dismal steel, their terrors hang
I wish I could meet Keats or Blake to persuade them that mysteries don't lose their poetry because they are solved. Quite the contrary. The solution often turns out more beautiful than the puzzle, and anyway the solution uncovers deeper mystery. The rainbow's dissection into light of different wavelengths leads on to Maxwell's equations, and eventually to special relativity.
Einstein himself was openly ruled by an aesthetic scientific muse: "The most beautiful thing we can experience is the mysterious. It is the source of all true art and science", he said. It's hard to find a modern particle physicist who doesn't own to some such aesthetic motivation. Typical is John Wheeler, one of the distinguished elder statesmen of American physics today:
[W]e will grasp the central idea of it all as so simple, so beautiful, so compelling that we will all say each to the other, 'Oh, how could it have been otherwise! How could we all have been so blind for so long!'
Wordsworth might have understood this better than his fellow romantics. He looked forward to a time when scientific discoveries would become "proper objects of the poet's art". And, at the painter Benjamin Haydon's dinner of 1817, he endeared himself to scientists, and endured the taunts of Keats and Charles Lamb, by refusing to join in their toast: "Confusion to mathematics and Newton".
Now, here's an apparent confusion: T H Huxley saw science as "nothing but trained and organized common sense", while Professor Lewis Wolpert insists that it's deeply paradoxical and surprising, an affront to commonsense rather than an extension of it. Every time you drink a glass of water, you are probably imbibing at least one atom that passed through the bladder of Aristotle. A tantalizingly surprising result, but it follows by Huxley-style organized common sense from Wolpert's observation that "there are many more molecules in a glass of water than there are glasses of water in the sea".
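The Aristotle's-bladder claim is exactly the kind of result that Huxley-style organized common sense can reach from Wolpert's observation. Here is a rough Fermi sketch of the arithmetic; the figures (a 250 ml glass, an ocean of about 1.3 billion cubic kilometres, and complete mixing over two millennia) are my illustrative assumptions, not the lecture's.

```python
# Fermi estimate: molecules in a glass of water vs. glasses of water in
# the sea. Assumptions: 250 ml glass, ocean volume ~1.3e9 km^3, and that
# any glassful drunk long ago is now mixed evenly through the oceans.

AVOGADRO = 6.022e23          # molecules per mole
glass_ml = 250.0
grams_per_mole = 18.0        # molar mass of water; 1 ml of water ~ 1 g

molecules_per_glass = (glass_ml / grams_per_mole) * AVOGADRO   # ~8.4e24
ocean_litres = 1.3e9 * 1e12            # 1 km^3 = 1e12 litres
glasses_in_sea = ocean_litres / (glass_ml / 1000.0)            # ~5.2e21

# Wolpert's observation: far more molecules in a glass than glasses in
# the sea. So each ancient glassful contributes, on average, this many
# molecules to the glass you drink today:
ratio = molecules_per_glass / glasses_in_sea
print(f"Molecules from one ancient glassful per modern glass: ~{ratio:,.0f}")
```

On these assumptions each glass Aristotle ever drank leaves a thousand or so molecules in yours, which is why the "at least one atom" version is so safe.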
Science runs the gamut from the tantalisingly surprising to the deeply strange, and ideas don't come any stranger than Quantum Mechanics. More than one physicist has said something like: "If you think you understand quantum theory, you don't understand quantum theory."
There is mystery in the universe, beguiling mystery, but it isn't capricious, whimsical, frivolous in its changeability. The universe is an orderly place and, at a deep level, regions of it behave like other regions, times behave like other times. If you put a brick on a table it stays there unless something lawfully moves it, even if you meanwhile forget it's there. Poltergeists and sprites don't intervene and hurl it about for reasons of mischief or caprice. There is mystery but not magic, strangeness beyond the wildest imagining, but no spells or witchery, no arbitrary miracles.
Even science fiction, though it may tinker with the laws of nature, can't abolish lawfulness itself and remain good science fiction. Young women don't take off their clothes and spontaneously morph themselves into wolves. A recent television drama is fairytale rather than science fiction, for this reason. It falls foul of a theoretical prohibition much deeper than the philosopher's "All swans are white — until a black one turns up" inductive reasoning. We know people can't metamorphose into wolves, not because the phenomenon has never been observed — plenty of things happen for the first time — but because werewolves would violate the equivalent of the second law of thermodynamics. Of this, Sir Arthur Eddington said:
If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations — then so much the worse for Maxwell's equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
To pursue the relationship between werewolves and entropy would take me too far afield. But, since this lecture commemorates a man whose integrity and honesty as a broadcaster is still an abiding legend 30 years after his death, I'll stay for a moment with the current epidemic of paranormal propaganda on television.
In one popular type of programming, conjurers come on and do routine tricks. But instead of admitting that they are conjurers, these television performers claim genuinely supernatural powers. In this they are abetted by prestigious, even knighted, presenters, people whom we have got into the habit of trusting, broadcasters who have become role models. It is an abuse of what might be called the Richard Dimbleby Effect.
In other programs, disturbed people recount their fantasies of ghosts and poltergeists. But instead of sending them off to a kindly psychiatrist, television producers eagerly hire actors to re-create their delusions — with predictable effects on the credulity of large audiences.
Recently, a faith healer was given half an hour of free prime time television, to advertise his bizarre claim to be a 2,000-year-dead physician called Paul of Judea. Some might call this entertainment, comedy even, though others would find it objectionable entertainment, like a fairground freak show.
Now I obviously have to return to the arrogance problem. How can I be so sure that this ordinary Englishman with an unlikely foreign accent was not the long dead Paul of Judea? How do I know that astrology doesn't work? How can I be so confident that the television 'supernaturalists' are ordinary conjurers, just because ordinary conjurers can replicate their tricks? (Spoon-bending, by the way, is so routine a trick that the American conjurers Penn and Teller have posted instructions for doing it on the Internet! See http://www.randi.org/jr/ptspoon.html.)
It really comes down to parsimony, economy of explanation. It is possible that your car engine is driven by psychokinetic energy, but if it looks like a petrol engine, smells like a petrol engine and performs exactly as well as a petrol engine, the sensible working hypothesis is that it is a petrol engine. Telepathy and possession by the spirits of the dead are not ruled out as a matter of principle. There is certainly nothing impossible about abduction by aliens in UFOs. One day it may happen. But on grounds of probability it should be kept as an explanation of last resort. It is unparsimonious, demanding more than routinely weak evidence before we should believe it. If you hear hooves clip-clopping down a London street, it could be a zebra or even a unicorn, but, before we assume that it's anything other than a horse, we should demand a certain minimal standard of evidence.
It's been suggested that if the supernaturalists really had the powers they claim, they'd win the lottery every week. I prefer to point out that they could also win a Nobel Prize for discovering fundamental physical forces hitherto unknown to science. Either way, why are they wasting their talents doing party turns on television?
By all means let's be open-minded, but not so open-minded that our brains drop out. I'm not asking for all such programmes to be suppressed, merely that the audience should be encouraged to be critical. In the case of the psychokineticists and thought-readers, it would be good entertainment to invite studio audiences to suggest critical tests, which only genuine psychics, but not ordinary conjurers, could pass. It would make a good, entertaining form of quiz show.
How do we account for the current paranormal vogue in the popular media? Perhaps it has something to do with the millennium — in which case it's depressing to realize that the millennium is still three years away. Less portentously, it may be an attempt to cash in on the success of "The X-Files". This is fiction and therefore defensible as pure entertainment.
A fair defense, you might think. But soap operas, cop series and the like are justly criticized if, week after week, they ram home the same prejudice or bias. Each week "The X-Files" poses a mystery and offers two rival kinds of explanation, the rational theory and the paranormal theory. And, week after week, the rational explanation loses. But it is only fiction, a bit of fun, why get so hot under the collar?
Imagine a crime series in which, every week, there is a white suspect and a black suspect. And every week, lo and behold, the black one turns out to have done it. Unpardonable, of course. And my point is that you could not defend it by saying: "But it's only fiction, only entertainment".
Let's not go back to a dark age of superstition and unreason, a world in which every time you lose your keys you suspect poltergeists, demons or alien abduction. Enough, let me turn to happier matters.
The popularity of the paranormal, oddly enough, might even be grounds for encouragement. I think that the appetite for mystery, the enthusiasm for that which we do not understand, is healthy and to be fostered. It is the same appetite which drives the best of true science, and it is an appetite which true science is best qualified to satisfy. Perhaps it is this appetite that underlies the ratings success of the paranormalists.
I believe that astrologers, for instance, are playing on — misusing, abusing — our sense of wonder. I mean when they hijack the constellations, and employ sub-poetic language like the moon moving into the fifth house of Aquarius. Real astronomy is the rightful proprietor of the stars and their wonder. Astrology gets in the way, even subverts and debauches the wonder.
To show how real astronomical wonder can be presented to children, I'll borrow from a book called Earthsearch by John Cassidy, which I brought back from America to show my daughter Juliet. Find a large open space and take a soccer ball to represent the sun. Put the ball down and walk ten paces in a straight line. Stick a pin in the ground. The head of the pin stands for the planet Mercury. Take another 9 paces beyond Mercury and put down a peppercorn to represent Venus. Seven paces on, drop another peppercorn for Earth. One inch away from earth, another pinhead represents the Moon, the furthest place, remember, that we've so far reached. 14 more paces to little Mars, then 95 paces to giant Jupiter, a ping-pong ball. 112 paces further, Saturn is a marble. No time to deal with the outer planets except to say that the distances are much larger. But, how far would you have to walk to reach the nearest star, Proxima Centauri? Pick up another soccer ball to represent it, and set off for a walk of 4200 miles. As for the nearest other galaxy, Andromeda, don't even think about it!
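The paces in Cassidy's model follow from a single scale factor, and it is easy to check them. This sketch assumes a soccer ball of about 22 cm diameter and a pace of about 0.9 m; both figures are my assumptions, not the book's.

```python
# Checking the Earthsearch scale model: shrink the Sun (diameter ~1.39e6 km)
# to a ~22 cm soccer ball, then see where Earth and Proxima Centauri land.
# Assumed pace length: 0.9 m.

SUN_DIAMETER_KM = 1.39e6
ball_diameter_m = 0.22
scale = ball_diameter_m / (SUN_DIAMETER_KM * 1000)   # model metres per real metre

def model_metres(real_km):
    """Distance in the scale model, in metres, for a real distance in km."""
    return real_km * 1000 * scale

earth_m = model_metres(1.496e8)      # 1 astronomical unit
proxima_m = model_metres(4.0e13)     # ~4.2 light-years

print(f"Earth: {earth_m:.0f} m (~{earth_m / 0.9:.0f} paces)")
print(f"Proxima Centauri: {proxima_m / 1000:.0f} km")
```

Earth comes out around two dozen metres from the ball, matching the 10 + 9 + 7 paces of the text, and the nearest star lands thousands of kilometres away, in the same range as the walk of 4200 miles.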
Who'd go back to astrology when they've sampled the real thing — astronomy, Yeats's "starry ways", his "lonely, majestical multitude"? The same lovely poem encourages us to "Remember the wisdom out of the old days" and I want to end with a little piece of wonder from my own territory of evolution.
You contain a trillion copies of a large, textual document written in a highly accurate, digital code, each copy as voluminous as a substantial book. I'm talking, of course, of the DNA in your cells. Textbooks describe DNA as a blueprint for a body. It's better seen as a recipe for making a body, because it is irreversible. But today I want to present it as something different again, and even more intriguing. The DNA in you is a coded description of ancient worlds in which your ancestors lived. DNA is the wisdom out of the old days, and I mean very old days indeed.
The oldest human documents go back a few thousand years, originally written in pictures. Alphabets seem to have been invented about 35 centuries ago in the Middle East, and they've changed and spawned numerous varieties of alphabet since then. The DNA alphabet arose at least 35 million centuries ago. Since that time, it hasn't changed one jot. Not just the alphabet, the dictionary of 64 basic words and their meanings is the same in modern bacteria and in us. Yet the common ancestor from whom we both inherited this precise and accurate dictionary lived at least 35 million centuries ago.
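Why exactly 64 words? Because the dictionary consists of every three-letter word spellable from a four-letter alphabet, and 4 × 4 × 4 = 64. A minimal illustration:

```python
# The 64 "basic words" of the genetic code: all three-letter combinations
# of the four-letter DNA alphabet.
from itertools import product

DNA_ALPHABET = "ACGT"
codons = ["".join(triplet) for triplet in product(DNA_ALPHABET, repeat=3)]

print(len(codons))     # 64, i.e. 4**3
print(codons[:4])      # the first few: AAA, AAC, AAG, AAT
```

The same 64 codons, with essentially the same meanings, are read by bacteria and by us.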
What changes is the long programs that natural selection has written using those 64 basic words. The messages that have come down to us are the ones that have survived millions, in some cases hundreds of millions, of generations. For every successful message that has reached the present, countless failures have fallen away like the chippings on a sculptor's floor. That's what Darwinian natural selection means. We are the descendants of a tiny élite of successful ancestors. Our DNA has proved itself successful, because it is here. Geological time has carved and sculpted our DNA to survive down to the present.
There are perhaps 30 million distinct species in the world today. So, there are 30 million distinct ways of making a living, ways of working to pass DNA on to the future. Some do it in the sea, some on land. Some up trees, some underground. Some are plants, using solar panels — we call them leaves — to trap energy. Some eat the plants. Some eat the herbivores. Some are big carnivores that eat the small ones. Some live as parasites inside other bodies. Some live in hot springs. One species of small worms is said to live entirely inside German beer mats. All these different ways of making a living are just different tactics for passing on DNA. The differences are in the details.
The DNA of a camel was once in the sea, but it hasn't been there for a good 300 million years. It has spent most of recent geological history in deserts, programming bodies to withstand dust and conserve water. Like sandbluffs carved into fantastic shapes by the desert winds, camel DNA has been sculpted by survival in ancient deserts to yield modern camels.
At every stage of its geological apprenticeship, the DNA of a species has been honed and whittled, carved and rejigged by selection in a succession of environments. If only we could read the language, the DNA of tuna and starfish would have 'sea' written into the text. The DNA of moles and earthworms would spell 'underground'. Of course all the DNA would spell many other things as well. Shark and cheetah DNA would spell 'hunt', as well as separate messages about sea and land.
We can't read these messages yet. Maybe we never shall, for their language is indirect, as befits a recipe rather than a reversible blueprint. But it's still true that our DNA is a coded description of the worlds in which our ancestors survived. We are walking archives of the African Pliocene, even of Devonian seas, walking repositories of wisdom out of the old days. You could spend a lifetime reading such messages and die unsated by the wonder of it.
We are going to die, and that makes us the lucky ones. Most people are never going to die because they are never going to be born. The potential people who could have been standing in my place but who will never see the light of day outnumber the sand grains of the Sahara — more, the atoms in the universe. Certainly those unborn ghosts include greater poets than Donne, greater scientists than Newton, greater composers than Beethoven. We know this because the set of possible people allowed by our DNA so massively outnumbers the set of actual people. In the teeth of these stupefying odds it is you and I that are privileged to be here, privileged with eyes to see where we are and brains to wonder why.
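How massively do the possible outnumber the actual? A crude sketch, on my own idealized assumptions (a genome of roughly 3 billion bases, each free to be any of 4 letters; real genomes are far more constrained, but even a minute fraction of this number dwarfs any count of actual people):

```python
# Order-of-magnitude comparison: possible genome sequences vs. atoms in
# the observable universe. We work in log10 because the numbers themselves
# are unwritable.
import math

genome_length = 3e9                                      # bases, assumed
log10_possible_genomes = genome_length * math.log10(4)   # ~1.8e9 digits

log10_atoms_in_universe = 80    # standard order-of-magnitude estimate

print(f"Possible genomes: a number with ~{log10_possible_genomes:,.0f} digits")
print(f"Atoms in the universe: a number with ~{log10_atoms_in_universe} digits")
```

A number with nearly two billion digits, against one with about eighty: the unborn ghosts win by a margin that no constraint on viable genomes could plausibly erase.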
There is an appetite for wonder, and isn't true science well qualified to feed it?
It's often said that people 'need' something more in their lives than just the material world. There is a gap that must be filled. People need to feel a sense of purpose. Well, not a bad purpose would be to find out what is already here, in the material world, before concluding that you need something more. How much more do you want? Just study what is, and you'll find that it already is far more uplifting than anything you could imagine needing.
You don't have to be a scientist — you don't have to play the Bunsen burner — in order to understand enough science to overtake your imagined need and fill that fancied gap. Science needs to be released from the lab into the culture.