The blast that wrecked my family’s Sunday morning 40 years ago remains my most spectacular, and certainly most memorable, involvement in chemical research. At around 8am, as the first weak rays of Glaswegian sunshine stole across my room and illuminated the walk-in cupboard that acted as my laboratory, there was a detonation of Verdun-like proportions. A cloud of ammonia-rich chemicals whipped across my room, spraying out pieces of laboratory glassware. My mother shrieked across the hall; my father raced into my room, cursing with unsuspected fluency; and I leapt from my bed as glittering fragments tinkled around me.
“Nitrogen tri-iodide,” I gabbled. “I was drying it out and sunlight must have ignited it. It’s used as a detonating agent, you know.” I could tell that my father was unimpressed with this information, because he merely repeated a number of quite specific threats to my physical well-being before he returned, muttering, to his bed.
My chemical “shenanigans” were well known, and were usually indulged, both at home and school. At the end of term, my friends and I would set off magnesium-weedkiller firecrackers; dump potassium permanganate into the school’s hot-water tank so that purple solutions would emerge, rather satisfyingly, from main-block toilet taps; or drop pieces of sodium metal – bought from a local chemicals warehouse – into drains, thus generating clouds of hydrogen that would ignite and send geysers of steam and water across the playground. Our teachers were amused, and occasionally impressed.
We were mere energetic amateur chemists, of course – though no one, ourselves included, was ever harmed by our antics. We also learned a great deal, because chemistry not only lets youthful practitioners make stinks and bangs, it lets them test things and record conclusions with instant – usually gratifying – results. It satisfies youthful inquisitiveness in a spectacular manner.
Or at least it used to, for thanks to a host of health and safety measures that have been introduced over the past decade, the nation’s youth is now denied such stimulation. It has become taboo to allow young people access to anything more harmful than a piece of litmus paper: the chemist who sold us sodium would be jailed and the teacher who turned a blind eye to our petty pilfering of his stock would be sacked. And jolly good, too, you might think. Can’t have our kids blowing themselves up. It’s common sense, isn’t it?
Well no, it isn’t, a point that has been made recently by a growing number of researchers – such as Sir Alec Jeffreys, the Leicester geneticist who discovered DNA fingerprinting. “I am a scientist today only because I was allowed to go through a period of significant danger to myself,” he says. “I had to grow a beard in later life because I burned my face with acid while mucking about with chemicals as a lad. But it was a risk I was willing to take. Doing these things let me satisfy my interest in the world around me. Children simply cannot do that now.”
Nor is the problem confined to chemistry. Take that great college perennial: the geology field trip. Students used to hike into the wilderness, armed only with a hammer (for breaking up samples) and a tent. Then it was decreed they must wear hard hats. “So we set off with some building-site helmets in our minibus,” says the geologist Ted Nield, a lecturer at University College Swansea at the time the decree was imposed. “Then the bus went round a bend, and the helmets fell off their rack and gashed three students’ heads. We hadn’t had an injury until then.”
Or consider an article I wrote recently for the Observer about preparations to celebrate Einstein Year by sending students to this summer’s Glastonbury Festival, where they plan, among other things, to set off rockets powered by Alka-Seltzers. (Mix them with water and use the power of the fizz. Simple.) I was deluged with e-mails from local officials demanding to know more about possible missile hazards to festival-goers. I was stunned. I had been writing about the Glastonbury Festival, home of drugs and rockers, yet these officials were worrying about the dangers of an exploding Alka-Seltzer.
In fact, the list of inane examples is almost endless, many of them drawn from school science classrooms, where the problem seems especially insidious. They include: refusing to demonstrate small steam engines in case they should blow up; avoiding fractional distillation of crude oil (to show its different components) because of the dangers of causing cancer; and a ban on burning peanuts (to show their high calorific content) because someone might suffer a nut-allergy reaction. These are all real examples, though health officials say they do not seek such bans; it’s just nervous teachers who are overzealously interpreting the safety rule book. Such experiments are not officially forbidden, they insist.
And true enough, the Safeguards in the School Laboratory manual does not actually forbid laboratory antics, but equally it does nothing to encourage them. Indeed, it takes great pains to stress the dangers of everything from working with detergent enzymes to the taking of samples of breath, which could cause “dizziness or fainting from forced breathing”. And in the case of the peanut-burning experiment, teachers are warned that the consequences of a student suffering a nut-allergy reaction “could be severe”.
Small wonder that teachers don’t bother. Who needs a lawsuit on top of crap pay and lousy working conditions? And in any case, does it matter? Young people today may not have the fun we had when studying science, but our generation didn’t have the Discovery Channel and the internet. So it all balances out, you could argue.
But it doesn’t. Take the issue of technical training. The government places great stress on Britain developing powerful knowledge-based industries, for which it will need a cadre of well-trained scientists. But where are they supposed to come from? Not our schools or universities, it would appear. “We get graduates coming into our laboratories who do not know how to weigh chemicals, measure liquids or take samples,” says Tim Hunt, the Cancer Research UK scientist and joint winner of the 2001 Nobel Prize for Medicine. “They should have been taught these techniques in school but haven’t because they are not allowed to go near chemicals, because of all the safety paranoia. So we have to teach them.” Thus many hours have to be taken from valuable postgraduate teaching to inculcate learning that should have been passed on a decade earlier. This is scarcely the way to raise a generation of experts in white-hot UK technology.
The problem goes even deeper, however. In the past, when pupils asked “what happens if . . .?” during a science lesson, they might have been allowed to find out for themselves by being guided through an experiment. Today, their teacher will, at best, scribble a brief explanation on a blackboard before returning to the rigours of the class’s strictly delineated coursework. The chance of them indulging their natural curiosity is blocked and the opportunity to think freely and independently stymied. Thus intellectual curiosity is stifled by fear of health and safety legislation and court action, a point stressed by Jeffreys. “I am quite prepared to stick out my neck and let kids experiment with stinks and bangs. If a couple get maimed or even killed, that will be the price we have to pay for stimulating an interest in science and in getting young people to think for themselves.”
That will doubtless infuriate many. However, I suspect that most researchers support Jeffreys and share his concerns – particularly in the light of a recent national survey which revealed that only four out of ten British people consider themselves informed, in any way, about science. In other words, six out of ten now say they know little or nothing about science, despite many acknowledging its importance to the country. Worse still, the gulf between the “knows” and the “don’t knows” is growing all the time – because people are being disenfranchised from involvement in scientific activities. It’s all too mysterious and dangerous for the likes of you, they are told. From this perspective, the prospects of creating a new generation of freethinking, technically minded researchers and academics do not look encouraging.
The trouble, as David Brooks has pointed out in the New York Times, is that we are living in the age of the lily-livered, where everything is a pallid parody of itself, from salt-free pretzels to the schooling of children amid foam-corner protectors and flame-retardant paper. Or in the case of the science lab, to the distant and anaesthetised demonstration.
Most critics stress the physical problems that we face as we raise a generation unexposed to risk. I would argue, along with Jeffreys, Hunt and the rest, that the real danger is the one posed to our intellectual well-being. Thinking and testing the world is a dangerous business and we should not shrink from it. My father would doubtless have disagreed with me. To fail to think and test, however, will be much more damaging.
Robin McKie is science editor of the Observer