Are our children really bad at mathematics? Will we have enough scientists and technologists in the future? Is our international competitiveness at risk from sub-standard schooling?
People in England could be forgiven for believing that our schools achieve lower standards in mathematics than those in almost any other country in the developed world. When the results of the Third International Mathematics and Science Study (TIMSS) survey of 13 year olds were first published in 1996, the Times (3 July 1996) reported that “English pupils plummet in international maths league”. Several months later, when the results of the TIMSS survey of nine year olds came out, the Express (11 June 1997) reported: “We’re the dunces. English pupils are way behind in maths.” Most other papers carried similar headlines.
Yet pupils in England performed very well on the science tests. Out of 41 countries tested, only the 13 year olds in Singapore, the Czech Republic, Japan and Korea got significantly higher mean scores than the English; those in Sweden, the US, Germany, Switzerland, Hong Kong and France all did significantly worse; those in Hungary, the Netherlands and Australia were roughly on a par. In maths, though the English got significantly lower scores than pupils in 20 other countries, they were still on a par with children in Germany, the US, Norway and Denmark. The results for nine year olds were similar.
So there is a case for more headlines along the lines of “Primary school science pupils third [sic] in the world” (Daily Telegraph, 7 June 1997). As Jim Campbell of Warwick University commented: “Primary science is an unsung success story brought about by teacher enthusiasm and investigative methods, not by whole-class teaching.”
This is not to deny that our pupils are better at some things than at others. In tests of practical mathematics and science, which were taken by pupils in 19 of the countries, 13 year olds in England achieved the second highest mean score, surpassed only by pupils in Singapore. And in written maths tests, English 13 year olds were above the international average for data representation and analysis. On the other hand, they were well below the international average for fractions and number sense, and for algebra. Look at the examples on page 32, compare the English scores with the international ones and ask yourself which skills will be most important to the ordinary citizen in the 21st century. The danger is that, if we go “back to the basics” of number, as so many people advocate, we do so at the expense of the areas in which our children now do well.
But, it is argued, the real problem is our relatively poor performance in maths, because this stops pupils from going on to study the sciences at A-level and beyond. Mathematics educators, particularly those in universities, have frequently expressed concerns about the low standard of maths among entrants to degree courses in the sciences, technology and engineering. The international tests failed to show up this problem because so few of the science items included any mathematics.
These are legitimate concerns. But we still have to ask questions about their significance. Is there a link between educational performance and economic success? Peter Robinson, of the Institute for Public Policy Research, argues that the link is by no means certain. He says, for example, that “in the international tests, students in the United States tend to come out with scores that are often close to those of the English students, which raises a puzzle because the United States remains the world’s most successful major industrial nation with the highest level of per capita GNP”. Furthermore, Robinson has demonstrated that countries’ levels of mathematics and science attainment on the tests “were not correlated in any meaningful way with economic performance as measured by the level of or growth rate in per capita GNP”.
We also need to go beyond the international league tables and ask what particular aspects of our educational system explain particular results. For example, in the latest international tests, 13 year olds in England outperformed those in Hong Kong in science, whereas a decade earlier, in 1984, they performed at about the same level. One possible reason is the introduction of the national curriculum in September 1989, which made science a core subject in primary as well as secondary schools. Another possible reason is the English emphasis on practical activities in science. Reports from 13 year olds who took part in the tests suggest that practical activities in science are more frequent in England than in any other country.
Yet the survey found no pattern of teaching practices that distinguished the higher-achieving countries from the rest. For example, whole-class teaching – hailed by many experts as the key to improvement – was emphasised just as much in some of the lower-scoring countries as in some of the higher-scoring countries. This could be because effective teachers in all countries use a range of approaches, tailored to the topic they are teaching and to the spread of ability in a particular class. Whole-class teaching, for example, is likely to be more effective in a class with a narrow range of ability.
It is also difficult to pin the blame for our nine year olds’ relatively poor performance in mathematics on the quality of teaching in primary schools. The tests of mathematics and science were taken by the same pupils. In most cases, they were taught both subjects by the same teacher.
Though there was some evidence that time spent in maths lessons and time spent on homework were positively related to achievement, the associations did not seem to be strong and the pattern did not hold across all countries. For example, nine year olds in the Netherlands, one of the higher-scoring countries in Europe in mathematics, were given maths homework less frequently than their counterparts in England.
Margaret Brown, of King’s College, London, has argued that the first two large-scale international comparisons of mathematics, which took place in the late 1960s and early 1980s, have had a greater influence on mathematics education world-wide than any other single factor. She says, for example, that “the [1960s study] league tables undoubtedly hastened the political imposition of the English national curriculum . . .” although, as she goes on to say, the results of this study did not suggest that countries with national curricula necessarily perform better.
After the latest international tests confirmed fears about English pupils’ weaknesses in number, the government introduced a numeracy hour into primary schools and put greater emphasis on mental arithmetic in the primary national curriculum.
But it is not just the English who fret over their results. Other countries sometimes revise their practices in precisely the opposite direction. Japanese pupils, for example, performed strongly in the first two international comparisons of mathematics. Their scores for application and analysis (where pupils in England tend to perform well) were not as high as their scores for computation, however. As a result, the mathematics curriculum in Japan was changed to place more emphasis on application and analysis. In Hungary, the relatively poor performance of pupils in the international comparison of reading literacy, compared with their high scores in the comparisons of mathematics and science, led to a number of changes in the way reading was taught and an increase in curriculum time devoted to reading.
Clearly, the results of international comparisons can suggest ways forward. They can improve our understanding of our own educational system. They can inform the debate on reform by allowing us to see how different approaches work in other countries – always provided that we appreciate that what works in the context of one country may not work in another.
Until recently, however, with the notable exceptions of Japan and Sweden, most countries have not used the results of international comparisons in a systematic manner. We should use the results of TIMSS – and of a repeat survey of 13 year olds currently being undertaken in England by the National Foundation for Educational Research – in a considered and constructive way. We should identify and remedy our weaknesses, but we should also build on our strengths.
The writer was a principal research officer at the National Foundation for Educational Research