How do 15-year-old pupils in England compare to other top performers across the world?

31 Dec

I        Overview

In mid-November 2017, the Organisation for Economic Co-operation and Development (OECD) released data from its three-yearly Programme for International Student Assessment (PISA). PISA consists of standardised tests in reading, mathematics and science taken by 15-year-old students in participating countries. The results are used to compare young people across countries, and are also matched against how pupils fared in examinations taken in their home countries.

John Jerrim and Nikki Shure of the UCL Institute of Education produced an excellent analysis of how England’s pupils performed, some of the key points of which are summarised below.

Altogether, 72 countries and economies participated in PISA 2015, including all members of the OECD and the four countries of the United Kingdom. For the first time, China’s entry (previously limited to Shanghai) covered four regions – Beijing, Guangdong, Jiangsu and Shanghai. In England, PISA 2015 was conducted in November and December 2015, with a sample of 5,194 pupils from 206 schools. The majority of England’s participating pupils were born between September 1999 and August 2000, meaning they were in their final year of primary school in 2010/11, and were the last cohort to take the GCSE examinations before they were reformed.

The average science, mathematics and reading scores of pupils in England have not changed since 2006. Our 15-year-olds continue to perform significantly above the OECD average in science, whilst they remain at the OECD average in mathematics. For the first time, in 2015, pupils in England performed just above the OECD average in reading. Although there was no significant change in England’s absolute scores, our performance relative to other countries has changed since 2012 as they improved or declined around us.

The OECD average fell (though only significantly in science), meaning that England’s reading performance was now above average despite not having changed since 2012, and our relative science position improved compared with 2012 because other countries’ average scores dropped.

Whilst performance in England did not change, there were changes in other parts of the United Kingdom, notably declines in average science performance in Scotland and Wales.

East Asian countries continued to dominate the top positions in PISA. Singapore topped PISA 2015 in science, reading and mathematics. China (Beijing, Shanghai, Jiangsu and Guangdong) performed similarly to England in science and reading, and continued to outperform England in mathematics.

The average science score in England (512) was significantly higher than in Northern Ireland (500) and Scotland (497). Pupils in each of these three countries achieved higher science scores than pupils in Wales (485). In reading and mathematics, average scores were similar across England, Northern Ireland and Scotland, with Wales behind the rest of the UK.

Whereas average scores have remained stable in England and Northern Ireland since 2006, there was a sustained 20-point (roughly eight months of schooling) decline in science scores in Wales. Similarly, there was a 15-point (roughly six months of schooling) decline in PISA mathematics scores in Scotland between 2006 and 2015.
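These conversions between score points and schooling time appear to rest on the commonly quoted rule of thumb that roughly 30 PISA points equate to one year of schooling – an assumption on my part, as the figure is not stated above. A minimal sketch of the arithmetic:

```python
# Rough conversion between PISA score points and time in school.
# Assumption: the commonly quoted rule of thumb that ~30 PISA points
# correspond to about one year of schooling (not stated explicitly
# in the report summary).
POINTS_PER_YEAR = 30

def points_to_months(points: float) -> float:
    """Convert a PISA point gap into approximate months of schooling."""
    return points / POINTS_PER_YEAR * 12

print(points_to_months(20))   # Wales's science decline: 8.0 months
print(points_to_months(15))   # Scotland's mathematics decline: 6.0 months
```

On the same assumption, the 264-point science gap reported later works out at about 8.8 years of schooling, consistent with the report’s “more than eight years”.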

Socio-economic inequality in 15-year-olds’ science achievement, as measured by the relationship between pupil background and attainment, varied across the UK. Inequality in pupil outcomes was similar in England, Scotland and Northern Ireland. In Wales, however, the link between socio-economic status and PISA performance was weaker. This was driven by the comparatively weak academic performance of pupils from the most advantaged backgrounds in Wales, relative to their equally advantaged peers in England, Scotland and Northern Ireland.

The proportion of headteachers reporting inadequate or poorly qualified teachers or teaching assistants was similar in the UK to the rest of the OECD. Teacher supply was considered much less of a problem in Northern Ireland and Wales than in Scotland and England. Teachers not meeting individual pupils’ needs also stood out as a particular concern for headteachers in England and Scotland, less so in Northern Ireland and Wales.

II       Science

On average, young people in England scored 512 on the PISA 2015 science test. England’s score remained broadly stable over the past decade since 2006 (when the average for England was 516 points).

(i)         England was among the high-performing OECD countries for science, having scored above the OECD science average in every cycle since 2006. England maintained this performance in 2015, scoring higher than the OECD average of 493, and was in 14th position in the PISA league table for science.

(ii)        Science was England’s top PISA subject in 2015, with 15-year-olds scoring higher in science (on average) than in either reading or mathematics. This was also the case in 2012.

(iii)       Whilst English pupils showed no material improvement in their score since 2006, it was notable that very few other countries managed to increase their scores substantially over the same period.

(iv)       England had a greater proportion of top-performing pupils in science (12%) than the average across members of the OECD (8%). England’s top-performing pupils were amongst the world’s best 15-year-olds in science.

(v)        Whilst the strong performance of high-achievers helped to maintain England’s position in PISA science, there remained considerable inequality in science performance in England. The gap between the highest- and lowest-achieving pupils in science (at 264 points) was bigger in England than the average across industrialised countries (247 points). This was equivalent to more than eight years of schooling. Improving the basic science skills of low-achieving pupils is likely to be key to any future improvement in England’s average science scores.

III     Mathematics

(i)         In England, 15-year-old pupils scored, on average, 493 in the PISA 2015 mathematics test.  The average score has been stable since 2006.

(ii)        Altogether, 10 countries scored at least 20 points higher in mathematics than England, seven of which – including Singapore, Hong Kong and Macao – were East Asian.  Eight countries, all European, scored between 10 and 20 points higher than England.  A further seven countries were ahead of England, but by fewer than 10 points.  This meant that, in mathematics, England was 26th on the PISA league table.

(iii)       Pupils in several countries increased their scores since 2006 – unlike those in England.  Italy had the most rapid rise, of 28 points, compared with its pupils’ scores in 2006.  Pupils in Portugal and Russia improved to such an extent over the last decade that they are now at the same level as those in England.

(iv)       Pupils in other countries, including Finland and Australia, experienced a significant decline in their performance in mathematics since 2006.

(v)        England had a similar proportion of high-achieving pupils in mathematics (11%) to the average across OECD member countries.

(vi)       The gap between the highest- and lowest-achieving pupils in mathematics in England was 245 test points, equivalent to around eight years of schooling.  This was bigger than in most other countries; the OECD average gap was 232 points.

(vii)      The gender gap in mathematics was also pronounced, with boys achieving an average of 12 points higher than girls.  This contrasted with reading, where girls did better, and science, where girls and boys performed equally.

IV     Reading

(i)         The average reading score for England’s 15-year-olds was 500, consistent with the country’s average performance over the last decade.  Because the OECD average declined slightly in 2015, pupils in England performed just above the average for the first time, and England was ranked 19th in the PISA league table.  Singapore, Finland, Hong Kong, Ireland and Canada topped the list.

(ii)        Countries with a similar average reading score to England included Australia, Taiwan, China and the United States.

(iii)       The countries which improved their reading scores most included Russia (up 55 test points), Israel, Norway and Portugal.

(iv)       Top performers in England had scores of 625 or more, substantially above the national average of 500.  There were only seven countries – Singapore, New Zealand, Canada, Finland, South Korea, France and Norway – where the highest-achieving 10% of students had stronger reading skills than those in England.

(v)        In contrast, the lowest 10% of achievers in reading in England scored 371 points or below. There was a difference of around eight-and-a-half years of schooling between the highest and lowest achievers.

(vi)       In only seven countries was inequality of reading performance (as measured by the difference between the highest and lowest achievers) greater than in England.

(vii)      There was also some gender divergence in the scores with girls in England performing around nine months of schooling ahead of boys. However, this was the same as in most other countries.

There was great variation between the PISA scores of pupils in South-East England and those of pupils in the north, with the former performing better. The report notes that the gap reached, on average, “the equivalent of one GCSE grade” in science and reading.  The difference was most pronounced between the South-East and the North-East, West Midlands, North West, and Yorkshire and the Humber.

In England, pupil performance in the PISA tests was mainly affected by “socio-economic status”, “free school meal (FSM) eligibility”, “gender” and “school type”.

V       The four countries in the UK

(i)         The average science score was highest in England (512) and lowest in Wales (485).  Scotland was 497 and Northern Ireland 500.

(ii)        The comparatively high science scores of pupils in England were consistent across all elements of science.

(iii)       There were no significant differences between England, Scotland and Northern Ireland in mathematics and reading.  However, in Wales, 15-year-olds scored significantly lower than the rest of the UK in all three subjects.  In its lowest-performing subject, reading, Wales sat on a par with Lithuania and Hungary.

(iv)       There was a sustained decline in average science scores in Wales from 505 points in 2006 to 485 in 2015.  The same was true for average mathematics scores in Scotland, which declined from 506 in 2006 to 491 in 2015.

(v)        Almost a third of pupils in the UK were low achievers in at least one subject (science, mathematics or reading). Wales had the greatest proportion of low-achieving pupils across the UK.

(vi)       Gender differences were similar across the UK with both genders scoring equally in science, boys scoring better in mathematics, and girls doing better in reading.

(vii)      However, there was a weaker association between socio-economic status and PISA science scores in Wales than the rest of the UK. This was driven by the most advantaged pupils in Wales not achieving as highly as their peers in England, Scotland or Northern Ireland.

(viii)     Headteachers in England were more likely to report teacher shortages being a significant problem compared to the rest of the UK.

(ix)       Across the UK, 15-year-olds spent more time studying science than English or mathematics. Pupils in Scotland, Wales and Northern Ireland reported spending, on average, over an hour more per week studying outside of school than their peers in England.

The report by John Jerrim and Nikki Shure goes into considerable detail on other matters, such as how low-level disruption affected pupils’ learning, the variation of scores by pupil characteristics, differences in achievement among schools, school management and resources, headteachers’ management of staff, and pupils’ aspirations and future plans.

VI     What’s Next

So, there we have it until the autumn of 2018, when the next battery of tests in the three curricular areas will be administered to pupils in year 10.   English pupils did quite well, albeit with room for improvement.   The health warning, however, is that the PISA league table is limited in what it can measure in terms of educational outcomes – essentially constrained to how well 15-year-olds across OECD and other countries are doing in science, mathematics and reading.

Education is much more than these three subjects.   In his book, The Cubic Curriculum, the late Professor Ted Wragg forcefully pointed out that the curriculum was at least three-dimensional, if not multi-dimensional.   The first dimension was concerned with the taught subjects, such as mathematics and music.  The second comprised the cross-curricular themes incorporated into the subjects, such as language and thought.  The third dimension he described as the forms of teaching and learning deployed, such as telling and discovering.

Measuring all of this to assess the quality of education and its outcomes is daunting.   If you then accept that “education must incorporate a vision of the future”, as postulated by Professor Wragg, it becomes well-nigh impossible to measure how well a school/academy is doing on that dimension.

Add to this the importance of promoting creativity and lateral thinking in the curriculum.   Education systems around the world are increasingly being asked to produce creative students, to ensure that our young people can translate a robust vision of the future into reality.

The OECD began considering the assessment of creativity in 2011.  According to Helen Ward of The Times Educational Supplement, “there are strong rumours that this may finally be released in 2021”.

“We are now assessing the feasibility of assessing creative thinking,” Andreas Schleicher, director of education at the OECD, confirmed. “There were, indeed, many other options under consideration but creative thinking was the preference of most countries.”

While this may be the case, can it be done?  The challenges are immense.   How do you enjoy the essence of a perfume forever and a day without releasing it into the atmosphere?  And once released, it dissipates.  That is creativity: it has to be enjoyed rather than measured.

The problem with us in education is that we value everything we can measure, rather than accepting that what is valuable in education – such as the ethos of a school/academy – cannot be measured. The OECD is nevertheless determined to take up the challenge of measuring one of the most valuable aspects of education, which has until now eluded assessment: creativity.  Watch that space.

 
