UK students’ progress in the Programme for International Student Assessment (PISA) frozen

1 Jan

(1)     What is PISA?

In December 2013, the Organisation for Economic Co-operation and Development (OECD) published the fifth survey of its Programme for International Student Assessment (PISA), based on a battery of tests carried out in 2012. PISA assesses the competencies of a cross-section of 15-year-olds in reading, mathematics, science and problem-solving. The focus this time was on mathematics.

PISA charts the extent to which 15-year-old students have acquired key knowledge and skills that are essential for full participation in modern societies. The assessment in the four areas does not just ascertain whether students can reproduce what they have learned but also examines how well they can extrapolate from what they have learned and apply that knowledge in unfamiliar settings, both in and outside school. This approach reflects the fact that modern societies reward individuals not for what they know, but for what they can do with what they know.

Paper-based tests were used, each lasting two hours. In a range of countries and economies, an additional 40 minutes were devoted to the computer-based assessment of mathematics, reading and problem-solving.

Test items were a mixture of questions requiring students to construct their own responses and multiple-choice items. The items were organised in groups based on a passage setting out a real-life situation. Altogether, 390 minutes of test items were covered, with different students tackling different combinations of them.

Students also answered a 30-minute background questionnaire that sought information about themselves, their homes, and their schools and learning experiences.

(2)     The Participants

Altogether, 510,000 students from 65 countries participated – 34 OECD member countries and 31 partner countries and economies, representing more than 80% of the world economy. The students represented 28 million 15-year-olds globally.

Students and their school principals/headteachers also answered questionnaires to provide information about the students’ backgrounds, schools and learning experiences, and about the broader school system and learning environment.


(3)     The Findings

Students from the United Kingdom came 26th overall in this international league – 26th in mathematics, 23rd in reading and 21st in science. Fifteen-year-olds in Singapore, Estonia and Slovenia shot ahead despite the UK spending more than the average on education. The results reveal that since 2009 our position in this league table has flat-lined.

The UK’s average score for maths was 494; in reading it was 499. Both results were broadly the same as the OECD averages for the subjects and placed the country on a par with nations such as the Czech Republic, France and Norway.

In science, the UK’s teenagers scored 514 points, above the OECD average and similar to results in Australia, Austria, Ireland, New Zealand and Slovenia.

However, the results leave the UK lagging far behind leading nations including Shanghai and Hong Kong in China, Singapore, Korea and Japan.

Shanghai-China had the highest score in mathematics, with a mean of 613 points – 119 points, or the equivalent of nearly three years of schooling, above the OECD average. Singapore, Hong Kong-China, Chinese Taipei, Korea, Macao-China, Japan, Liechtenstein, Switzerland and the Netherlands, in descending order of their scores, were the nine other countries/economies that made up the top ten performers in mathematics.

Of the 64 countries and economies with trend data between 2003 and 2012, 25 improved in mathematics. Between 2003 and 2012, Italy, Poland and Portugal increased their share of top performers and simultaneously reduced their share of low performers in mathematics.

Boys performed better than girls in mathematics in 37 out of the 65 countries and economies and girls outperformed boys in five countries.

Shanghai-China, Hong Kong-China, Singapore, Japan and Korea were the five biggest hitters in reading.  Of the 64 countries and economies with comparable data throughout their participation in PISA, 32 improved their reading performance.

On average, 8% of students were top performers in reading (Level 5 or 6). These students could handle texts that were unfamiliar in either form or content and conduct fine-grained analyses of texts. Shanghai-China had the largest proportion of top performers – 25% – among all participating countries and economies. More than 15% of students in Hong Kong-China, Japan and Singapore were top performers in reading, as were more than 10% of students in Australia, Belgium, Canada, Finland, France, Ireland and Korea, among others.

Between the 2000 and 2012 PISA assessments, Albania, Israel and Poland increased their share of top performers and simultaneously reduced their share of low performers in reading.

Between 2000 and 2012 the gender gap in reading performance – favouring girls – widened in 11 countries.

Shanghai-China, Hong Kong-China, Singapore, Japan and Finland were the top five performers in science in PISA 2012.  Between 2006 and 2012, Italy, Poland and Qatar, and between 2009 and 2012, Estonia, Israel and Singapore increased their share of top performers and simultaneously reduced their share of low performers in science.  Across OECD countries, 8% of students were top performers in science (Level 5 or 6). These students could identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations.

The UK spends more on education – £59,889 per student between the ages of six and 15 – than the average across OECD countries, which is £50,951. According to the PISA report, expenditure per student explains about 30% of the variation in average maths results between countries. However, in broad terms, moderate or high spending per pupil does not automatically translate into particularly high or low performance in the subject.

On the credit side, one in eight (12%) of UK teenagers were considered “top performers” in maths, achieving the highest results – a proportion similar to the OECD average. In the UK, around 9% were top performers in reading and 11% in science.

On the debit side, 22% were “low performers” in maths, compared with the OECD average of 23%. This means that, at best, these youngsters can solve only simple maths problems. Around 15% were low performers in reading, and 15% in science.

The results also showed that students from an immigrant background in the UK perform as well in maths as other students, whereas in many other OECD countries they score significantly lower.

(4)     Commentary

PISA results tell a story about what is possible in education by demonstrating what students in the highest-performing and most rapidly improving education systems can do. The findings are intended to allow policy makers around the world to gauge the knowledge and skills of students in their own countries in comparison with those in other countries, set policy targets against measurable goals achieved by other education systems, and learn from policies and practices applied elsewhere.

(a)     Out of the mouths of PISA’s officials

According to Angel Gurria, the OECD Secretary-General, “More and more countries are looking beyond their own borders for evidence of the most successful and efficient policies and practices. Indeed, in a global economy, success is no longer measured against national standards alone, but against the best-performing and most rapidly improving education systems.

“Over the past decade, the OECD Programme for International Student Assessment, PISA, has become the world’s premier yardstick for evaluating the quality, equity and efficiency of school systems. But the evidence base that PISA has produced goes well beyond statistical benchmarking.

“By identifying the characteristics of high-performing education systems PISA allows governments and educators to identify effective policies that they can then adapt to their local contexts.”

Mr Andreas Schleicher, the OECD’s deputy director for education and the creator of PISA, said that the latest results could not be used to judge the Coalition Government’s education reforms: “You couldn’t possibly see anything of what’s been done in the last couple of years.”


(b)        The blame game

Notwithstanding this, following the publication of the results, our politicians wasted no time in engaging in the blame game.

Education Secretary Michael Gove said: “These poor results show the last government failed to secure the improvements in school standards our young people desperately need.  Labour poured billions of pounds into schools and ratcheted up exam grades – yet our education system stagnated and we fell behind other nations.” He added that the performance “underlines the urgent need for our reforms”.

On the other hand, shadow education secretary Tristram Hunt said: “The PISA report is a big wake-up call. Eastern dominance centres on the importance that these high performing education systems place on the quality and status of the teaching profession as the central lever for driving up standards.

“This report exposes the failings of this Government’s schools policy: a policy that has sent unqualified teachers into the classroom and prevented effective collaboration between schools.”

However, the finger-pointing over who is responsible for our stagnant results is not confined to politicians. There is another spat going on, between academia and PISA. Academics are now questioning the credibility of the world’s most influential international education study: researchers uncovered thousands of cases of identical information being submitted for different schools taking part in the last edition of PISA.

German and Canadian academics stated that their trust in PISA’s data was “heavily compromised” by what they found in the results of its school background questionnaires. But the OECD insisted that the data it used was “high quality”.

In July 2013, The Times Educational Supplement published claims from other academics that the statistical model used to calculate PISA’s headline rankings meant they were “useless”, “meaningless” and “utterly wrong”.

The research looked at 71 of the 74 countries that participated in PISA 2009.  The researchers could only find 16 countries where the data they examined appeared to be of high quality. They did not analyse actual test results, but examined the information collected that was used to put the results in context and draw wider conclusions in the PISA reports.

Ten countries were highlighted in the research for having particularly “questionable” data. Three of them were extreme cases, where the academics suggested that the responses to school questionnaires had actually been fabricated by the national research institutes gathering PISA data.

The researchers spotted hundreds of examples of schools where principals had ticked the questionnaire boxes in such an implausibly uniform way that they doubted the accuracy of the data.

They looked at three sections of the questionnaires, covering “school climate” – issues such as levels of teacher absenteeism and student disruption – resource levels, and management practices. For each question, principals were asked to tick a multiple-choice box to indicate the extent to which the problem affected the school. But the researchers discovered hundreds of examples of school leaders ticking the same box for every question.

“Being the guardians of the school’s image and reputation, the principals would be torn between providing factual and school-enhancing responses,” the researchers said. Their paper also suggested that principals may not have had enough time to give considered responses and may not have trusted the survey’s anonymity.

Countries with significant examples of such “questionable” data included the UK and the US.

The study’s co-author, Professor Jörg Blasius of Bonn University in Germany, said the school information provided by principals was crucial if the PISA data was to be robust enough for the wider evaluations for which it was used.

Gabriel Sahlgren, research director at the UK’s Centre for Market Reform of Education, said: “This suggests that many conclusions from the PISA report are invalid, and a lot of academic research that has been based on the data from PISA is also called into question.”

Andreas Schleicher made a robust riposte: “PISA data, both from the test and the questionnaires, [are] validated to high standards. This includes analysis to detect response biases in the questionnaires.

“Pending a more thorough review of the analysis in the unpublished research paper, our assessment is that the response patterns highlighted are in fact quite plausible and do not present evidence of falsification or cheating.

“It is also important to note that the school principal questionnaire responses are used in the analysis of the PISA test results; they do not have any bearing on the test results themselves.”

Writing in the TES, Rebecca Wheater, research manager at the National Foundation for Educational Research (NFER) and the UK’s national project manager for PISA 2012, contested the academics’ criticism in support of Schleicher. “Our evidence suggests that leaders in the UK take participation in PISA very seriously… A number of checks are made to ensure accuracy. The responses of the headteachers and students to questions about similar issues are compared in the England, Wales and Northern Ireland reports and tell a similar story… The vast majority of questions in the school questionnaire are multiple-choice, so it does not seem surprising that a number of headteachers might answer sections of the questionnaire similarly,” she wrote.

Can We Trust Survey Data? The Case of Pisa by Jörg Blasius and Victor Thiessen is expected to be published later in 2014.

(c)      What can we learn from these results?

The question that UK readers will want answered is: “What more can we do to improve our own international standing in the educational league table?” Tony Blair made “Education, Education, Education” his priority during his 10-year tenure as Prime Minister. Michael Gove is bringing in dramatic reforms – changes to the curriculum and the testing and examination systems – to ensure that young people are literate, numerate and have life skills by the time they leave school.

However, a fundamental difference between the high-performing regions and countries of East Asia – like Shanghai and Hong Kong in China, Singapore and South Korea – and the UK and US is that education is a much more valued commodity there than in the West. Families are ambitious for their children and know that education is the passport to a fruitful life. In the UK, the welfare system, which is envied by the rest of the world, appears to have been detrimental to the aspirations of families who have come to rely on it and take its benefits for granted.

On the other hand, research into the lives of many youngsters in China, South Korea and Japan reveals that they live mainly to work, study and succeed in their examinations. Critics aver that they miss out on their childhood; even play is seen against the backdrop of winning and coming first. Recently, we read the case of a bright 13-year-old student in west Beijing who was told off by his teacher for not producing a good piece of work. He returned, depressed and dejected, to the single-bedroom flat he shared with his mother. His mother was not in. Laying down his satchel, he went to the top of the block, flung himself over the parapet and took his own life. When his mother discovered what had happened, she was speechless and distraught; his teacher was devastated. The 13-year-old had been a star pupil who felt that the only way out of temporary failure was death.

There could well be other factors responsible for these successes. In China, families are smaller (a legacy of the one-child policy). Parents rely on their children succeeding at school, are ambitious for them and prepare them for a time when the children will look after them in their old age. Education is also the key that opens the door to affluence. It is unsurprising, therefore, that youngsters in the Far East work much harder than those growing up on our shores.

According to Ann Mroz, the editor of the Times Educational Supplement, “East Asian and Chinese children score well regardless of socio-economic status. Think less Tiger Mother and more an ambush of tigers…” The flip side is that these children have to endure a heavy workload: all-day school, with private classes after school.

In Shanghai, there is considerable rote learning, with 50 to a class. Education is delivered in a monolithic political and cultural milieu. Children are unable to question and think for themselves, and they have little or no time to play and enjoy their childhood. Is this what we want for our children?

The next PISA survey in 2015 will bring in significant changes. Many more regions in China (rather than simply Shanghai and Hong Kong) will be participating, so that there will be one score for the country as a whole. The inclusion of results from Chinese students in the provinces could lower the scores, for Shanghai is atypical of the rest of China. Tom Loveless of Harvard University, writing for the think tank the Brookings Institution, states that 84% of high school graduates in Shanghai move on to university. Also, the per-capita Gross Domestic Product (GDP) of the city is more than twice that of China as a whole; this wealth enables parents to spend more on private tuition than the Chinese average.

China has other characteristics worth noting. Even when PISA is taken by schools in the provinces, the results could be questionable because in some rural areas the attendance rate at secondary level is very low – around 40%. Those who remain at school come from ambitious families that are strongly committed to education.

Love it or hate it, PISA is here to stay and should be used as a valuable benchmark to improve the provision we make for our youngsters, whatever our standing in the international league table. Sir Michael Barber, chief education adviser at Pearson (the publishers) and co-author with Saad Rizvi of the Incomplete Guide to Learning Outcomes, identifies five learning points.

(i)         We must provide our schools with both autonomy and accountability; the two must not be disentangled.

(ii)        We must invest in good teaching.   We have to recruit those who have the greatest academic and pedagogic talent from our universities. Further, the best teachers should be encouraged to go to our most challenging schools.

(iii)       We need to give equal attention to the above-average, average and below-average students.

(iv)       Providing pre-school education for our children is an investment for the future, not an expense.

(v)        We have to persist and not moan and groan every time we falter at a hurdle.   Success is 99% perspiration and 1% inspiration. (My words and not Sir Michael’s.)

PISA’s global reach has had an impact not just on the students of the 65 participating countries but also on those beyond. The international tests have influenced Gove’s educational reforms in England and have shaped the benchmarking to which our schools are subjected.
