Assessment of pupils’ achievements and progress grows in complexity

27 Aug

I           School developments

At an institutional (meta) level, it is a self-evident truth that information on pupils’ progress and achievement is essential when embarking on an exercise to improve the quality of educational provision at a school.  Teachers use performance data (and there is plenty of it about) to set challenging targets and to inform future planning, so that they encourage pupils to fulfil their potential as fully as possible.

School governors have three sources of data:

i.            RAISEonline (see here and here) – a mass of information issued by Ofsted and the DfE in the second half of the autumn term – which analyses the results and progress of the pupils over the last few years, setting the data out in a national context.

ii.            The Fischer Family Trust (FFT) analysis (see here), which sets out how well the pupils at the school are currently doing and what they should be attaining at the end of the key stages.

iii.            Ofsted dashboard (see here), where governors can access a summary of the data that are presented in RAISEonline.

A health warning about Ofsted’s dashboard is apposite at this point.  The dashboard summarises end-of-key-stage results and does not provide detailed information about the progress that pupils are making year on year, something that inspectors scrutinise when they visit schools.  For this, governors rely on their headteachers to provide information so that it can be clinically scrutinised.

In partnership with the National Governors’ Association and the Wellcome Trust, the FFT has produced a Governors’ Dashboard (see here) – providing the bespoke information that enables governors to support and challenge their schools better.  In particular, there are data on the overall performance of the school, an analysis of the core subjects at primary level and the GCSE subjects at secondary, a simple chart showing the progress made by different pupil groups over three years, a breakdown of the ethnic groups of pupils subdivided by gender, and figures on the attendance of pupils.

II          The national scene

The national scene is changing. Britain is to have at least two sets of GCSE examinations, with different grading systems.  The extant one will be retained in Wales, but England is set to have a new raft of subject curricula and GCSEs to test them.  It was rumoured that the revamped examinations could be known as Intermediate or I Levels to distinguish them from the non-reformed GCSEs in Wales.  However, when the official announcement was made, there was no mention of the proposed new name.

The changes for England will include the following.

(a)        A* to G grades will be replaced with grades from 8 (the top) to 1 (the bottom), and it will be tougher to attain the levels than is currently the case.

(b)        Modules will be abolished.  All subjects will be examined at the end of a two-year period of study.

(c)        There will be a drastic reduction in re-sit opportunities, with all repeats in the summer except for English language and mathematics, for which November re-sits will be permitted.

(d)        There will be a reduction in coursework, which is to be used only where exams cannot test certain skills or knowledge.

(e)        Tiered examination papers for students of different abilities will be retained only in mathematics and science.

A letter from Michael Gove, the Secretary of State for Education, to Leighton Andrews, Education Minister in Wales, and John O’Dowd, Education Minister in Northern Ireland (leaked in May 2013), stated that the three nations had to go their separate ways on examination regulation.

The reformed examinations will be known as GCSEs (England), adding to the current confusion whereby over 100 countries take the International GCSEs (IGCSEs), which have the A*-G grading system.

The reformed GCSE subject content consultation document invites stakeholders to ask questions and comment on the proposed content for

(i)         English Language;

(ii)        English Literature;

(iii)       Mathematics;

(iv)       Science;

(v)        History;

(vi)       Geography;

(vii)      Modern Languages; and

(viii)     Ancient Languages.

The proposals mention, among other things, the following.

(a)        In English Literature, students will have to study a full Shakespeare play instead of only extracts.

(b)        In English Language, greater weight in marking will be placed on correct spelling, punctuation and grammar.

(c)        The study of British History will make up 40%, rather than 25%, of the syllabus.

(d)        The Geography syllabus will involve two types of fieldwork assessed by examination.

(e)        Only Practical Science will retain an element of coursework; otherwise all subjects will be tested with end-of-course examinations (a course being two years’ study).

Syllabuses for the Arts and Physical Education will be published later this year.

The Department for Education sought views on the proposed subject content and assessment objectives for the GCSEs in English Language, English Literature, Mathematics, Biology, Chemistry, Physics, Combined Science Double Award, Geography and History, to be introduced for first teaching from September 2015.  It also sought views on a draft content framework for Modern and Ancient Languages, to be introduced for first teaching in September 2016.

The deadline for responding to the consultation document – which can be accessed at this address – was 20 August 2013.

The examination regulatory body also consulted on reforms to the examinations of these subjects to match the changes coming in, which can be downloaded here. The deadline for responses given was 3 September 2013.

On 23 August 2013, the GCSE results were published.  For the second consecutive year, they were slightly down.  This was blamed on the number of times students were entered for the same subjects before the age of 16.  Altogether, 507,568 students sat at least one GCSE examination at 15 years of age for the first time, and 89,353 sat mathematics for the third time.  Two students had taken the examination eight times.  Schools and students are frantic about improving their grades.

The downward trend in 2013 for the different elements is set out in the table below.

Category                                          2013     2012
Achieving 5 A* to C grades                        68.7%    70.0%
Students entered for both GCSEs and IGCSEs         8.2%     8.3%
Boys achieving 5 A* to C grades                   63.7%    65.4%
Girls achieving 5 A* to C grades                  72.3%    73.3%
All boys and girls achieving 5 A* to C grades     68.1%    69.4%
Achieving A and A* grades                         21.3%    22.4%
Achieving A* grades                                6.8%     7.3%

III       The International Picture

The most powerful measure for assessing how well countries are doing at international level is the Programme for International Student Assessment (PISA) which is run by the Organisation for Economic Cooperation and Development (OECD).

In the 54th issue of Governors’ Agenda, we published the following information about PISA.

To date, over 70 countries and economies have participated in PISA, although the OECD itself comprises 34 countries.  The students chosen to take the tests in reading, mathematics and science come from mixed backgrounds.  In addition, school headteachers and principals are invited to complete questionnaires to provide information on the backgrounds of the students and the manner in which the schools are run.  In some countries, parents, too, fill out separate questionnaires.

The papers which students tackle test not so much the knowledge they have imbibed during their education but rather how they apply knowledge and whether they can solve (real-life) problems.  In 2000, the focus was on reading; in 2003 it was on mathematics and problem-solving; in 2006 it was on science; and in 2009 it was on reading again.  The analysis of the assessments carried out in September 2012 is now well under way.  The tests are not directly linked to the school curriculum; context is provided through the background questionnaires, which help analysts interpret the results.

Some of the best systems in the world are in Shanghai and Korea, where children from poor backgrounds do as well as, if not better than, those from the more well-heeled classes.

Shanghai, Korea, Finland, Hong Kong and Singapore were the top five countries/areas in the 2009 tests. The United Kingdom was in 25th place (out of 65), below Canada, New Zealand, Japan, Australia, Estonia, Poland, the United States, Sweden, Ireland, France and Germany.  From 2000 to 2009, the UK dropped from seventh to 25th place in reading, eighth to 27th in mathematics and fourth to 16th in science.

An analysis of the 2009 results revealed that

(1)        successful school systems provide all students, regardless of their socio-economic backgrounds, with similar opportunities to learn; systems that combine high performance with an equitable distribution of learning outcomes tend to be comprehensive, with teachers embracing diverse student populations through personalised educational pathways;

(2)        in countries and schools where students repeat grades, overall results tend to be worse and socio-economic differences in student performance tend to be wider, suggesting that students from lower socio-economic groups are more likely to be negatively affected;

(3)        in countries where 15-year-olds are divided into more tracks based on their abilities, overall performance is not enhanced; and the younger the age at which selection for such tracks first occurs, the greater the differences in student performance by socio-economic background by the age of 15, without improved overall performance;

(4)        most successful school systems grant greater autonomy to individual schools to design curricula and establish assessment; 

(5)        after accounting for the socio-economic and demographic profiles of students and schools, students in the OECD countries who attend private schools show performance that is similar to that of students enrolled in public (maintained) schools; 

(6)        school systems considered successful spend large amounts of money on education and tend to prioritise teachers’ pay over smaller classes;

(7)        in more than half of all the OECD countries, over 94% of 15-year-old students reported that they had attended pre-primary (nursery) classes or schools; and

(8)        schools with better disciplinary climates, more positive behaviours among teachers and better teacher-student relations tend to achieve higher scores in reading. 

In July 2013, PISA came in for much criticism from a group of academics.  In a published paper, Professor Svend Kreiner, a statistician from the University of Copenhagen in Denmark, said that an inappropriate model was being used to calculate the PISA rankings. He challenged the reliability of the results, demonstrating how outcomes fluctuate significantly according to which questions are used.  He averred that, on the 2006 results, Canada’s reading ranking could have been positioned anywhere between 2nd and 25th place, Japan’s anywhere from 8th to 40th and the UK’s from 14th to 30th.

Dr Hugh Morrison from Queen’s University Belfast went further, alleging that PISA’s system contained a “profound” conceptual error.

The questions used by PISA varied among countries and among the students participating in the same assessments.  In 2006, for instance, half the students were not asked any reading questions but were allocated “plausible” reading scores to help calculate their countries’ rankings.  To work out these scores, PISA used the Rasch model, a statistical way of “scaling” up the results it does have.

Professor Kreiner stated that this model would be valid only if the questions were of the same level of difficulty in each of the participating countries.  He alleged that his research demonstrated that the levels of difficulty varied, adding that it was meaningless to compare reading in Chinese with reading in Danish.
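The Rasch model at the heart of this dispute can be sketched in a few lines.  The version below is a minimal illustration, not PISA’s actual machinery: under the Rasch (one-parameter logistic) model, the probability that a student answers an item correctly depends only on the gap between the student’s ability and the item’s difficulty.  All the numbers here are hypothetical.

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Probability of a correct answer under the Rasch (1PL) model:
    P = 1 / (1 + exp(-(ability - difficulty)))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Hypothetical values: when ability exactly matches item difficulty,
# the chance of a correct answer is 50%.
p_matched = rasch_probability(ability=0.0, difficulty=0.0)   # 0.5
# An able student facing an easy item is very likely to succeed.
p_easy = rasch_probability(ability=1.0, difficulty=-1.0)     # ≈ 0.88
```

Kreiner’s objection follows directly from this structure: the model assumes each item has a single difficulty value that holds for every country.  If an item is, in effect, harder in one language than another, that assumption fails and the “plausible” scaled scores built on it become unreliable.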

Andreas Schleicher, deputy director for education and special adviser on education policy to the OECD’s secretary general, responded with a stinging riposte in an article in the Times Educational Supplement (TES) on 2 August 2013. (See here.)

He wrote: “…the range of possible ranks has never been a secret. Any assessment of the skills of people, whether it is a high-school exam, a driving test or an international assessment such as PISA, will have some uncertainty because the results depended on the tasks that were chosen, on variations in how the assessment was administered and even on the disposition of the person taking the test”.  Schleicher added that the goal of PISA was not to eliminate uncertainty but rather to design instruments that provide robust comparisons of the performance of education systems “in ways that reflect any remaining uncertainty”.  The lay person could be forgiven for finding this difficult to comprehend.

He went on to acknowledge that variations were par for the course because they could not be eliminated.  Teachers in different schools (let alone countries) emphasise different aspects of learning. If the focus in one institution is on algebra rather than geometry, its students are more likely to do better in the first subject and not so well in the second.
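The point about unstable rankings can be illustrated with a toy simulation (the country names and score figures below are invented, not PISA data): when countries’ true average scores are close together and each test administration effectively samples a different set of questions, the observed ranking shuffles from one draw to the next.

```python
import random
import statistics

random.seed(0)

# Hypothetical countries with closely spaced "true" mean scores.
true_means = {"A": 500.0, "B": 498.0, "C": 496.0}

def observed_ranking(n_items: int = 30, noise_sd: float = 10.0) -> dict:
    """Simulate one test administration: each country's observed mean
    is the average of noisy item-level scores, then rank the countries."""
    observed = {
        country: statistics.mean(
            random.gauss(mu, noise_sd) for _ in range(n_items)
        )
        for country, mu in true_means.items()
    }
    ordered = sorted(observed, key=observed.get, reverse=True)
    return {country: ordered.index(country) + 1 for country in ordered}

# Collect the set of ranks each country receives across many draws.
ranks_seen = {country: set() for country in true_means}
for _ in range(200):
    for country, rank in observed_ranking().items():
        ranks_seen[country].add(rank)
# With true scores only a few points apart, each country's rank
# varies across administrations even though nothing real has changed.
```

This is a deliberately crude sketch of the mechanism, not of PISA’s sampling design, but it captures the substance of both Kreiner’s complaint and Schleicher’s concession: a single published rank conceals a range of plausible positions.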

Michael Gove, the Secretary of State for Education, needs to pay heed to this academic spat because he has set considerable store by the PISA rankings.  Now we know from Schleicher himself that the rankings can vary depending on the areas of knowledge and skills that are being tested.

IV        Reflections

Whatever the instruments that are used for assessing pupil progress and achievement in the educational disciplines, making judgements about the quality of education being provided by institutions on the basis of the standards reached and progress made by our youngsters will always be limited.  Education is about developing all aspects of the human condition.  We tend to value what is measurable but are unable to measure everything that is valuable.

PISA is now keen to measure pupils’ creativity and curiosity.  Because of globalisation, Schleicher considers it important to test how well our youngsters are able to make sound judgements and deal with uncertainty and ambiguity.  Wellington College in Berkshire is spearheading the “development of happiness” on the curriculum.  This adds to the long list of important, albeit elusive, aspects of education that we currently find difficult, if not impossible, to measure – such as initiative and creativity.

The list of famous people who did not have their higher education calibrated is long.  Bill Gates, the Microsoft creator, dropped out of Harvard without a degree to establish his software company.  Steve Jobs of Apple fame lasted at Reed College in Portland for only six months before propelling himself into a career in ground-breaking technology.  Winston Churchill and the Duke of Wellington had no degrees.

Ernest Hemingway worked briefly as a reporter in Kansas City before leaving school to become an ambulance driver in the First World War.  Mark Twain was a printer’s apprentice at the age of 12.   Tom Stoppard, P.D. James and Charles Dickens do/did not have degrees.  The same went for Richard Branson, Thomas Edison, Abraham Lincoln, George Washington and John Lennon.     Lady Gaga dropped out of New York University’s Tisch School of Arts.

History, in due course, will make judgements on John Major (who left school with two O Levels), David Cameron (who attained a first at Oxford University) and Boris Johnson (who secured only an upper second-class honours degree, also from Oxford University).

On the other hand, less is known about the vast number of people who fail to pursue education through school, college and university and end up in grief.  Nevertheless, is there a lesson here: that, important as measuring progress and achievement may be, they comprise only a small segment of the education circle, of which we would do well not to lose sight?


