A level results come out next week, with GCSEs the week after, and many thousands of young people and their families are waiting in nervous anticipation.
In recent years, we at Ofqual have made a conscious decision to use this time ahead of results to talk openly about how the system works and to point out any trends or changes likely to make results look different in a particular year.
This year, we have spoken of some significant changes that came into effect this summer: a return to end-of-course exams in GCSEs; no January exams for A levels; a new structure for GCSE English and English Language; and strengthened GCSE Geography qualifications. Changes to the qualifications themselves can make a difference school by school, but we aim to hold national standards steady, subject by subject.
Increasingly, we are seeing noticeable changes in the number, mix and age of students taking GCSE exams – most especially in English, maths and the science subjects. We published provisional figures in May detailing the most significant changes this year. Such changes make a difference to how the results look overall. For example, if a greater proportion of students take a particular GCSE subject in year eleven rather than year ten, national results may well rise slightly in that subject (because year eleven students generally do better than younger students). However, it is never quite that straightforward, because other factors come into play – for example, a shift between GCSEs and Level 1 and 2 Certificates, commonly referred to as iGCSEs.
All the more reason, then, for us to be as sure as possible about the strength of awarding in each qualification, and to be clear and open about how we set expectations and ensure a common and consistent approach across exam boards. We have explained our approach as regulator, and we talk openly about what all of this is likely to mean for the results that come out over the next two weeks.
But I know that the way these issues are reported can increase the anxiety and worry for individual students and their parents, with speculation about pass rates and grade boundaries. Students and parents can, however, have confidence that amongst all these changes, we are making sure that standards are held steady. We believe that, overall, students should not be disadvantaged (or indeed advantaged) because of changes to the qualifications, and the expectations we set for exam boards are based on that principle. You can read more about our approach on our website.
Of course, individual students will have responded differently to the changes in the qualifications. With the move away from taking exams, and resitting them, throughout the two years of the course, students may well feel less certain of success, but they can be reassured that our approach takes those changes into account: the bar has not been set higher this year than last, so students who would have succeeded before these changes should succeed now, all other things being equal.
However, we can expect some schools to be affected more than others, because no two schools or teachers are the same, and they may have approached the changes in different ways. Increasingly, though, the biggest influence on the national pattern of results year by year comes from changes in the number, age and mix of students entered for each qualification.
While I am on this subject, I should just say that I was disappointed to read in the Sunday Times this weekend a reference to grade boundaries being ‘fiddled’. Quite frankly, such a comment is unhelpful, unwelcome and simply not true.
Grade boundaries are set by the exam boards at the end of each exam series. They can differ from one exam to the next, and frequently do, to reflect differences in the question papers and also to make sure that students are not disadvantaged when qualifications change. It is a legitimate part of making sure standards are set properly.
Please do look at the information on our website about how awarding works, and what we have already said about this summer’s results.
Glenys Stacey
Chief Regulator