The media has made much of the ‘first overall drop in A level results for 30 years’. It seems quite a story, at first sight, but I feel I should add a voice of reason. The fall in passes at A* to E was 0.1%. In the grand scheme of things, the difference is well within the bounds of normal variability. It is not meaningful in itself. Instead, what we saw last week was a remarkably stable set of results despite significant changes to the qualifications themselves. A level results have been stable for the last few years, as we have held standards steady, and this is perhaps the real story for A levels and GCSEs: there will always be some variation year on year, reflecting changes in the mix of students and their subject and qualification choices. School results vary from one year to the next, as they are bound to do, but standards are maintained at a national level unless there is clear evidence to suggest they should change.
Variations at school level are normal, unavoidable and to be expected. Schools and colleges are used to year on year variation in their results - for example, because of differences in their students’ abilities from one cohort to the next. While schools will see some variation in their results each year, the amount of variation is usually greater when qualifications change, as teachers and students are less familiar with them and adapt to the changes differently. Some findings from our research on variability in GCSE results in recent years can be found here.
We expected greater school level variation in A level results this year, because of the removal of January exams. Early indications are reassuring: so far we have identified small rather than large differences in patterns of variation, but we have more analysis to do in the coming weeks, so that we can all see how things actually played out. Some findings from our initial research on this summer’s A levels can be found here.
We make sure exam boards hold the bar steady for GCSEs at a national level, just as we do for A and AS levels. Nevertheless, we expect that GCSE results this year will look different to those of last year, for two reasons. First and foremost, there are significant differences in the student mix this year: there are significantly fewer GCSE entries from 15-year-olds; more students than ever have taken IGCSEs rather than GCSEs; and there have been changes in the mix of subjects that students take. These changes are likely to have noticeable effects on the pattern of results.
Secondly, the changes to the qualifications themselves are more far-reaching and cover a broader range of subjects and students than at A level. In all subjects, students are now examined at the end of the course. In GCSE English and English Language, a greater proportion of the assessment is by examination, and students’ speaking and listening skills are being reported separately for the first time and no longer count towards the grade. This is also the first year of results for new and more comprehensive Geography specifications. Different schools will have been affected differently by these changes.
I have seen a lot of comment in the press about an expected ‘dip’ in results. We do expect the picture of results to look different this year to last, but in comparing them we will not be comparing like with like. Despite any apparent differences, we have made sure that exam boards hold standards steady. We do not think it would be right for students overall to be disadvantaged, or indeed advantaged, by changes made to the qualifications. There has also been speculation about grade boundaries. As I said in my last blog, grade boundaries are set by the exam boards at the end of each exam series. They can differ from one exam to the next, and frequently do. This is a legitimate part of making sure that standards are set properly and are comparable from one year to the next, whatever changes are taking place.
Glenys Stacey
Chief Regulator