

How we work to ensure the best quality of marking

Posted by: Glenys Stacey, Posted on: 29 September 2014 - Categories: A levels and GCSEs

You may have seen stories in the media recently about GCSE, AS and A level marking. I have been thinking about these, and about how best to enable people to form a fair view of the quality of marking.

First of all, it is useful to see marking in context.

Some 15 million scripts and about 100 million individual answers to exam questions are marked by over thirty thousand markers (mostly teachers) over a relatively short period each summer. Other countries face the same challenge – getting marking done to time and to standard – and tend to have similar approaches to ours, in that they rely on a small army of teachers and others to mark in tight time windows. What sets us apart is the scale of the operation. We looked recently at the detail of marking arrangements in Hong Kong. Overall, the arrangements are very similar to those here, albeit that in Hong Kong markers go to designated centres to mark, rather than marking in their own homes. However, the operation is on a much smaller scale, and that brings some advantages: with a few thousand rather than thirty thousand markers, it is clearly easier to coordinate.

In recent years we have seen a significant transition. Most GCSE, AS and A level marking is now done on screen. Multiple-choice questions can be, and are, marked readily, and can be marked accurately without specific expertise in the subject or knowledge of the syllabus. For other types of question, markers are sent student work electronically, rather than paper scripts. Sometimes markers mark whole scripts; sometimes they mark bundles of specific questions. Electronic marking has its advantages: no loss of scripts in transit, faster turnaround, the ability to target the most challenging questions to the most experienced markers, the ability to monitor markers and the quality of their work as they work, and the chance to re-allocate work instantly if necessary.

Next, it is useful to look at the evidence available about the quality of marking here. We know that qualification grade changes represent just 0.6 per cent of all GCSE, AS and A level certifications. On this evidence, we might conclude that a little over 1 in 200 exam papers contained a marking error or inconsistency, but that assumes that all unchallenged grades are as correct as those challenged. We know that in live monitoring, the proportion of examiners who were themselves graded by exam boards as Grade 5 (the lowest) was less than 2 per cent, but that does not demonstrate the quality of final marking – because it is a measure of markers rather than marking, and because rogue markers are rooted out by these live checks and their work remarked ahead of results. In our view, there is not enough evidence available at the moment about the quality of marking (see our Quality of Marking report). We are requiring exam boards to agree the best metrics for measuring and reporting on marking quality, and we will work towards the publication of an agreed overall reliability measure for their examinations. In future, we want to see more comparable information about marking quality.

If the vast majority of markers are found to be proficient, why are schools and colleges concerned about marking? Incorrect marking has a significant impact on a student, even when a wrong mark is later corrected. One mistake undermines faith for quite some time, and understandably so. What is perhaps most disconcerting is that, on rare occasions, an original grade is so far from expectations, and is then revised so radically, that it makes the headlines, undermines confidence and traumatises the student concerned. In our experience, those significant changes in a grade are not generally down to rogue judgements on the part of a marker. Instead, they are usually due to a system or process problem – for example, a marker mistakenly recording the mark for a single question as the mark for the whole script, or missing part of a student's script.

We know also that as exam boards upgrade their IT and other systems, there is always a risk. Generally, the move to electronic marking has gone well, but all systems and system changes carry risk, and although we expect exam boards to manage those transitions and improvements well – and they generally do – it is inevitable that sometimes things do not run as smoothly as planned. Whatever the cause of a marking error, the effect on the student is just the same. In any large-scale system mistakes happen, but we want them to be kept to a minimum, to be corrected quickly when they do happen, and to be understood and explained – so that we can each know what happened and so that exam boards improve where they need to.

We are making changes that, from next year, will enable and require exam boards to categorise errors so as to distinguish system error from individual marker error, and to make that information public so that all can see how such mistakes happen. And when significant system problems arise, we require exam boards to make changes to stop the same thing happening again. There is more to be done. We have research in hand looking at how best to design mark schemes, as one proven way to improve the quality of marking itself. Here we will be looking, amongst other things, at the particular challenges of marking question responses that show a high level of knowledge but do not necessarily answer the question set. More immediately, we are carefully evaluating the mark schemes exam boards are developing for the new, reformed GCSEs, AS and A levels. And we are about to consult on proposals to improve the appeals system, so as to make it fairer, more effective and more transparent. We will welcome your views.

Glenys Stacey
Chief Regulator
