The decisions of a single small exam board have caused upset and confusion in schools up and down England. But, in a reflection of deeper failings in our school system, no one seems to be answerable.

The West Midlands-based family of RSA Academies had a strong set of GCSE results overall this year, with performance meeting or exceeding expectations in most subjects. The notable exception was English in those schools which took the IGCSE examination offered by Cambridge International Examinations (CIE). These results were significantly lower than expected.

As everyone was surprised, not to say shocked, by the English results, we’ve explored whether the experience of RSA Academies has been replicated in other schools that took the English IGCSE.

The simple answer is a loud and angry ‘yes’. We found many schools with similar trends, their results from CIE IGCSE entrants wildly out of sync with both expected achievement and achievement in other subjects. For some schools, this meant a major decline in English pass rates, with the percentage of students achieving A*-C in English dropping by more than 20 percentage points compared with 2014.

Other schools saw unexpected increases, or some students doing far better than expected in a cohort which otherwise underperformed. The overall picture is one of inconsistency and confusion. For example, one senior manager in an academy chain told us he had had two sets of English mock exams externally moderated to ensure as accurate a picture as possible, yet his students’ final exam grades were still hugely different from what was expected.

Unsurprisingly, many schools have responded to these dismaying results by sending scripts to be re-marked. But, although many of these papers are only one or two marks off a grade boundary, very few grades are being changed. One head teacher told us they sent off around 70 exam scripts for re-marking only for them to return ‘virtually unchanged’, an experience shared by others. Considering the controversy these grades have generated, this looks like a top-down policy, not a case-by-case assessment.

The independent sector has also been hit. Earlier this month, Chris King, the Head of the Headmasters’ and Headmistresses’ Conference (an association of the head teachers of 243 independent schools), described the exam marking system as ‘not fit for purpose’, with students being given ‘frankly unbelievable marks’. King identified ‘a lack of full accountability by exam boards, an inadequate examiner workforce, a confusing and Byzantine appeals process’.

In response to a tide of anger, Cambridge International Examinations has sheltered behind the assertion that, overall, the proportions achieving different grades this year are similar to last year's. But this is simply to turn the problem into the explanation.

Our research strongly suggests that the national exam result tables soon to be published by the Department for Education will show that below the national headlines there have been major school-level fluctuations. Indeed, the root cause of the problem may lie in a major change in the number and make-up of the entrants for CIE exams.

In 2014, there were 121,000 candidates; in 2015, there were 202,000. The make-up of entrants also seems to have changed substantially in character, with more high-performing schools opting for the exam. This is partly because the CIE examination uniquely retains aspects of speaking and listening and continuous assessment which are to be phased out by the Government by 2017. The change in the cohort is likely to have stretched the exam board’s capacity.

Beyond this, the Ofqual requirement that grade boundaries stay fairly consistent (with significant rises in attainment politically frowned upon as supposed evidence of ‘falling standards’) may have played a role by requiring exam boards to adjust grade criteria so they would generate the required norm-referenced distribution of attainment.

But, whatever the reasons behind this debacle, the effects are manifold and major. Secondary schools live or die based on how well their students do in English and Maths GCSE. Many IGCSE schools have seen their overall 5 A*-C rate drop substantially. Most seriously, some will have dipped below the dreaded 40% floor level, which results in mandatory intervention from government.

Perfectly good head teachers and teachers may lose their careers and perfectly good schools may be forced to close or be merged as a direct consequence of the opaque decisions of a single small exam board.

The intense frustration felt by teachers who feel their students have been denied grades they deserve is palpable and, at a time of rapidly declining morale among educators, such disillusionment can be ill-afforded.

Ultimately, the biggest impact is on students themselves. Those who failed to achieve a C will now have to continue studying the subject, and will see their prospects in further education, higher education, or employment diminished.

The social justice implications of this batch of results are also substantial. The major impact is on students who were straddling the C/D boundary, who have been unceremoniously shoved into the D bracket. As a high proportion of these students are likely to be from low-income backgrounds there is a direct and regressive social impact, denying low-income students recognition and opportunities they have worked hard to secure.

Whether the background to these events lies in deliberate policy, unintended consequences, or organisational incompetence, the impact on the stability of schools, the careers of teachers and the life chances of low-income students is unambiguous. At the very least, the people affected – especially the students – deserve an honest and credible explanation.

