Students should be able to freely access their marked exam papers. End of argument!

For as long as anyone can remember, one central concern has driven examination reform:
Employers don’t think exams do a good job of differentiating students, and/or they don’t think exams provide the right skills.
But behind this claim usually lies a lot of ignorance. Employers are often surprisingly lacking in knowledge about the content of the exams they malign. They also tend not to have a clear understanding of the level of accuracy or argument students must demonstrate to achieve particular grades. This is particularly true for subjects they never personally studied. It is why those who studied at ‘traditional’ schools often hold mistaken beliefs about subjects such as media studies or psychology, and why people who never studied Latin or Classics commonly do the same.
So, here’s a very simple solution to these problems:
Give every student access to their completed GCSE/A-Level exam papers in an online hub. The student would log in using a password and be able to see all their marked papers. This would then become an ‘exam portfolio’ that the student could grant employers or universities access to when applying for jobs.
This sounds like a hassle to create, but the majority of exam papers are already scanned so that examiners can mark them online. Sure, it might take a few technological leaps to link the exam boards together into one hub. But given that results from the various exam boards already pull through to UCAS using some form of tech-wizardry, I am not willing to believe that the merely difficult is entirely impossible.
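To make the idea concrete, here is a minimal sketch of what the data model for such a hub might look like. Everything in it is hypothetical – the class names, fields and the grant_access helper are mine, not any exam board’s or UCAS’s – but it illustrates the core logic: the student owns the portfolio and chooses who sees it.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class MarkedPaper:
    """One scanned, marked exam script (hypothetical structure)."""
    exam_board: str        # e.g. "AQA", "OCR", "Edexcel"
    subject: str
    qualification: str     # "GCSE" or "A-Level"
    sitting: date
    grade: str
    scan_url: str          # link to the scanned, marked script

@dataclass
class ExamPortfolio:
    """A student's collection of marked papers, plus access grants."""
    student_id: str
    papers: list[MarkedPaper] = field(default_factory=list)
    viewers: set[str] = field(default_factory=set)  # employer/university IDs

    def grant_access(self, viewer_id: str) -> None:
        # The student, not the exam board, decides who can look.
        self.viewers.add(viewer_id)

    def visible_to(self, viewer_id: str) -> list[MarkedPaper]:
        # Papers are only returned to viewers the student has approved.
        return self.papers if viewer_id in self.viewers else []
```

The one design choice worth stressing is that access flows outward from the student: nothing in the hub would be visible to an employer or university the student hadn’t explicitly approved.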
There are several benefits to the system. First, it would enable employers to compare what students actually did in their exams – not just what grades they achieved. If employers don’t want to do this, then we should consider whether their opinions about examinations are more indicative of a desire for recruitment shortcuts than of a genuine struggle to recruit in a savvy manner. Second, if people were willing to share their records (and I suspect many would be), we’d have an easily checkable record of past exams. Hence, when people start saying “Back in 1987 we had to rediscover relativity in 5 minutes without a calculator” and counterbalance this against the idea that today’s kids merely “decide if fish and chips are healthier than apples”, we can call them out and ask for the proof.
Third, and potentially most important: if students were willing to share their papers with their school, it would give teachers a plethora of options for spreading good examples of learning. At the moment it costs a fair bit of cash to get exam papers back, which means schools with healthy budgets can request lots of them and use these to hone their teaching and share past mistakes with students. Poorer schools can’t. That shouldn’t be the way.
Given that the exam answers are created by the individual student, and the marking is paid for by the taxpayer, it seems right to me that the papers ought to go back to the individual as a matter of principle. That doing so would also benefit employers and teachers makes it, to me, even more the right thing to do.

The Vocational Education Trade-Off

Some opponents of vocational education suggest that tracking students into vocational pathways early in school life increases educational inequality. Are they correct? Yes – at least according to a new study by Bol & Van de Werfhorst.
What did they do to find this out?
Using data for 29 countries (including the UK), the researchers scored each country on three things:

  • Level of ‘tracking’ (or setting) – this score included the age at which students are first selected onto academic or vocational routes, the proportion of the curriculum that is tracked, and the number of different pathways
  • Level of ‘vocational enrollment’ – this score was based on the percentage of students doing vocational studies in upper secondary school, and
  • Level of vocational course ‘specificity’ – i.e. did courses include work experience, were they highly job-specific in their content, etc.

The researchers then looked for correlations between these three variables and certain ‘outcomes’, which included the following (a rough sketch of this kind of analysis is given after the list):

  • Youth unemployment as a ratio of adult unemployment
  • Average length of job search
  • Inequality of PISA scores in the country
  • Educational attainment adjusted for social origin
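For readers who like to see the mechanics, the analysis is essentially a country-level exercise: correlate (and then regress) the three system scores against the outcome measures. A rough sketch of that kind of calculation, using invented numbers rather than the study’s real data for its 29 countries, might look like this:

```python
import pandas as pd

# Hypothetical country-level data: three system scores and two outcomes.
# The real study covers 29 countries; these rows are invented for illustration.
df = pd.DataFrame({
    "country":                 ["A", "B", "C", "D", "E"],
    "tracking_score":          [0.9, 0.2, 0.5, 0.8, 0.1],
    "vocational_enrolment":    [0.7, 0.3, 0.6, 0.4, 0.2],
    "vocational_specificity":  [0.8, 0.1, 0.5, 0.6, 0.3],
    "youth_adult_unemp_ratio": [1.8, 2.6, 2.0, 1.9, 2.8],
    "pisa_inequality":         [0.45, 0.30, 0.38, 0.44, 0.28],
})

# Correlate each system characteristic with each outcome measure.
system_cols  = ["tracking_score", "vocational_enrolment", "vocational_specificity"]
outcome_cols = ["youth_adult_unemp_ratio", "pisa_inequality"]

print(df[system_cols + outcome_cols].corr().loc[system_cols, outcome_cols])
```

The published study uses more careful modelling than a raw correlation matrix, of course, but this is the shape of the question being asked of the data.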

What did they find?
The first three findings showed the benefits of vocational education:
#1 – Academic setting has no effect on the youth/adult unemployment ratio. Putting students with others of equal ability doesn’t influence their likelihood of employment.
#2 – Where vocational education is work-specific, it reduces youth unemployment. If you train people to do specific jobs, they go on and do them.

#3 – Young people spend less time looking for jobs in countries with higher levels of vocational enrollment. So the more vocational education is available, the less time young people spend casting about for work.

The next three findings show the problems of the academic selection that occurs in countries with strong systems of vocational education:
#4 – In a more tracked (or ‘set’) educational system, where lower-ability students are ‘tracked’ into vocational classes, variation in student performance across all subjects is more strongly based on social class background.
#5 – Academic tracking enhances the importance of social origin for reading performance. That is, in countries with setting, social origin appears increasingly related to reading performance.

#6 – In more tracked (‘set’) educational systems, social background is a stronger determinant of an individual’s opportunities in school than in non-set systems.
HOWEVER – it is very important to note that what appears to be causing the problem is the setting, not the vocational element. The trade-off appears to be that when countries ‘academically select’ certain people to go down an academic route and give others a vocational one, inequality follows.
HERE’S A THOUGHT: Why can’t academic and vocational studies both be available to anyone of any calibre? What would happen if you simply said that there were no entry requirements for any subject at GCSE, and that all subjects counted as GCSEs/A-Levels, even the ‘vocational’ ones?
It seems to me that you would then get people selecting by interest and gaining the skills required for the workplace, but without the downside of inequality and social class tracking. Or am I missing something here?!

Did Gove Implement Comparable Outcomes?

Part of the furore over the English GCSE Fiasco last summer was the use of something called ‘comparable outcomes’ – a method through which Ofqual limits grade awarding by requiring that the number of grades awarded is comparable to that of the previous year’s cohort, or to the current cohort’s results in exams taken when they were younger.
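As a very rough illustration of the mechanism – and this is a toy sketch, not Ofqual’s actual statistical model – the idea is that the number of grades awarded this year is pinned to an expectation derived from a reference point, such as last year’s results or the cohort’s prior attainment, rather than following raw marks wherever they happen to land:

```python
# Toy illustration of a 'comparable outcomes' style expectation.
# Both figures below are invented for the example.

reference_rate_c_and_above = 0.70   # assume 70% achieved grade C+ in the reference year
this_years_cohort_size = 100_000    # assumed cohort size

# Under a comparable-outcomes approach, roughly this many C-or-above grades
# would be expected this year, regardless of where raw marks happen to fall.
expected_c_and_above = round(reference_rate_c_and_above * this_years_cohort_size)
print(f"Expected grades at C or above: {expected_c_and_above}")

# Grade boundaries are then set so that the number awarded lands close to this
# expectation (in practice, adjusted for the cohort's prior attainment).
```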
Much was made of the role of ‘comparable outcomes’ in the lower-than-expected numbers of students getting English C grades in August 2012. Gove, however, was adamant that the adoption of comparable outcomes had nothing to do with him. In fact, in an Education Select Committee evidence session about the issue he said:

Q4 Michael Gove: The first thing that I would say is that the comparable outcomes framework was something that was designed and adopted before this Government came to power. It was a previous Government that, under the QCA, as it then was, and then subsequently under the previous leadership of Ofqual, adopted comparable outcomes and outlined how it should work, first of all with respect to A levels and then with GCSEs. So the current team at Ofqual are dealing with tools that were designed by the last Government, rather than tools they have had a chance to fashion themselves.

Except that a recent FOI release from the DfE seems to contradict this claim. In a huge (and very delayed) release of correspondence between the DfE and Ofqual, one of the documents has a ‘comparable outcomes’ timeline and specification box on its last two pages. This is what the box shows:
[Image: table showing the GCSE adoption of comparable outcomes]
It’s a bit tricky to make out, but the fifth column is titled “Decision on use of comparable outcomes”. The decisions to use it in new GCSEs in “most subjects”, the “new English suite, maths, ICT” and the “New science suite” were all taken on 6th December 2010. After the current government came to power.
Now, I want to make clear: this does not mean Gove is to blame for comparable outcomes. In the seven months between taking office and this decision it’s entirely probable that Gove didn’t know what was going on at Ofqual. After all, he was busy trying to push through free schools and academies as quickly as possible. Secondly, the decision appears to have been made at an Ofqual board meeting on 6th December, and it’s not clear how the results of these meetings were being fed back to the DfE, if at all. Gove’s ignorance of the matter is therefore plausible, although to argue that the previous government adopted comparable outcomes is perhaps overstating the case.
However, before the policy was fully implemented in the 2011 exam season, there was a report to Ministers about what was going on (although the date is unknown). Then, the FOI release shows how in the Summer of 2011 there were some issues that foreshadowed what would occur in 2012 – particularly in AQA’s History GCSE. So concerned is the tone of some of the released emails that they speak of “letting teachers know” in advance that results are likely to be lower, in order to start managing expectations. It therefore seems likely that Gove knew before Summer 2012 that there would be lower grades. And though he didn’t introduce comparable outcomes, given that the policy was only documented (and not even fully implemented) at the end of December 2010, was it really impossible before Summer 2011 – or even after it – to stop the policy if one were really against it? Or is it possible that the ‘crisis of confidence’ that one email refers to as a possible consequence of comparable outcomes was, in fact, a great way to push for a new exam system?
The many, many FOI releases are here if you want to look for yourself and come to your own conclusion. For what it’s worth, I don’t think there’s any evidence that Gove acted improperly during the GCSE Fiasco, but I do object to him blaming comparable outcomes on the previous administration when its implementation happened on his watch.

Freedom of Information Request regarding GCSE English Coursework Marks

I recently put in a Freedom of Information Request for the distribution of marks in the 2012 GCSE English Controlled Assessments. I did this because I read the @deevybee blog on the Phonics Test data with some interest and wondered if the GCSE English marks had shown a similar pattern.
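The kind of inspection that data would allow is straightforward: plot the distribution of controlled assessment marks and look for any bunching around a grade boundary. A minimal sketch of such a check, using invented marks and an assumed boundary value (neither reflects the real 2012 data), might be:

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented marks purely for illustration; the FOI request was for the real distribution.
rng = np.random.default_rng(0)
marks = rng.normal(loc=45, scale=10, size=5000).round().clip(0, 80).astype(int)

BOUNDARY = 43  # illustrative boundary value only, not an actual Ofqual figure

# A histogram with one bin per mark makes any bunching at the boundary visible.
plt.hist(marks, bins=np.arange(0, 82) - 0.5)
plt.axvline(BOUNDARY, linestyle="--", label=f"assumed boundary ({BOUNDARY})")
plt.xlabel("Controlled assessment mark")
plt.ylabel("Number of students")
plt.legend()
plt.show()
```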
Unfortunately, Ofqual have turned down the request. Their reasoning is shown in a letter here (and below).
The main reason seems to be that they hold the information as part of the Ofqual investigation into the GCSE English fiasco and therefore cannot currently release it. There also appears to be an undercurrent of ‘it wouldn’t be in the public interest’.
I have written back asking them if the information will be disclosable once the investigation is complete.
UPDATE 16/10/12: Ofqual have said that once the English GCSE investigation is complete and the report is published, I can discuss with their statisticians the possibility of releasing controlled assessment information. They also said the Ofqual report is due to be released before the end of October. My bet is on a half-term release date.
[Embedded SlideShare document (id 14705499): Ofqual’s letter turning down the request]

Ofqual: We will publish all correspondence. We have nothing to hide. …(Really?)

On 11th September 2012, during the Education Select Committee meeting convened to take evidence from key players in the GCSE English ‘Fiasco’, three members of Ofqual were present and accounted for their role in the situation.
During the discussion, Glenys Stacey, Ofqual’s CEO, was asked directly by Pat Glass MP:

Pat Glass MP: We accept that there is no phone call between you and the Secretary of State, but if, as a Committee, we decide to extend this Inquiry—and I think we should—would you be prepared to publish copies of correspondence, emails, text messages and phone calls between your staff who are involved in this and senior staff at the DfE who are involved in this and special advisers, ministers’ spads?
Glenys Stacey: Absolutely; we have nothing to hide

However, an FOI request by Antony Carpen asked for correspondence from October 2011 onwards regarding the English GCSE boundaries, specifying that: “Correspondence should include but not be restricted to letters, emails, notes of phone calls and minutes of meetings”.
What Antony Carpen received was this: a document providing *some* emails between the DfE and Ofqual, but only from the day of the results release. And though they are interesting (not least because the DfE had to ask some reasonably naive questions in the midst of this crisis), they do not provide the information asked for by Mr Carpen. It is tricky to understand why this information is missing. The legal explanation given beforehand explains why names have been redacted, especially those of junior members of Ofqual, given the difficult nature of the case. This is absolutely reasonable and fair. What is not clear is why more information has not been included, specifically notes of meetings and phone calls, and correspondence from the period leading up to the publication of results.
Glenys Stacey clearly said all documents were publishable because there was “nothing to hide”.  If they are not now released, one can only wonder what that means for the validity of her statement.