Exam results

It's a long, long time since I was at school, but when you took exams the answers were either correct or wrong. How can they adjust this year's A-level results after the students have sat the exams? Surely the answers to the questions are either right or wrong (in most subjects, anyway). It seems a bit unfair when the media is suggesting that results will be manipulated downward.
 
It's a long, long time since I was at school, but when you took exams the answers were either correct or wrong. How can they adjust this year's A-level results after the students have sat the exams? Surely the answers to the questions are either right or wrong (in most subjects, anyway). It seems a bit unfair when the media is suggesting that results will be manipulated downward.
It's a long time since you were at school, and back then, they arrived at the results in exactly the same way as today. I'll explain if you want.
 
It's a long time since you were at school, and back then, they arrived at the results in exactly the same way as today. I'll explain if you want.
Yes please. If you got the right number of answers you passed; if you did not, you failed. As far as I know it was as simple as that. I know you could get different grades, but that depended on how many questions you answered correctly.
 
The way exam results are manipulated, rather than based on an individual's actual performance, is antiquated...

Many other countries have a better approach.
For example in my partner's country if you get 93% you get a 9.3
If you get 56% you get a 5.6
And it matters not how many students get a good or bad score in any particular year.

It's better for employers and further education as it's a simple and precise system!
 
There has been a change, but I don't know exactly how. Years ago, because they thought it wasn't possible to set papers of constant difficulty, they did it on a percentage basis: a set percentage of candidates would be expected to pass. That gave them a basis to work out grades. Getting the top grade needed something extra, generally thought to be down to how the questions were answered. What they do now, I don't know - things changed rather a long time ago.

One of my lecturers was an educationist involved in this area, and his view was interesting. If anything, they thought average levels of intelligence were dropping, but with no way of being certain it was little more than a feeling. He also mentioned something that became apparent from technical colleges - an odd one: people completing courses that there should be no need for, as they should have achieved those results at school.
 
Yes please. If you got the right number of answers you passed; if you did not, you failed. As far as I know it was as simple as that.
It was never that. What happens is that the exam is set by a panel (think Maths, cos that can be seen as 'right/wrong'), and a mark scheme is developed to match it so that the army of markers are consistent. So far so good. Let's assume a paper is out of 100. In your memory, 50% is 'a pass', less is a fail. Not how it works.

All the papers are marked, and let's say raw scores are low across the country. That says, 'Exam too hard', not 'kids didn't learn enough'. From there grade boundaries are set, and since the paper was too hard, an A grade might need 65 out of a hundred. If it appeared from the raw results that the exam was 'too easy', then an A grade might need 85 out of a hundred.

Grade boundaries are set AFTER the exams are marked. Not to decide what grades 'to give out'; but to decide what grades 'have been earned'.

This method is called 'norm referencing'. When I was training in the RAF, many of my flying training tests were pass/fail questions with a fixed pass mark. That's called 'criterion referencing'.

O level, A level and GCSE have never been criterion referenced.
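The two approaches above can be sketched in code. This is a minimal illustration, not anything an exam board actually runs; the grade shares, raw scores, and pass mark below are invented for the example.

```python
def norm_referenced_grades(raw_scores, grade_shares):
    """Norm referencing: grade boundaries are set AFTER marking, from the
    distribution of raw scores, so a hard paper doesn't depress grades."""
    ranked = sorted(raw_scores, reverse=True)
    boundaries = {}
    cumulative = 0.0
    for grade, share in grade_shares:  # e.g. the top 10% earn an A
        cumulative += share
        # Boundary = lowest raw score still inside the cumulative share.
        index = max(0, round(cumulative * len(ranked)) - 1)
        boundaries[grade] = ranked[index]
    return boundaries

def criterion_referenced(raw_score, pass_mark=50):
    """Criterion referencing: a fixed pass mark decided before the exam."""
    return "pass" if raw_score >= pass_mark else "fail"

# A 'hard' year: raw scores are low, but the top 10% still get an A.
scores = [65, 61, 58, 54, 50, 47, 44, 40, 35, 28]
print(norm_referenced_grades(scores, [("A", 0.10), ("B", 0.20), ("C", 0.30)]))
# -> {'A': 65, 'B': 58, 'C': 47}
print(criterion_referenced(47))  # -> fail (fixed mark, however hard the paper)
```

The key point the sketch shows: under norm referencing an A boundary of 65 simply falls out of where the top tenth of candidates landed, whereas the fixed pass mark fails that same candidate regardless of the paper's difficulty.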
 
The way exam results are manipulated, rather than based on an individual's actual performance, is antiquated...

Many other countries have a better approach.
For example in my partner's country if you get 93% you get a 9.3
If you get 56% you get a 5.6
And it matters not how many students get a good or bad score in any particular year.

It's better for employers and further education as it's a simple and precise system!
That fails immediately when an annual exam is set too hard or too easy. In a hard year, the top score might be 72%, so 7.2. In an easy year that kid might have got an 8.6. Not precise at all.
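To put invented numbers on that objection: under a fixed decimal scale, the grade of the best candidate in the country moves with the difficulty of the paper, while their rank within the cohort does not. The cohorts below are made up for illustration.

```python
def decimal_grade(raw_percent):
    """Fixed linear scale: 72% -> 7.2, regardless of how hard the paper was."""
    return round(raw_percent / 10, 1)

def percentile_rank(score, cohort):
    """Fraction of the cohort scoring at or below this mark."""
    return sum(s <= score for s in cohort) / len(cohort)

# The same best-in-year candidate in two years of different difficulty:
hard_year = [72, 60, 55, 48, 40]   # hard paper: raw scores depressed
easy_year = [86, 74, 69, 62, 54]   # easy paper: raw scores inflated

print(decimal_grade(hard_year[0]), decimal_grade(easy_year[0]))        # 7.2 8.6
print(percentile_rank(72, hard_year), percentile_rank(86, easy_year))  # 1.0 1.0
```

Same ability relative to the cohort, a 7.2 one year and an 8.6 the next under the fixed scale; the percentile rank stays at 1.0 in both years.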
 
It was never that. What happens is that the exam is set by a panel (think Maths, cos that can be seen as 'right/wrong'), and a mark scheme is developed to match it so that the army of markers are consistent. So far so good. Let's assume a paper is out of 100. In your memory, 50% is 'a pass', less is a fail. Not how it works.

All the papers are marked, and let's say raw scores are low across the country. That says, 'Exam too hard', not 'kids didn't learn enough'. From there grade boundaries are set, and since the paper was too hard, an A grade might need 65 out of a hundred. If it appeared from the raw results that the exam was 'too easy', then an A grade might need 85 out of a hundred.

Grade boundaries are set AFTER the exams are marked. Not to decide what grades 'to give out'; but to decide what grades 'have been earned'.

This method is called 'norm referencing'. When I was training in the RAF, many of my flying training tests were pass/fail questions with a fixed pass mark. That's called 'criterion referencing'.

O level, A level and GCSE have never been criterion referenced.
That sounds like a reasonable way of doing it, but it still seems a bit unfair to announce to the press that results are being held down.
 
It's a long, long time since I was at school, but when you took exams the answers were either correct or wrong. How can they adjust this year's A-level results after the students have sat the exams? Surely the answers to the questions are either right or wrong (in most subjects, anyway). It seems a bit unfair when the media is suggesting that results will be manipulated downward.
The simple answer is that most academic questions don't have right or wrong answers; what they have is a range of answers that vary in detail. It's that range that exam boards adjust.

It takes me back to uni. I was a mature student who had been working in the building industry, so, naturally, I thought I knew everything. The first test comes along and I get something like 90% - it was simple, easy stuff. A girl in my tutor group who knew diddly squat about building got something like 93%. When comparing answers, it turned out that she - coming from school and being used to A levels etc. - described everything in detail, whereas I assumed that some things were so obvious they didn't need describing. One of the things I remember was a brick: she described what a brick was, and it got her an extra mark. Huge lesson early on, and I didn't make the same mistake again. Incidentally, the test itself was meaningless in the scheme of things; its real purpose was to do exactly what it did with me - to reinforce the habit of answering questions fully.
 
That fails immediately when an annual exam is set too hard or too easy. In a hard year, the top score might be 72%, so 7.2. In an easy year that kid might have got an 8.6. Not precise at all.
1. Not if everyone takes the same exam, not ones from different exam boards.
2. Results are relative, so if an exam is harder one year than another then you don't get 'grade inflation'.
Could you explain why nowadays A*s and As are often needed to get into uni courses when back in my day Bs and Cs were often sufficient?
Were exams harder then, or students 'thicker'? Could you tell us why, prior to the 'virus', there were year-on-year grade increases for the best part of 20 years, and whether students during that period appeared to be 'brighter' than their predecessors?

3. If all results for all years are available to anyone concerned (as they are in my example), then the exam results are easy to quantify.
4. Most of the countries above the UK in the international education lists have a similar simpler system.
 
Question:
"How many nails go into the building of a house?"

Answer:
"It depends whether you wear protective gloves."

A teacher I know reckons the covid-buggered years were marked far too high for what the kids actually knew. The estimates were based on the best they might have achieved if they'd done a load of work, whereas many actually did nothing much.
With this last two-year cohort, a lot of her students have psychological problems she's sure are a result of the earlier disruption, and maybe world events too. Lots of doom and gloom: depression, inability to concentrate, maladaptive behaviour, etc.
Do you deal with that in terms of adjusting marks? I don't know whether it would be right or not.
 
1. Not if everyone takes the same exam, not ones from different exam boards.
Year after year? Not possible.
2. Results are relative, so if an exam is harder one year than another then you don't get 'grade inflation'.
Not under your system - they'd be absolute if you give them 7.2 or whatever.
Could you explain why nowadays A*s and As are often needed to get into uni courses when back in my day Bs and Cs were often sufficient?
Were exams harder then, or students 'thicker'?
Kids work harder, so standards have gone up. Look at some past papers. The papers got easier for a while (e.g. they took maths out of science), but at some point several years ago they started getting much harder. There's a lot more in them now than there was in my day. I didn't work very hard at all...
3. If all results for all years are available to anyone concerned (as they are in my example), then the exam results are easy to quantify.
What, by prospective employers? No way. They just want a meaningful grade.
4. Most of the countries above the UK in the international education lists have a similar simpler system.
Which, how, and how do you know?
The Korean system is simple - work every waking hour.

It was all those terrible immigrants, from Asia and some other places, who pushed the standards up. All their parents wanting them to be doctors and accountants and lawyers.
Some other countries' parents are keener on balance and being laid back. You can generalise - it shows in the results.
 
It makes sense for grades to be set according to percentiles. It's a great way of normalising the results and ironing out poorly written papers or scoring methods.

My son got his grades back and decided he didn’t want to accept his offer. A quick ring around and he’s had no problem getting in to where he wants.

Foundation years appear to be a bit of a scam to charge another 9k. But most seem to be happy to negotiate away the need if they have places on their courses.

I can see the benefit of a placement year. But doing a foundation year to get a better chance of a 2:1 or 1st seems odd.
 