
Calculated Grades

How do you standardise when the only standard is the Leaving Certificate?

After a week of media and political speculation, on Friday 8th of May the Minister for Education, Joe McHugh, announced the Department of Education’s answer to the conundrum of holding the Leaving Certificate in Ireland in the midst of Covid-19. This answer is known as a ‘calculated grade’.

In order to calculate the grade, there will be two phases: a school-based phase and a national phase.

During the school-based phase, each school is asked to:

1.       Have teachers generate a percentage mark for each pupil in their classes (grade estimation)

2.       Have individual teachers rank the pupils in their Leaving Certificate classes

3.       Have teachers complete an alignment phase with colleagues in their subject area (moderation)

4.       Have the principal review the whole school cohort and sign off on the grades.

In the national phase, the Department of Education will:

1.       Apply standardisation to the data that has arrived from schools (using a State Examinations Commission bell curve)

2.       Use each school’s profile (Leaving Certificate grades from the previous three years) to calculate the grade of each student in each subject.

I have a number of issues with this, which I would like to outline, taking each of the two phases in turn.

School-based phase

I fully trust teachers in Ireland to have very detailed conversations over the coming weeks and to do their utmost for their students. They will think carefully and ensure that the estimated grade they pass on to their principals is as close to what they believe the student would have achieved as the evidence allows them to prove. I would also add that we have yet to receive instructions on exactly how this phase will work, but from the details we now have, it is clear that the Department is expecting teachers to generate grades.

In its document, A Guide to Calculated Grades, the Department says that teachers could use records of classwork and homework to justify their grades. Classwork and homework are not standardised. They are useful to teachers when they are developing a picture of particular students, but they are not real data. The Department has also asked us to take into consideration pupils’ performance in Christmas and summer exams. Yet even within a single school, teachers of the same subject do not necessarily set the same tests (though doing so is normally considered good practice), so even within a subject area there can be large discrepancies in what the ‘in-house’ examinations look like. They need not replicate the Leaving Certificate in content or format if the teacher does not think it appropriate, and teachers rarely ‘swap’ exams for moderation to make sure that the whole year group is marked evenly. In many schools, teachers will have serious difficulty coming up with a system whereby the ‘ranking’ of students in individual subject areas is in any way transparent. Even the mock examinations are not standardised. They are certainly closer to the standard than in-house tests, but they are often leaked (because only a few companies produce them) and different schools hold them on different days, so students have often seen them in advance. Teachers will therefore be up against it in coming up with an exact percentage and, crucially, in having the evidence to justify it. That evidence will be essential if appeals of the grades are to be meaningful.

Schools also have their own ways of keeping records of this data, which will be very patchy, simply because it has never before been used for anything other than informing parents, in a fairly general way, about how their child is getting on at school. Fee-paying schools will probably have very good, well-kept evidence of grades achieved in mocks and other in-house exam-style tests, because they will have been sharing them with parents over the two-year course through detailed reports and parent-teacher meetings. This naturally disadvantages the (already disadvantaged) school, which simply won’t be able to prove that certain pupils were bucking the trend and achieving well; it may not have adequate records to back it up. Many of its pupils will have poor attendance, so will have been absent for large chunks of the year or will have missed their mock exams. There will be many teachers scrambling around now for some evidence of what their pupils can do and coming up with very little. I include myself in this category. I joined my school last September, so I will be relying somewhat on data from a person I have never met, who taught my pupils last year and set in-house exams in a very different style from mine.

So the overall picture of exact attainment data at schools in Ireland is very patchy and will vary widely from school to school. I am not saying that this is necessarily a bad thing, by the way; too much data does not necessarily lead to better outcomes for students. The main reason for this lack of standardised data is that the Leaving Certificate is the standard. Any other standard applied retrospectively is always going to be imperfect and unfair compared to what should have happened. The retrospective aspect here is absolutely key, because when pupils were sitting in lessons, doing (or not doing) classwork or homework, they (and their teachers) never, ever thought that this information would be used to generate a mark for them or to decide whether or not they were going to get into college.

National phase

I also have reservations about how the national phase of this will operate. I am not a maths teacher and I am certainly not a statistician. However, I have marked for the State Examinations Commission and I understand (roughly) how Leaving Certificate papers are marked. It is a rigorous process with a lot of integrity that, and this is key, produces similar results year in, year out.

The papers are distributed to markers with a marking scheme. A marker might mark two or three hundred scripts from different schools; they are not given entire school bundles, as the packs of papers are divided up before distribution. The individual papers are then jumbled thoroughly at home by markers before marking begins, and from this a random sample of candidates is marked by each marker using the marking scheme, with the results returned centrally. Then the marking scheme is adjusted. Then everyone marks their sample again. Then the results are analysed again and the marking scheme is adjusted yet again, and so on. These tweaks in marks, or in the number of marks assigned to each question, have a big impact on pupils who are at the margins, teetering between one grade and another. It can take several tries, but the State Examinations Commission adjusts the marking scheme until it gets the results it wants: namely the bell curve. Crucially, these results are always roughly in line with previous years. It means that large numbers of students achieve around the average (in the middle of the curve) and only a few achieve very high or very low marks. This is also how the SEC is able to account for ‘harder’ papers: it simply gives the ‘easier’ questions more weight and adjusts accordingly. Whether or not you agree with the use of a bell curve for assessment purposes, it is what is used in the Leaving Certificate.
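For readers who like to see the arithmetic, the effect of that iterative fitting can be sketched in a few lines. To be clear, this is an illustrative toy, not the SEC’s actual procedure: the SEC adjusts marking schemes question by question through repeated trial marking, whereas this sketch simply rescales a cohort’s final marks so that their mean and spread match an invented historical target.

```python
# Illustrative toy only: the SEC adjusts marking schemes question by
# question; this sketch reproduces the *effect* of that process by
# rescaling a cohort's raw marks so their mean and spread match a
# fixed historical target. All numbers below are invented.
from statistics import mean, pstdev

def fit_to_curve(raw_marks, target_mean=62.0, target_sd=14.0):
    """Rescale raw marks so the cohort matches a target mean and spread."""
    m, sd = mean(raw_marks), pstdev(raw_marks)
    adjusted = [target_mean + (x - m) * (target_sd / sd) for x in raw_marks]
    # Clamp to the valid percentage range; rank order is preserved.
    return [max(0.0, min(100.0, a)) for a in adjusted]

# A 'hard' paper: everyone scores low, but relative order survives the fit.
raw = [31, 38, 40, 45, 47, 52, 55, 61, 66, 74]
curved = fit_to_curve(raw)
```

Pupils at the margins are precisely the ones this kind of rescaling nudges across a grade boundary, which is the point made above.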

What the Department of Education has decided to do this year is to apply this national bell curve retrospectively to the teachers’ estimations (mentioned above in the school phase), instead of the other way around. In order to take into account the differences in attainment between schools, it is going to use data on school profiling. I understand the reasons for doing this (it lacks any other data to help), but it is nonetheless extremely problematic.

Let’s take two (fairly extreme) examples. X is a fee-paying school. Its 100 Leaving Cert students are well off and, on the whole, do a good deal better than the national average in all subjects. It regularly sends pupils to university and third-level colleges, and very few of its students fail. Y, on the other hand, is a DEIS school. Its 100 students are already disadvantaged for a large number of reasons that I cannot go into at length here (socio-economic deprivation, lack of cultural capital and so on). This school is likely to have pupils, year on year, who do worse than the national average in all subjects, and quite a few who fail. It is extremely unlikely that many will go to university, though, importantly, some will.

School profiling takes each school’s context into account because the two clearly have completely different cohorts; completely different starting points, if you like. I can understand that, in the absence of anything else, it is important to take these differences into account; the Department of Education cannot simply ignore a school’s context. It does so in the following way: it takes the data from the Leaving Certificate results of the last three years at school X and school Y and applies it to the estimated grades each school has supplied. This then feeds into the overall national bell curve figures. Obviously, though, this is the reverse of what normally happens.

However, the crux of the problem is that the cohort in each school is small, so applying the curve retrospectively means that any pupil in school Y who was going to defy the odds and score top grades is much less likely to do so in 2020 than if they had had the chance to sit the exam. Some proponents of school profiling say ‘well, it’s fair to most pupils’, but the key point is that it is not fair to all pupils, especially those who are in any way bucking the trend in a system of schooling that allows school X and school Y to exist alongside each other. In other words, it is most unfair to pupils who are already disadvantaged. It is not only perpetuating the inequalities that exist in education; it is compounding them.
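The effect on an outlier can be made concrete with a toy model. This is a hypothetical blend, not the Department’s published algorithm (the detail of which had not been released): the `profile_adjust` function, the 0.5 weight and every mark below are invented purely to show why a historical school profile drags an exceptional pupil’s grade down.

```python
# Hypothetical sketch, NOT the Department's published model: a crude
# 'school profile' adjustment that blends each teacher-estimated mark
# with the mean of the school's last three years of results.
def profile_adjust(estimated, historical_mean, weight=0.5):
    """Blend teacher estimates with a school's historical average mark."""
    return [weight * e + (1 - weight) * historical_mean for e in estimated]

# School Y (historically low-attaining) has one pupil estimated at 90.
school_y = profile_adjust([35, 42, 48, 90], historical_mean=45)
# 0.5 * 90 + 0.5 * 45 = 67.5: the outlier is pulled far below the estimate.
```

Under any weighting like this, the pupil bucking the trend loses the most, while marks near the school’s historical average barely move.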

I am very concerned about the future of the Leaving Certificate in light of the problems the Covid-19 pandemic has thrown up. I am very worried that this crisis will lead us to create an educational data machine akin to the one I used to work in in England. This is the ultimate nightmare scenario, where teachers lose autonomy and spend huge amounts of time assessing pupils and managing data; time not spent on planning and delivering good lessons, which is the key to improving educational outcomes for all. We will certainly be in a bind about this one, because if this crisis points to anything, it points to a complete lack of the educational and statistical data we would need in order to replace the Leaving Certificate with anything meaningful. The reason England and Wales were able to use predicted or estimated grades at a day’s notice (even though these too will be massively flawed) is that they assess children to kingdom come, and every piece of work pupils do in school is tracked and traced, so there is evidence for whatever they decide to estimate or calculate.

The Leaving Certificate is a high stakes assessment and an awful lot depends on the outcome. To my mind, it is the only leveller in education in Ireland. When you take it away, you are left with this imperfect mess which has absolutely no integrity.  
