When evidence doesn’t work

There is a particular type with whom it is very difficult to discuss education. I will call him “rational man”. Often a science teacher, rational man recoils on hearing any strong opinion and instantly demands, “What evidence do you have to back that up?” When challenged on his beliefs, rational man presents himself as a superior form – the unideological pragmatist – and declares, “I believe in evidence. I believe in what works”. To rational man, I would like to say, evidence does not always work.

On Saturday, I attended the ResearchED conference at Dulwich College. Organised in six months and inspired by a Twitter conversation, it was a remarkable event and by far the most energetic education conference I have attended. However, throughout the day I detected a worrying tendency, encapsulated in the conference tag-line “working out what works”. Some education research enthusiasts seem to believe that evidence can overcome opinion. It cannot. It can inform opinion. It can promote or discredit opinion. But evidence will never be opinion.

Fine surroundings for ResearchED

I should qualify this: I am not an anti-enlightenment dinosaur. I like evidence. I respect evidence. I sometimes even use evidence in my articles. Tom Bennett, who put together ResearchED, is entirely right that better use of evidence is needed to show up bogus educational fads such as VAK, learning styles and Brain Gym©. However, evidence has its limits. Perhaps unfashionably, I believe that some of the most important debates in education will never be settled by evidence.

Firstly, the measurable outcomes in education are complex. The physician Ben Goldacre, one of the ResearchED organisers, is evangelical about applying Randomised Controlled Trials (RCTs) to educational research. For RCTs in medicine, normally used to trial drugs, there is a clear outcome: either the patient gets better or they do not. Such clear outcomes can exist in education. Debates over how to teach literacy were greatly aided by the 2005 research of Johnston and Watson, which demonstrated the superiority of Systematised Synthetic Phonics over other primary reading programmes. This worked because reading and spelling levels are very easy outcomes to measure. But what about deciding on the outcome in the first place? Can an RCT tell us, for example, whether secondary school pupils benefit from studying Shakespeare? For us in education, the outcome is rarely clear. In fact, the outcome is the subject of our most impassioned debates.

School improvement can be broadly measured by exam performance, but this can lead to a bleakly instrumentalist view of schooling. Exam results are important, but they are not the sole purpose of school. At a talk delivered by Professor Coe of Durham University, I was shown the slide below. The graph takes the Education Endowment Foundation’s (EEF) meta-analysis of 33 educational ‘tools’ and plots the average cost of each against the evidence that it aids academic attainment. On this measure, ‘behaviour interventions’ are classified as ‘may be worth it’, and ‘after school programmes’ as ‘not worth it’.

How do you define ‘worth it’?

Has it occurred to the EEF that some aspects of school life may be ‘worth it’, even if they do not improve academic outcomes? After-school activities can enrich a pupil’s life, uncover talents, and build character – a principle that schools such as Dulwich College, our venue, hold dear. These outcomes are no less important because they are hard to measure. Or take behaviour. I believe that schools should inculcate in pupils the importance of habits such as not swearing at adults, not bullying their peers, putting litter in bins and holding open doors. I do not think that pupils should acquire such habits because they will help them in their GCSEs. They should acquire such habits because they are normative goods.

Rational man believes that he can make his way in the world without recourse to the murky business of ideology and morality, or to use a more contemporary term, ‘values’. However, we can never dispense with values. If you do not agree, consider this hypothetical case. The Institute of Education runs an RCT across 200 schools into the use of caning, and conclusively proves that caning in schools improves exam performance. Would you be convinced of its merits? My hope is that you would not.

What if we worked out that the cane works?

The caning example may seem hysterical, but the dilemma it illustrates is not unusual. To resolve many core debates in education, we first have to decide what we value. Should schools predominantly teach British history? Should pupils memorise poems? Should the school day begin with an act of collective worship? Should pupils wear blazers and ties? This is why it was such a breath of fresh air to hear Frank Furedi introduce himself as the “antichrist at the last supper”. He declared that ‘scientism’ is endemic in education, and rightly observed that it is almost unheard of in education debates for people to appeal to the “evaluative, normative language of ethics or morality”. I would not go so far as Furedi in calling myself an ‘educational pluralist’, but I would agree with his claim that education research is too often an unnecessary distraction.

Beware of the ‘Rationalist Delusion’

It was terrific to hear former Gove adviser, Teach First employee and all-round brainbox Sam Freedman refer to the psychologists Jonathan Haidt and Daniel Kahneman in his talk on “Evidence-based Policy Making”. Both psychologists have written about the dominance of the subconscious, emotional part of our minds over the logical, conscious part. In his book The Righteous Mind, Haidt writes of the “rationalist delusion” that human beings can be swayed by logic alone, concluding that the first principle of moral psychology is that “intuitions come first, strategic reasoning second”. Similarly, Kahneman, winner of the Nobel Prize in Economics, has spent a large part of his career building evidence to show that evidence, paradoxically, rarely influences personal judgement. One of the most sensible things that I heard all day came from an audience member during Freedman’s talk. She said, “I don’t want evidence-based policy. I want evidence-informed policy. Basic values are important.”

ResearchED was a fantastic, thought-provoking event, and the calibre of discussion and presentations was remarkable. However, I was occasionally alarmed by the evidence-zealotry displayed by some of those present. Before working out what works, we need to work out what we want. To answer those questions, we cannot abandon our human faculties of judgement for the false gods of educational research. As David Hume wrote in 1739, “reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them.” Hume had no evidence to make such a statement. He just happened to be right.

~ by goodbyemisterhunter on September 9, 2013.

12 Responses to “When evidence doesn’t work”

  1. Reblogged this on The Echo Chamber.

  2. […] Matthew Hunter: When Evidence Doesn’t Work […]

  3. I like the post. One small point: to be fair to Coe and the EEF, he did make your point during his talk, namely that there may be various other considerations surrounding decisions such as what size to make classes, besides just educational attainment. My impression was that their philosophy is that whether you are evidence-based or evidence-informed, you need the evidence, and that is what they provide.

  4. I like this post too. In fact, I don’t think there is anything here I disagree with.

    I certainly agree that we can’t ask ‘what works’ until we are clear (and ideally agree) how we define ‘works’. What is the purpose of education? What outcomes do we value? How would we define what counts as success or what is effective? In fact, we have to be able to do more than just define it: we have to be able to operationalise it, to measure it in a way that is valid and acceptable. And it is undoubtedly true that much that is really important in education cannot be measured in this way.

    But I also think that a bad mistake we could make would be to overemphasise the amount of disagreement or complexity here. There are some core educational aims (eg being able to read) that are uncontroversial and can be satisfactorily measured. If there are real choices about different ways we might achieve them, shouldn’t we be guided by the best evidence about which ways are most likely to do this? Of course there may be a danger of falling into ‘scientism’ here, but the alternative seems much more dangerous to me.

    You ask “How do you define ‘worth it’?” in relation to my slide. I should have made it clearer in my talk that for the purposes of that slide (and the EEF Toolkit generally), ‘worth it’ means ‘likely to be a good choice for raising attainment in basic skills’. EEF has been pretty clear (I think) that the Toolkit is limited to these outcomes. Whether that is the right decision is of course a value judgment and should be debated. I think it is broadly right, but I also think we have to be careful not to shut down the complexities that arise here. For example, the evidence shows (pretty convincingly I think) that reducing class size has only a small effect on learning. But it may have much bigger effects on other things (teacher stress, quality of social interactions, parents’ satisfaction, etc), and these could make it worth doing overall. My view is that we should ultimately try to quantify these benefits too, and then have a rational discussion about how different choices optimise what we value most.

    One other point I’d like to make is prompted by your caning example. An RCT is the best method we have to estimate the likely consequences of making specific choices. But even when it is appropriate, it isn’t perfect and there are plenty of situations when an RCT is not appropriate. One condition, for example, is that we should evaluate a ‘strategy’ only if it is a choice that we are genuinely considering adopting. If something is morally repugnant then there is absolutely no point in evaluating it: we wouldn’t do it, however ‘effective’ it was.

  5. The beauty of evidence is that you don’t have to accept it, but in reasoning your objections you are drawn into useful thinking about the problem.

  6. […] afraid I still believe that would be a good thing to know (though obviously not do). As Matthew Hunter has said before, education research isn’t going to tell us what to value. But if we know that the quickest way to learning is through violence, then we must face that and […]

  7. You may change your mind about ‘the superiority of systematised synthetic phonics over other Primary reading programmes’ when you have read this paper about the Clackmannanshire research:
    http://t.co/QdYVPTopuY

  8. […] this is a valid outcome, or one that is shared by all practitioners, is explored by Robert Peal here, but for the purposes of this post I’m going to assume that everyone is on board with […]

  9. What works for you, may not work for me.

    What works for me, may not work for you.

    Every school is unique. Every single one.

  10. […] have written about the problems with such a stance in my post ‘When evidence doesn’t work’. Firstly, some of the key debates in education are based on value judgements, not efficacy. What […]

  11. […] refers back to a previous post entitled ‘When evidence doesn’t work’ summarising several sessions at the ResearchED conference held at Dulwich College last year. He […]

  12. […] a question such as this one exposes the limits of evidence. Robert Peal has written eloquently about the way in which evidence can inform opinion, but it cannot replace it. Latin is a case in […]
