I was pretty much 50/50 about whether to join the Chartered College. I’d read the blogs and watched the inevitable Twitter flame wars and wasn’t convinced. But it came right at a time when I thought teacher professionalism was on its way out, and the offer of access to research papers was very tempting. In the end, my ego got the better of me. The idea of being able to refer to myself as a “Chartered Teacher” or even a “Master Teacher” just sounded too exquisite to pass up. So I joined.

I couldn’t go to the big London launch because we had a newborn knocking around and it just wasn’t the right time. I would’ve loved the chance to meet some of my tweeps and to hear Rob Coe speak, but ended up pretty grateful I didn’t have to massage any strangers or put on my karaoke best. So I’m a member, but a sceptical one.

I was quite excited to receive the “interim” edition of Impact, the College’s journal/magazine. Straight out of the pack it looked and felt absolutely gorgeous; whoever did the graphics, typesetting and general aesthetics has done a really excellent job. I also reckon I belong to quite a large section of their potential readership: frontline teachers with a dozen classes, rammed timetables and a frantic work/life balance, but who are interested in improving their practice and engaging with the research. So I had a little read and below are a few of my thoughts.

This is no “prog” takeover

When the CC was first floated, there were those in the edutwittersphere who were concerned that it would just become another haven for progressive educationalists, academics, pedagogues and consultants. Whether or not that was a valid concern at the time, this magazine does not indicate that such a takeover has occurred. The articles were all balanced, scholarly and rigorous to these untrained eyes. Certainly none of them pushed any of the all-too-familiar tropes that we are used to. A couple of articles flirted with ideas I wouldn’t naturally identify with, but I wouldn’t expect the journal only to promote ideas I agree with, and there was nothing too in-your-face or evangelical. There were also a number of positive references to ResearchEd. All this, to my mind, is a good thing.


There are 18 articles in the magazine, with 11 of them addressing the general question of research’s interactions with schools and the final 7 being specific cases from the research. The first 11 were supposed to be split into two groups but I didn’t really see the distinction between them and would probably lump them together. I found the final section the most interesting as it dealt with day-to-day questions like autism awareness, the testing effect and urban legends in education.

However, I did feel that some of the language and discussion was over-complicated. We are told of:

  1. Evidence-informed education
  2. Evidence-based education
  3. Evidence-enriched education
  4. Implementation intention theory
  5. Evidence-rich collaborative enquiry
  6. Research engagement
  7. Knowledge-based communities
  8. Knowledge-based reasoning
  9. Research-literate teachers
  10. Research capital
  11. Actively inquiring schools
  12. Cycles of inquiry
  13. Research learning communities
  14. Networked learning communities
  15. Professional learning communities

And so on. This is probably a problem with me, not the journal, but as an ordinary punter I just felt there was a lot of jargon deployed to address what, to my mind, are fairly simple questions: how do teachers get hold of solid and reliable ideas and transform them into classroom practice? Can you give me some examples of research which can influence what I do in my classroom?

The Fundamental Question

The first 11 articles all dealt with the same question. We have research in a number of different forms: how do we turn that into classroom practice and how do we assess whether or not it has worked?

The problem was posed best in James Mannion’s article, where he drew on the example of the EEF’s toolkit entry for “feedback”, which states an impact of +8 months. He used various meta-analyses to show that a third of the research studies on feedback actually find that feedback interventions have a negative impact. And yet in schools across the country there is no acknowledgement of this fact, as SLT members seek to use the headline figure to justify whatever policy they happened to have dreamed up that morning.

Unfortunately, I have major issues with the solution that he, and many others in the journal, propose. They argue that groups of teachers should take the research, develop systems to implement a programme, and then evaluate whether it works in their school. I just don’t see how that makes sense or can be reliable at all. In fact, to me it falls foul of Rob Coe’s own instructions on how to make it look like your improvement project has worked. Schools are among the least controlled environments in the world, with a huge number of confounding variables operating every second of every day.

A friend of mine on Facebook recently pointed to some evidence that starting the school day later had an impact on student attendance and punctuality in schools in some state over the pond. Without getting into the nitty-gritty (it’s complicated), I pointed out to him that if I were a school leader implementing a late-start policy with the aim of improving punctuality and attendance, it would be accompanied by whole-school assemblies about the importance of both, tutor talks re-emphasising that message, some system of rewards and punishments, parental engagement, and so on. So what was it that caused the positive outcomes? Who knows? It could have been anything.

I don’t really have a good answer to the question. If forced, I would probably say that the research (including RCTs, whose praises are sung by Sir Kevan Collins in the magazine) gives us a best bet: what will work most of the time for most of our students? Any more than that strikes me as guesswork.

Perhaps I’m wrong. Perhaps someone will do some meta-research which shows whether or not schools’ evaluations of their research-based interventions are actually accurate. But I’m not holding my breath.