A Chemical Orthodoxy

Schools, Science and Education

Thinking Curriculum: The One Stop Shop

Thinking deeply about curriculum is new to most of us. For a long time, we’ve focussed a lot more on the how than we have on the what. Recent changes in mood have been revelatory to me and, I imagine, many others. Perhaps ironically though, most of us who are now interested in curriculum didn’t follow a formal curriculum when learning more about curriculum. As such, and I’m happy to only speak for myself here, my knowledge came in drips and drabs, bits and pieces and stops and starts. That’s probably just the nature of the beast.

I was asked by my school to deliver some training on curriculum, and argued that the thing that would be most useful would be to introduce staff to some of the key terms that are bandied around when thinking about curriculum. Familiarity with these concepts isn’t just important in and of itself, but is crucial for deepening and enriching curricular thought. Rolling these ideas around in your head forces you to contemplate your subject in a slightly different light and spurs you on to delve deeper into what it all means.

I remembered Ruth Walker’s marvelous blog from the Curriculum In Science symposium. In it, she introduced us to a number of terms and how they might find application in science. Essentially, I wanted to copy and build on her ideas based on some of the stuff I’ve been reading over the past couple of years. A document started to take shape, which I’ve called The One Stop Shop. It contains 22 curricular terms that I have personally found very useful in shaping my thinking about curriculum. Each term has a definition, a couple of examples from a range of subjects, some thoughts about how the term is useful to teachers on a day-to-day basis, and a section with further thoughts and extra reading.

I’ve also put together a short slideshow which just highlights a few of the highest leverage concepts. The slideshow is designed to accompany the main document and to be used when introducing people to thinking about curriculum.

At the end of the slideshow, there is a slide with some provisos for use, which I wanted to reproduce here:

  • This is not for Ofsted, this is because it is important
  • My list is my interpretation, it is not objective fact
  • “One stop shop” notwithstanding, there is much more out there!

Please do bear all that in mind, and if you have any questions or contentions do not hesitate to be in touch. As ever, enormous amounts of gratitude to the wonderful thinkers and writers (most of whom are linked in The One Stop Shop) who have been so pioneering and inspiring. Enjoy!

Thinking Curriculum – The One Stop Shop v2

Thinking Curriculum – The One Stop Shop – powerpoint

Curriculum don’ts

Everyone’s talking about curriculum these days, and this is a Good Thing. As I’ve argued before, we’ve spent too long talking about generic “teaching and learning” or “pedagogy,” without the realisation that content must precede delivery. There is no point talking about the how without first establishing the what.

However, there are two problems with the discourse as it is playing out: problems that are distinct, but also highly related. The first is that a lot of the change is coming from Ofsted’s new Education Inspection Framework. This isn’t a problem intrinsically, but it’s no state secret that for a long time schools and the CPD industries that spring up around them have been using fear of Ofsted to drive changes, and often bad ones. Sometimes the fear was legitimate, sometimes it wasn’t. But institutional memories live long and we are now in a situation where people are starting to think about their curriculum through that narrow lens of “what do Ofsted want?” rather than “what is the right thing for my school?”

The second problem is our lack of training and vocabulary. There is a wealth of deep curricular thinking out there and, though it has only ever been a minor thread in teacher education and training, it is making a resurgence. But when it comes to implementing changes (in the aforementioned shadow of Ofsted), a lot of the time we lack the conceptual tools not only to express and communicate our ideas but to actually shape them.

Unfortunately, it already seems like people are making mistakes when it comes to curriculum and responding to Ofsted. Following Claire Stoneman’s wonderful blog on curriculum clangers here, I wanted to expand on her list based on some things I have seen, with headlines and explanations below. Buckle up.

Curriculum Don’ts

1. No statements of curriculum intent

I’ve now seen a couple of dozen people online looking for help writing statements of curriculum intent. This is not the point. Michael Tidd’s excellent article argues that any such statement is likely to be vacuous and banal. This is because the intent of your curriculum resides to a much greater extent in your schemes of work than it does in any vague and platitudinous opening statement.

Where I think there is a bit of nuance is the suggestion that thinking about your intent might be a sensible precursor to planning.

A response to that of course is that schools that haven’t been thinking about curriculum or paying attention to the changes in the educational zeitgeist are unlikely to be able to think particularly deeply or with much sophistication about their intent. It might seem logical to think about intent before you start planning, but I think it’s more reasonable to actually go out and read stuff on curriculum first. If you think “oh ok I’ve written a statement about my curriculum intent: box ticked, job done!” you are wrong. You need to go read stuff and build the necessary conceptual frameworks to have meaningful conversations about intent. Once you’ve done that, if you really feel like a document of intent will help you then go for it. But if you’re doing it because something something Ofsted, you should stop.

Oh, and if the above wasn’t enough, Ofsted don’t even count documents like “statements of curricular intent” as a source of evidence in inspections.

2. No forcing subjects to conform to whole school curricular aims like “big ideas” or soft skills

Big ideas might be incredibly useful in history, English, psychology or whatever. I am personally not of the opinion that they are helpful in science. Some science teachers disagree, and that’s fine. But our subjects and our interpretations of them are different, fundamentally so. It doesn’t make sense to have all the different subjects in your school follow the same philosophy, as their very essences differ in the way they approach truth, reality, the generation of knowledge and meaningful experience.

I’ve seen some people talking about whole school curricular aims in a general sense like “creativity” or “critical thinking” or whatever. That’s great, but ultimately it isn’t going to be substantive, not because these soft skills are incredibly context dependent (which they are), but because they mean different things in different subjects. The creativity of a scientist is fundamentally different to the creativity of a writer or a geographer or an artist. Umbrella terms fail to capture this crucial distinction, and should therefore be avoided.

3. No assessing/evaluating curriculum without deep and extensive conversation with subject experts

Subject experts are kings of their curriculum. If you aren’t a subject expert, it’s going to be damn hard for you to really get to grips with what it is they are doing. If you’re a senior leader then you must make sure that you trust your people and hear them out properly. They will be the ones who can explain the reasoning behind the decisions made and without them your thoughts will be extremely superficial. Without their guidance, you’ll end up focussing on meaningless surface details like curriculum intent statements or whether the documentation contains buzzwords like knowledge, metacognition, resilience or 21st century skills.

4. No snapshot lesson observations to assess curriculum implementation

Curriculum plays out over the long term, and you will only ever be able to see a small part of it. If in your observation cycle (if you have one) you get lots of time to see people, plus pre- and post-observation conversations, so much the better. But if you don’t get that, be incredibly humble about what you can see and what you can infer. One 20 minute observation a term isn’t going to cut it. You’re better off dropping in regularly, and not just to one teacher. If you line manage science, get into as many science lessons with as many science teachers as possible. That’s the only way you’ll really get a feel for the quality of the curriculum a department is delivering.

5. No use of GCSE grades/sub-levels as progression models

I can’t stress enough how wrong this is. GCSE grades aren’t a progression model. Students aren’t supposed to progress from a grade 3 in year 7 to a grade 4 in year 8 and so on. The grade isn’t the progression model, the curriculum is the progression model. I’ve written before about this here, but you can also see this:

Or this, or this or this. Using GCSE grades as a progression model is not just silly (as that isn’t what they are for) but it warps your school’s practice and turns it into a factory aimed at delivering exam results, rather than delivering a beautiful curriculum.

6. No curricular omissions for different “groups” (e.g. PP, SEN, LPAs, M/F)

This has just got to die. You need to be rigorous and ambitious for all your students, and not allow the soft bigotry of low expectations to exclude students from the intellectual inheritance that is theirs no matter what their personal circumstances or key stage 2 scores. I’d recommend Ruth Walker’s now seminal E. Coli blog for more on this.

Shaun Allison and Andy Tharby also put this well in Making Every Lesson Count, in a sadly all too familiar story:

Alternatively, catch Bart Simpson’s take on this here.

To be sure, there will be some contexts where breaking this rule is justified (like special schools). But those cases will be the tiniest minority and your general rule should be a powerful and ambitious curriculum for all.

7. No using the word “knowledge” to repackage old ideas (e.g. “knowledge checks” to replace mini-plenaries)

This is one of Claire’s ones, so check out her elaboration. An extension to this of course is to be mistrustful of any CPD provider who is selling you the same product as they were selling you last year, just with a few different words. Which leads me on to…

8. No shelling out the big bucks for dubious CPD

If your inbox is anything like mine, then you’ve been inundated with emails selling CPD on curriculum. Obviously telling people to “be discerning” isn’t particularly helpful, especially if we are dealing with something that people don’t already know a lot about. It’s hard to give any simple rules of thumb, but the below could be loose indicators:

  • If the provider is contradicting any of the don’ts on this list, then that’s a bit of a red flag
  • Mentioning Ofsted a million times is a red flag
  • Providers selling platforms (especially ICT ones) which are just rebranded versions of their old ones are a red flag
  • Research the presenters. If they’ve never written anything about curriculum then you don’t really have any reasonable grounds to ascertain whether this CPD will be worthwhile. If they have, then you can judge whether it’s the kind of thing you are looking for.
  • See point 9…

9. No making changes until you know your stuff

Getting clued up on curriculum should be an utterly fascinating journey into the roots of the knowledge and understanding that comprise the very fabric both of your subject and the education your school provides. There are lots of incisive and thought-provoking writers out there, and as well as all the links above you can find further reading here, here and here. Go forth and work curriculum wonders!

Curriculum in Science: A Symposium

I started teaching in the Gove years: a time of enormous curriculum upheaval with all three secondary key stages seeing major changes in a bid for increased rigour, higher standards and improved performance on international assessments. In recent months, Her Majesty’s Chief Inspector Amanda Spielman has continued to increase the public awareness of curriculum matters through speeches and policy and, if the education commentariat are to be believed, “curriculum” is the buzz-word of the moment.

But curriculum is a bit of a chameleon word. To some, it means the substance of what is to be taught in a given subject or topic. To others, it represents the broad offer of content presented to students at a particular institution. Many will intertwine it with pedagogy and include teaching resources and approaches within their discussion of curriculum. Others see it through a political and sociological lens, a topic to be studied in terms of its ramifications for society and its status as a support for political authority and hierarchy. Still others see it simply as the exam specification: the things my students need to know and understand to pass an exam.

To be fair, it is not a word which lends itself particularly well to a single definition. Only coming into extensive use at the start of the last century, it is apparently descended from the Latin currere, which relates to running and dynamism, and was often used to describe the “careering” of a chariot.

[Graph showing the increased use of the word “curriculum” with time]

This “careering” of the chariot indicates dynamism, movement and journeying. I therefore like to think of curriculum as the body through which a student moves on their journey from being a disciplinary novice to becoming a disciplinary expert. Their journey might “career;” it may stutter and start and appear disjointed and fragmented. But, from the wide angle of time, forward movement is occurring.

I think this is also what is meant by the word in its recent public appearances. What comprises the key content of a particular discipline? How are its ideas structured and related? Which concepts must be included and which can be optional? These are difficult questions, and many of us do not have the training to adequately address them. For a long time the educational focus has been not on curriculum, but on pedagogy. Curriculum was set centrally by the government, and it was teachers’ role to implement it. But in recent days political freedoms and changing accountabilities have enticed schools to take a more active role in their curriculum, not just in terms of understanding its structure and purpose, but in taking a role in its fashioning. Individual schools or multi-academy trusts will never have full control over their curriculum, but there has been an “awakening”: a movement towards teachers having a different, more dynamic, relationship with the curriculum.

I consider myself to be utterly untrained in this area. Despite having delivered close to ten different curriculums in my few years as a teacher, I still lack the technical knowledge and disciplinary jargon to discuss curriculum in a meaningful or scholarly manner. With respect to the study of curriculum, I am a novice. I also suspect I am not alone. With few exceptions, curriculum has not been a focus of initial teacher training, Masters programmes or leadership qualification courses. But with the gaze of Ofsted fixing itself on this area, schools will no doubt go into overdrive to find solutions.

With such widespread activity, the spectre of genericism looms large. No doubt, school leaders will be sent to one-day courses on the topic of “curriculum” and will become the schools’ “curriculum leads.” In a bid to apply partial knowledge and skill to a whole school in need of consistency, measures may be imposed across the board. We might see departments being told that their curriculum must be modelled against Bloom’s taxonomy, or core school values. Departments could be instructed to use their curriculum to support whole-school values and emphases like resilience or growth mind-set. The “cross-curricular” fad will rouse from hibernation and teachers will have to shoehorn geography into drama, and chemistry into art. School proformas laden with catchy buzzwords will be used to write “knowledge organisers” for each subject, with the same proformas used in science, maths, English and history. Imposition, imposition, imposition.

Such activity does violence to the very item it seeks to promote. For curriculums are unique and distinctive, based on the discipline they represent. The way that knowledge is structured, organised and sequenced in science is radically different to how it is structured, organised and sequenced in maths, English or history. And this is to say nothing of the disciplinary rules that underpin this knowledge. How is knowledge generated in science? How is it generated in maths? What epistemological rules are used to say “yes, this knowledge counts and can be admitted, but this knowledge does not, and must be dismissed”? Genericism threatens the rich disciplinary thinking that the focus on curriculum is supposed to promote.

This symposium seeks to anticipate and pre-empt such genericism. Following our AfL in Science Symposium I am incredibly excited to be hosting a running series tackling the knottiest and thorniest of issues within UK Science education. Our contributors range from frontline teachers, through Initial Teacher Trainers, to education academics. We are also delighted that Ofsted themselves will be contributing towards our discussion. Over the coming weeks, a new article will be published every few days, often trying to build on and develop the ideas introduced previously, concluding with a summary and remarks from Christine Counsell. We hope that interested teachers of all subjects (though especially science) find the articles to be meaningful and helpful, and look forward to the ensuing discussion.

Full list of contributors: Rosalind Walker, Tim Oates, Jasper Green, Pritesh Raichura, Niki Kaiser, Gethyn Jones, Matt Perks, Andrew Carroll, Alan Passingham (Ofsted), Christine Counsell 

Read Ruth’s piece here on foundational vocabulary for discussing curriculum

Read Gethyn’s piece here on how to build a curriculum with misconceptions in mind

Read Jasper’s piece here on the big ideas of science

Read Tim Oates’s piece here on the new practical endorsement and its history

Read Pritesh’s piece here on what material should go into a science curriculum and why

Read Matt Perks’ piece here on designing a subject knowledge enhancement curriculum

Read Niki Kaiser’s piece here on a memorable curriculum

Read Andrew Carroll’s piece here on constructing a PGCE science curriculum

Read my subsequent pieces on Core and Hinterland here, and sequencing a curriculum here

Read Ofsted’s short findings paper into primary science curriculum here

KS3 Science Curriculum: Some Tentative Thoughts

Last night on Twitter I reached out to hear people’s thoughts on how to go about writing a KS3 science curriculum bearing cognitive science in mind (1). I got some really interesting responses from people who are a) more knowledgeable than I am and b) more experienced than I am. I wanted to get down some tentative thoughts here, not because I think they will be exhaustive at all, but because it helps me to frame the debate. All of the things I am going to mention below could be the subjects of longer blogs (and maybe one day will be).

The Problem With KS3 Curricula

  1. Since the SATs were abolished, in many places there has been limited emphasis on KS3 science. In an era of high stakes accountability this is understandable
  2. The proliferation of KS3 bought-in schemes of work has enabled teachers to take the back seat when it comes to science curriculum planning
  3. In conjunction with points 1 and 2, there are now thousands of different KS3 science resources floating around, all related to different schemes of work and different curricula, with different emphases. This has led to a very laissez-faire approach to what is actually taught
  4. KS3 assessments tend to be very poor (in the four or so schemes of work I have used): they do not focus on the actual science that a student needs to know, focusing instead on the scientific process. This means they are not a reliable or valid indicator of what the student actually knows. Add to this that some of the questions are just appallingly worded and based on an obsolete level-descriptor model.
  5. A lot of science teachers I speak to from across the country are struggling to deliver the new GCSE syllabus: it is more voluminous and conceptually demanding than any that came before. It is imperative that KS3 prepares students appropriately.
  6. Building on that, it is incredibly distressing to me that students appear to reach KS4 with highly sparse background knowledge. This could be a function of both curriculum and teaching, but either way it needs to be addressed.

The Big Ideas of Science: What’s the Big Idea?

A number of people referenced this ASE document which details the Big Ideas of Science and says that all teaching should be in reference to them. There’s a lot of language in there that doesn’t really suit me (2) but I wanted to unpick a bit of the debate surrounding them last night (and today).

  1. What is the purpose of the Big Idea?
    1. The Big Idea could be there to kind of tie everything together so we don’t have a list of “disconnected facts”
    2. Is this a “thing in and of itself”? I.e. it should not be thought of in reference to its utility to some other goal (e.g. helping students to learn and systematise knowledge) but is important in its own right.
    3. Are these more of a planning tool than a teaching tool?
    4. It could also be there because we think it is fundamental to understanding new science when it is presented to us, so that we can fit it into a pre-organised and arranged system. Which leads me on to…
  2. Are these the same as “schema”?
    1. If they are, then that has implications for how we go about teaching them. As I have written before, the lack of reference to cognitive science in academic science education writing is incredibly frustrating to me.
  3. Can you teach Big Ideas without reference to their “application”?
    1. I had a great discussion with Helen Rogerson and Helen Harden about this. Take for example statement BI “all the material in the universe is made of very small particles.” Presumably an “application” of that would be statement A “atoms are very small particles, which are, in turn, made of even smaller particles”

      In what order do you teach these BI and A? Do you start with BI and then give examples like A? Is it possible to understand BI without examples? But then what if you just teach A and then BI? Have you properly shown students that science is a coherent system with overarching principles?

      Would a compromise approach be to start with a lot of examples and then halfway through your course start introducing big ideas and then continually reference them?

  4. Who decides what these Big Ideas are?
    1. I haven’t read the document cover to cover but it gives a fairly extensive justification of each of the ideas
    2. Unfortunately in their panel, actual classroom teachers are massively under-represented with only a couple of the “senior” members ever having spent time in a classroom. No current teachers. This shouldn’t necessarily take away from the substance of the report but still irks me.
    3. Some of the principles seem a little “unweighty.” Take for example “objects can affect other objects at a distance.” Yes, that’s true, but it’s hardly on par with the conservation of matter, which is not included in the list. It seems like someone took a list of topics like magnetism, gravity and radiation, figured out that they all involve objects affecting other objects at a distance, and categorised them based on that.
  5. Is there any systematic evidence to suggest that students who are taught using the big ideas are more effective scientists? If not then why should we pay them any heed?

Building a curriculum for knowledge

Something I suggested was to write a list of 600 questions that you want students to be able to answer by the end of KS3. You can then build your scheme around those questions, constantly referring to them and using knowledge organisers or some other method to engage in spaced and interleaved retrieval practice. You can use the Toby French approach to obtain a high success rate with your students and have them knowing a lot of fundamental science by the time they are done.
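To make that concrete, here is a minimal sketch of how a bank of core questions might be sampled so that every quiz mixes current material with earlier material (spaced and interleaved retrieval). It is my own illustration, not a tool referred to in this post; the topics, weighting and question wording are invented for the example.

```python
import random

# A hypothetical core question bank: (topic, question, answer) triples.
QUESTION_BANK = [
    ("Cells", "What is the function of the ribosomes?", "Where protein synthesis takes place"),
    ("Cells", "What is the function of the mitochondria?", "Where aerobic respiration takes place"),
    ("Particles", "What happens to particles when a solid melts?", "They gain energy and break out of their fixed arrangement"),
    ("Forces", "What is the unit of force?", "The newton (N)"),
]

def build_quiz(bank, current_topic, n_questions=10, current_fraction=0.6):
    """Mix questions from the current topic with questions from earlier
    topics, so every quiz involves some spaced, interleaved retrieval."""
    current = [q for q in bank if q[0] == current_topic]
    earlier = [q for q in bank if q[0] != current_topic]
    n_current = min(len(current), round(n_questions * current_fraction))
    n_earlier = min(len(earlier), n_questions - n_current)
    quiz = random.sample(current, n_current) + random.sample(earlier, n_earlier)
    random.shuffle(quiz)  # interleave rather than blocking by topic
    return quiz

if __name__ == "__main__":
    for topic, question, answer in build_quiz(QUESTION_BANK, "Cells", n_questions=4):
        print(f"[{topic}] {question}")
```

In practice a spreadsheet does the same job; the point is simply that each quiz should deliberately reach back into earlier topics rather than only testing the current one.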

The obvious question is how do you decide the questions? This could be mediated by:

  1. Core knowledge that people need to access society (Hirschian approach?)
  2. Core knowledge that people need to make informed scientific decisions (e.g. vaccines, climate change etc.)
  3. Core knowledge that students need to succeed in KS4 (and make the KS4 teacher’s life easier)
  4. Things that are “awesome” and “wonderful”

There’s also the possibility of throwing the “how science works” stuff into the mix. My feelings on that are pretty strong, but will have to be a blog for another time.

Misconceptions

Common misconceptions need to be identified and thought about in planning. This is a very tricky area and I’m currently researching more about how misconceptions arise (in evolutionary psychological terms) and whether they can be overwritten or whether the best we can hope for is suppression.

History of science

George Pidgeon pointed out that the history of science is important too in terms of the way our ideas have developed. In my opinion this is important for a number of reasons:

  1. It gives students a coherent narrative. Perhaps it could even replace the “Big Ideas”
  2. It can pre-empt misconceptions: for every daft idea you hear in the classroom, there will be a towering figure from the history of science who believed it was true
  3. It gives a more nuanced view of “scientific enquiry” than the usual “let’s plan a practical!” approach

Ease of delivery/preparation

There is also the rather pedestrian concern of actual delivery. Any curriculum would need to be deliverable by all teachers and properly resourced and assessed (though assessment is decidedly less pedestrian).

Compromise position

In short, there is much to think about. I don’t yet know what my dream KS3 curriculum would be, but this is a good starting point. Presumably there would have to be a compromise at some point – you can never please everyone!

 

 


(1) The background to this is that our department’s KS3 coordinator is leaving and this provides a great opportunity to go back to the drawing board. My recent thinking on this pays some debt to Michael Fordham’s piece. I am also aware that there is a gibongous amount of literature on curriculum design. I’m just a teacher though, so I have raised issues here as I see them.

(2) the Ten Principles of Science Education section is highly objectionable

Working with a bottom set year 11: how I do it

The emotional, physical and mental exhaustion of working with a bottom set year 11 class has its own characteristic flavour. You feel frustration at students who have switched off, annoyance at students who disturb others’ learning, fear for students who are working but not getting anywhere and, ultimately, inadequacy that you aren’t doing a good enough job.

You all know the kind of class I’m talking about, and we don’t need to go into minute and granular detail about its challenges and the characters that typically make it up. If you are going to have high expectations and standards for a class like this, you are certainly in for a challenge. I know from my own experience I’ve often felt like giving up or lowering my bar, and sadly sometimes those feelings have become reality.

I thought about having that sentence read “and sometimes I have let myself down by allowing those feelings to become reality,” but in truth I don’t know if I let myself down. It’s damn difficult trying to keep a brave face, calm demeanour and relentlessly high expectations. I reckon if anyone says that they’ve managed this the whole time they’re pulling a fast one. The perhaps unwarranted feeling of inadequacy lurks, but I think it’s important to be able to say to yourself “I’m doing all I can, and there’s a limit to what I can do.”

The above notwithstanding, it is what it is, and we need to deal with it. I think there are two very broad areas that need addressing when dealing with a bottom set year 11:

  • Lack of motivation
  • Lack of knowledge

These issues are of course not unique to year 11, but they do reach their peak there. These are students who, over the course of four years, have grown used to knowing little and caring less. An urgency is reached in year 11 that borders on panic: we must get these students some grades; we cannot allow them to leave school without the qualifications they need to lead happy and productive lives.

The lack of motivation and lack of knowledge are not unrelated. Often, students much earlier on in school miss out on crucial learning for whatever reason. This makes it harder for them to succeed. Experiencing failure time and again then tells them that this is what they will always feel and there is therefore no point in even trying. Obviously, it’s a little more complicated than that, but my approach to these classes tends to focus around that relationship between competence and motivation: the extent to which our competence in a particular domain affects the way we feel towards that particular domain.

Broadly, the philosophy behind the route outlined below is that helping students get good at something, helping them to feel some success, can be massively empowering and motivating, especially for students who are so unused to that feeling (more here). I really don’t want you to think that it’s a completely “fool-proof” route, or that other ways won’t work. It’s just what I’ve done, and it has worked a bit. It’s also not easy: it’ll take you time and perseverance as you combat deeply engrained issues and an inevitable feeling that you aren’t getting anywhere. I also have no idea if it would work outside of science as I have no experience of that. With those provisos in hand, I hope it helps.

  1. Get yourself a list of Core Questions

I’ve written before about the power of Core Questions in supporting students’ long term retention. Take a small part of your course, and turn it into a series of questions and answers. Try and keep them short and snappy, with no superfluous or redundant information. You might also want to be judicious about high-leverage ideas: there are some areas of your course which come up every single year and which also allow students to understand later concepts. Focus on those. A list from biology might look like the below:

[Image: an example list of biology Core Questions and their answers]

You can see how tight the language is. As I’ve written about before, you might quibble with individual definitions’ wording here and there, but by and large it’s a good start.

  2. Take a small chunk and print it off

Choose a section that is both coherent in terms of content and manageable in terms of amount. Print it off nice and big, and fold it vertically so it looks like this:

[Image: the printed question sheet folded vertically]

  3. Show students how to use it

The goal here is to move to quizzing (retrieval practice) with the sheet really quickly. There are a number of different ways to do this, but so long as they are actively speaking or writing their answers (with writing being better) then they are doing good retrieval. With each of the below it may be best to encourage students to just try the first five and then slowly increase the number they are focussing on.

    1. Next to book and write

Pop the question sheet next to the book and write out the answers next to the questions. If you can’t remember one, leave it blank and wait till the end. Then get a different colour pen, turn the paper over and mark and make corrections. Rinse, repeat.

[Image: the folded question sheet placed next to the exercise book, with answers written out]

    2. Cut up

You or the students could cut the questions and answers up into mini-flashcards, with questions on the front and answers on the back. Students can then mix them up and use them for quizzing. Those of you who love glitzy resources could even make some kind of hat to mix them up in and pull them out of.

    3. Verbal self-quiz

Both of the above work with individuals self-quizzing and speaking their answers aloud. I think on balance it’s better for students to write their answers, but speaking should work too.

    4. Peer-to-peer quiz

This can work really well with students working in pairs. However, there are two additional variables at play. The first is that it is more likely they will go off task and talk to each other, and the second is that if one student is asking the other student, only one student is actually doing retrieval practice. This potentially cuts effective learning time right down (if not entirely by half) so it’s worth considering.

    5. Don’t do it that way!

I guarantee at least one of your students will start trying to use the resources you give them badly. They’ll start highlighting, or just copying them out or just staring at them. They’ll say to you “this helps me sir” – it doesn’t. You’re the boss, you’re in charge. Get them quizzing.

  4. Circulate

Whilst all this is going on you need to be going round the room checking up on students, re-focussing their attention and helping them learn more. Some of the steps involved in that follow:

    1. Right is right

Students need to make sure they aren’t doing the classic “yeah that’s what I meant” thing. If they wrote x and not y, they need to be held accountable to that. You need to be really explicit with them that only right is right, and anything else isn’t good enough. If one student spends an entire lesson on one question just to make sure they get the correct phrasing, that’s not a bad use of time.

    2. Encourage

Students will go through periods of feeling like they haven’t got a clue and ones where they feel they are actually getting something. When they are in the former, remind them of the latter. When they are in the latter, remind them of the former and how proud they should be that they persevered through and have managed to really learn something.

“well done, that’s some great work there”

“you’re really getting this – I hope you’re proud of yourself!”

“yeah you got this, top job. Doesn’t that feel good?”

“see what you can do when you put your mind to it?”

“that’s some great work, but don’t give up now. I know you can do a bit more.”

I avoid using rewards, but do try to use praise for effort. Don’t praise activities that don’t deserve it, so if a student has done nothing for 40 minutes then spent 10 minutes working don’t praise them unreservedly, instead say things like “it’s good that you managed to do this work now, but it’s a massive shame you didn’t start earlier when I know you could have achieved so much.”

  5. Introduce variation

Once students start to master a list (and they will), introduce a bit of variation to maintain difficulty (see more here). Sometimes, all you need to do is change the order of the questions in the list to introduce enough difficulty as to maintain the challenge. Other times you might want to simply flip answers for questions and have students try to figure out the question from the answer. A little bit of rewording can go a long way too: ask “where does protein synthesis happen?” instead of “what is the function of the ribosomes?” (to which the answer is “where protein synthesis takes place”).
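If your core questions live in a spreadsheet or a simple list, these variations are easy to generate automatically. A rough sketch follows, with invented content; it is only meant to illustrate the reorder-and-flip idea, not to prescribe a particular tool.

```python
import random

# Hypothetical core questions as (question, answer) pairs.
CORE_QUESTIONS = [
    ("What is the function of the ribosomes?", "Where protein synthesis takes place"),
    ("What is the unit of force?", "The newton (N)"),
    ("What is the charge on an electron?", "Negative"),
]

def make_variant(pairs, flip_fraction=0.3, seed=None):
    """Return a new version of the list: reordered, with a proportion of
    items flipped so students must recall the question from the answer."""
    rng = random.Random(seed)
    variant = []
    for question, answer in pairs:
        if rng.random() < flip_fraction:
            variant.append((f"Which question has the answer: '{answer}'?", question))
        else:
            variant.append((question, answer))
    rng.shuffle(variant)  # changing the order alone adds useful difficulty
    return variant

for q, a in make_variant(CORE_QUESTIONS, seed=42):
    print(q, "->", a)
```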

  6. Shake it up baby now

A room full of silent year 11s writing self-quizzes down three times a week for a year is probably not going to happen. Let’s be honest. You’ll definitely want to shake the routine up every so often, either by activities mentioned already in part 3, or by:

  • Book an ICT room and have them do mini-quizzes via the retrieval roulette
  • Give them questions as a semi-formal assessment to be done in silence and marked as a class
  • A bit of whole class teaching if there is something that you think needs more explanation
  • Leitner method work (a rough sketch of the scheduling logic follows this list)
  • Some straightforward old exam questions
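For anyone unfamiliar with the Leitner method mentioned above, the underlying scheduling idea is simple: cards sit in numbered boxes, a correct answer promotes a card to a box that is reviewed less often, and a wrong answer sends it back to box 1. The sketch below is my own rough illustration of that logic (the review intervals are arbitrary), not a description of any particular resource.

```python
# A rough sketch of Leitner-style scheduling: box 1 is reviewed every
# session, box 2 every other session, box 3 every fourth, box 4 every eighth.
REVIEW_EVERY = {1: 1, 2: 2, 3: 4, 4: 8}

def due_cards(cards, session_number):
    """Cards whose box is scheduled for review this session."""
    return [c for c in cards if session_number % REVIEW_EVERY[c["box"]] == 0]

def record_answer(card, correct):
    """Promote on a correct answer, demote to box 1 on a wrong one."""
    if correct:
        card["box"] = min(card["box"] + 1, max(REVIEW_EVERY))
    else:
        card["box"] = 1

cards = [
    {"question": "What is the unit of force?", "box": 1},
    {"question": "What is the function of the ribosomes?", "box": 2},
]

for session in range(1, 5):
    for card in due_cards(cards, session):
        # In class the student answers aloud or in writing; here we just
        # pretend every answer is correct to show the promotion logic.
        record_answer(card, correct=True)
    print(session, [(c["question"][:20], c["box"]) for c in cards])
```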

Every so often I get tempted to do games or a competition, but I think we need to be a bit wary around stuff like that. Students have an ability to win learning games without actually learning anything, and competitions can often just serve as a massive source of distraction (“sir, he’s looking at his notes!”), but I suppose it’s up to you. Hopefully nobody unfollows this blog if I say it’s ok to do a game once in a while.

  7. Low and slow

I’ve written before about the tension between covering content and trying to be thorough. Using this route, you probably won’t finish the course. But put it like this: is there any point in finishing the course if they don’t remember anything?

  8. Hold the line

It’s not easy to teach like this. It’s damn hard work and you need to be on top of your game. You aren’t going to manage that every lesson, but you need to try, and not let it get you down if it flops now and then. I remember one class I had where the students used to bicker among themselves. Every so often there would be a flare-up in the playground or whatever, and no learning happened in the following lesson. It is what it is, and you need to be realistic. It could be that, it could be the afternoon, it could be a full moon, it could be anything. Pick up, and get back on it tomorrow.

  9. A lesson for life

None of the above is ideal. It shouldn’t be the case that you have year 11s turning up on day one who don’t know anything. Sadly, it is the reality in many schools. So here’s the thing: if you did all of the above, but in year 7, what would your year 11 bottom set look like? If you got them quizzing and retrieving and working at home early as they power through an ambitious curriculum that motivates through fascinating content and a growing feeling of mastery, what would your year 11 bottom set look like? Do you like what you see? Well, you’ve got five years. Get to it.


Final note: you could of course buy your class a set of knowledge quizzes. I don’t want this post to be an advert for them: I know budgets are tight and I don’t think it’s appropriate here anyway. But they will do a lot of the work for you, and the feeling of having a book completed and full of work will be powerful for your students as well. If you do want to order bulk copies, John Catt will do a discount.

What’s the Big Idea?

In 2010, the ASE published the Principles and Big Ideas of Science Education, the product of a conference involving a number of prominent figures within science education academia and edited by Wynne Harlen (1). Other than Millar and Abrahams’ article on effective practical work, I think it’s probably the most commonly referred to article within UK science education. I remember reading it during my PGCE, and not being particularly swayed by it. At the time, I was a little grumpy that only one of the Big Ideas (BIs) was strictly chemistry, and I thought that the omission of collision theory or the conservation of mass was a mistake.

With a resurgence of thought around curriculum, lots of people have been writing about BIs within science as a tool for shaping or framing science curriculums. As such, I’ve revisited Harlen’s BIs to try and get my head around them and see if, and how, I should be thinking about them more deeply. As I read through the document I realised that it wasn’t really just about the Big Ideas. It was about philosophy, ideology, pedagogy and curricular studies all rolled into one, so my discussion below is about all of those things. If you aren’t particularly interested in Big Ideas of science per se, do read on anyway: there’s lots of stuff in there that you may find interesting.

This is a really big area. Really big. There’s no doubt I won’t be able to do it complete justice here, but I’ve done my best and, as ever, look forward to the debate. This is certainly one for the curriculum nerds.

Frames of reference

It’s worth noting at the outset that there will be major disagreement between me and the BIs’ authors. The first page of the document starts with “principles of education,” the first of which is:

Throughout the years of compulsory schooling, schools should, through their science education programmes, aim systematically to develop and sustain learners’ curiosity about the world, enjoyment of scientific activity and understanding of how natural phenomena can be explained.

I’ve never seen a programme that systematically develops students’ curiosity, and I would be interested to see a curriculum that looks like that – I doubt it’s possible, and evaluation and measurement would be incredibly difficult. Either way, in my view, the last part of the sentence should come first. I think the purpose of a science curriculum is to systematically increase the amount of knowledge and understanding that students have of natural phenomena. Where enjoyment and curiosity come in is a discussion for another time (broadly, once you know more stuff you find the world more interesting too), but it’s important to note that there are two very different frames of reference here, with different understandings of purpose and aims.

The ideological differences continue down the list of principles. Principle 2 says that:

The main purpose of science education should be to enable every individual to take an informed part in decisions, and to take appropriate actions, that affect their own wellbeing and the wellbeing of society and the environment.

I think this is an important purpose of science education (and the authors acknowledge other ones) but would stop short of saying it is the “main purpose.” Where is the pursuit of knowledge for its own sake? Where is the best of what has been thought and said? Where are the cultural treasures that are the intellectual inheritance of every member of humanity? I think those things are important too and, as above, we have to note that there are different frames of reference.

The other principles have similar differences in ideology. Principle 5 for example requires that content should be of interest to students and relevant to their lives, which I don’t generally agree with, normally following curriculum theorist Michael Young in arguing that:

…for a curriculum to rely on the experience of pupils alone limits what they can learn to that experience….It is this structuring of knowledge independently of the experience of pupils that offers the possibility for pupils to think beyond their experience and enable them, as the sociologist Basil Bernstein put it, ‘to think the unthinkable and the not yet thought’ (Bernstein, 2000).

Continuing with problems in the principles, for some reason principle 9 mentions the importance of formative assessment. I don’t really know why it’s there: assessment is a vitally important classroom tool, but there are many classroom tools and formative assessment seems to be the only one that makes it into the list.

I think part of the purpose of this section is about emphasising differences. There are people out there who assert that all of the new developments in education have already come before and that science teachers have always thought like this. The Big Ideas document should show this not to be true: such an influential and prominent publication with such clear differences to the contemporary zeitgeist cannot be ignored or brushed aside.

What do we mean by “Big”?

Before I revisited the document, I thought about how many different ways we could think about the word “Big” as it pertains to ideas. I came up with the below potential definitions and examples:

  1. Explanatory:  an idea like “collision theory” explains rates of reaction
  2. Encompassing: an idea like “energy” encompasses thermodynamics
  3. Historically important: an idea like “natural selection” has an enormous imprint on the history of science and society
  4. Politically/societally important: an idea like “climate change” is important in staving off disastrous planetary consequences

My understanding of Harlen is that the BIs are framed as an antidote to a number of problems:

  1. Students do not find science interesting or relevant
  2. They see the subject as disconnected internally (isolated strings of facts)
  3. They see the subject as disconnected from the world around it
  4. Learning science in school has elitist historical baggage and is still seen as not for everyone
  5. Abstract ideas in secondary school are not connected to concrete experiences “from which they should be built”

I think whether or not these things are still true is up for debate. If they are still true, I think whether or not BIs can solve them is certainly debatable (as we shall see). As one final “think”, I think we’ve had schools implement Big Ideas by now, and it’s incumbent on their advocates to run an evaluation and tell us whether or not these metrics have improved. To my knowledge, no such survey exists.

In terms of addressing a solution to the problems listed above, the authors argue that:

Part of the solution to these problems is to conceive the goals of science education not in terms of the knowledge of a body of facts and theories but a progression towards key ideas which together enable understanding of events and phenomena of relevance to students’ lives during and beyond their school years (my emphasis)

A bit later on, we get to what actually defines the Big Ideas:

Here we are using the term ‘idea’ to mean an abstraction that explains observed relationships or properties. This is different from the everyday use of the word ‘idea’ as a thought which is not necessarily based on evidence. A ‘big’ idea in science is one that applies to a range of related objects or phenomena, whilst what we might call smaller ideas apply to particular observations or experiences. For instance, that worms are well adapted to living in the soil is a small idea; a corresponding big idea is that living things have evolved over very long periods of time to function in certain conditions (my emphasis)

Which looks a lot like what I’ve called “explanatory” BIs above. After reading through an extensive discussion about the purposes of science education, the authors finally get to the criteria for selection of certain ideas about science:

  • have explanatory power in relation to a large number of objects, events and phenomena that are encountered by students in their lives during and after their school years
  • provide a basis for understanding issues involved in making decisions that affect their own and others’ health and wellbeing, the environment and their use of energy
  • provide enjoyment and satisfaction in being able to answer or find answers to the kinds of questions that people ask about themselves and the natural world
  • have cultural significance – for instance in affecting views of the human condition – reflecting achievements in the history of science, the inspiration from the study of nature and the impacts of human activity on the environment.

Other than bulletpoint 3, that looks quite a bit like my list. What I’ll do now is briefly examine a couple of their ideas in light of the stated criteria. I’m not going to dive into their “ideas about science” now, and will limit my discussions here to the “ideas of science.”

Big Idea 1: All material in the Universe is made of very small particles

This, the first of the BIs, is my terra firma. I imagine this one was selected for its explanatory power and for its importance in the history of science (2). My problem, though, is with how it actually has explanatory power. It seems to me that what the authors are proposing is a process that goes a bit like this:

  1. Gather naive ideas as a child
  2. Slowly have those ideas challenged and experience cognitive conflict due to teacher-facilitated activities and inquiries (more on this later)
  3. Learn about specific “small ideas”
  4. Start to appreciate the “big idea” that explains the “small ideas”
  5. Be able to use knowledge of the “big idea” in an unfamiliar context

My problem here is with the sequencing and commonality. Let’s take particles as a simple example. We’ll skip over steps 1 and 2 for the minute, because 1 is obvious, and 2 probably a waste of time. Let’s think about the sequencing of small ideas to build up to our particle big idea.

In Year 7, I might teach students about the atom, molecules, giant structures and chemical bonds. Throughout all of that, I might emphasise that it’s vital to recognise that all substances are made of these tiny little particles. In future years, when doing separation techniques I might talk about what happens in terms of particles when substances are dissolved and filtered. In a later year still, I might talk about rates of reaction and collision theory. All of these illustrate the big idea. They have that in common. But there are two problems:

First, the big idea might be fully understood (as well as all the other information the BIs document includes within this big idea), but the student utterly incapable of making predictions based on it. As an example, let’s take dynamic equilibrium. You can’t really understand that without understanding that all matter is made of particles. But this particular Big Idea is laughably inadequate as a tool for explaining how a reaction might reach equilibrium. There are so many other things you need to know as well before it even begins to make sense. Put another way, if I had two students, one of whom had seen this Big Idea illustrated by filtration, and one who had missed that lesson, do I really think the first one is more likely to understand or predict equilibrium? Not a chance.

Secondly, I don’t even think it’s the most interesting thing that these topics have in common. I have some really outstanding year 11 chemists at the moment. If I asked them what rates and filtration have in common, the least interesting answer possible would be “they are both explained by conceiving of matter as made of particles.” Even if we took closer concepts like rates and equilibrium, the most interesting answers would be about the effect of changes to the system, like pressure or temperature. And sure, those effects are explained by a kinetic and particulate model, but that isn’t the interesting bit. The interesting bit is what is actually happening. The fact that we just have particles isn’t exciting, it’s obvious: tacit. The way they move, collide and result in observable effects is exciting. If I spent any time at all in class trying to tell students “hey look! These both involve particles! Scientists never used to be able to understand this, because they didn’t realise matter was made of particles but thought it was about earth, wind, water and fire!” they would look at me like I was nuts. And that’s for the two topics that are probably closest in their relation. Fractional distillation also relies on a particulate model, but that’s probably the least exciting commonality it has with rates, equilibrium or any other such topic in chemistry.

Big Idea 2: Objects can affect each other at a distance

Absolutely. 100%. They can. And if you don’t understand that, you can’t understand gravity, electrostatics or magnetism. But if I asked my year 7s “I have just lifted a pen, and then let go. It fell to the floor. Why?” and someone answered “because objects can affect each other at a distance” they wouldn’t be wrong, but it isn’t right either. It’s not enough. It’s more of a threshold concept: you need to get over this to understand more, but it isn’t going to actually explain anything. And if it can’t explain anything, then it fails the tests that the authors set.

Not a Big Idea: Surface area to volume ratio

I’ve made the bold claim before that surface area to volume ratio (SA:V) is the most powerful concept in the whole of school science. In contrast to what’s been mapped out above, it explains:

  • Adaptations for heat loss
  • Adaptations for travelling on sand/snow
  • Structure of cells like the root hair, alveoli or microvilli
  • Heat loss
  • Effect of SA on rates
  • Nanoparticles’ odd properties

And so on, and so forth. But it doesn’t qualify as a “Big Idea.” And I wonder why not.
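As a quick numerical illustration of why the ratio does so much work (my own example, not something taken from the BIs document): for a cube, surface area scales with the square of the side length while volume scales with its cube, so SA:V falls as 6 divided by the side length, and smaller objects have proportionally far more surface.

```python
def cube_sa_to_v(side):
    """Surface area to volume ratio of a cube: 6*side**2 / side**3 = 6/side."""
    return (6 * side ** 2) / (side ** 3)

for side in (1, 2, 10, 100):  # side length in arbitrary units
    print(f"side = {side:>3}: SA:V = {cube_sa_to_v(side):.2f}")

# The smaller the object, the larger its SA:V, which is why alveoli,
# root hair cells and nanoparticles behave the way they do.
```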

Big Idea 5: The composition of the Earth and its atmosphere and the processes occurring within them shape the Earth’s surface and its climate

This is also true. And it’s important: really important. I doubt there is a more important general topic than climate change at the moment. And yes, this doesn’t explain much, but remembering the authors’ selection criteria, it could satisfy two other bulletpoints: relating to individuals’ abilities to make decisions about their life and things of general cultural importance. But I don’t know why you need a “Big Idea” per se for this. I want students to know an absolute ton of information about the climate as it changes. I need them to know this so they can make informed decisions about the future of humanity. But that means that this “big idea” is really just a heading, a title. It’s the name of the topic, but no more than that. It’s general enough to encompass smaller topics like global warming, combustion, how the emergence of life has affected the composition of the atmosphere, but it doesn’t do more than just encompass them. It doesn’t explain them, predict them or make them any more readily understood. So one can legitimately ask: do we need a Big Idea? What’s wrong with just a list of topics?

Principles of awesome

I like the list that the authors have written. Every item on the list is a scientific construct that is important, and is necessary to understand other, “smaller” ideas. But they are not even close to sufficient. To me, they just represent some really awesome things that humanity has discovered over time. Genetics, cellular theory, thermodynamics: these are all awesome. But claiming that you can reduce them to one “Big Idea” that has some kind of power all of its own doesn’t work for me.

Curriculum, pedagogy and social constructivism

It’s worth noting that throughout the document, the authors argue in favour of a social constructivist model of education, with a heavy emphasis on inquiry learning. For example:

Appreciation of how science knowledge is developed should be derived at least in part from experience of undertaking scientific inquiries of different kinds…participation in forms of inquiry provides the experience for students to develop understanding about science and how scientists go about their work…Implicit in all of this is that students are taking part in activities similar to those in which scientists engage in developing understanding.

I’ve written extensively before about why this isn’t true. In short: our students are not generating new scientific knowledge: that’s what scientists do. They are learning old knowledge, and to say that because scientists use inquiry students should too is to make a category error.

The promotion of inquiry-learning does not stop there:

An inquiry-based approach is widely advocated and is being implemented in many different countries across the globe. Inquiry, well executed, leads to understanding and makes provision for regular reflection on what has been learned, so that new ideas are seen to be developed from earlier ones…There is growing evidence that this has a positive influence on attitudes to science

And here:

Undertaking scientific inquiry gives students the enjoyment of finding out for themselves and initiates appreciation of the nature of scientific activity, of the power and the limitations of science.

Of course, this was 2010. And then, the orthodoxy was indeed to promote inquiry learning. We now know differently, and many are shifting their practice towards explicit teaching in light of extensive evidence (I have written a bit about this starting here). And it’s also important to note, for those who deny that inquiry was ever promoted at the highest echelons, that it was. It really was. And completely uncritically (3).

Pedagogy dictating curriculum?

Later, the authors argue this:

Inquiry-based teaching is demanding, both of teachers’ skill and of time for teaching and learning. Inquiry-based learning can lead to greater depth in understanding but as it takes more time the corollary is that the breadth has to be reduced. Thus identifying big ideas in science is a natural, and indeed necessary, accompaniment to promoting inquiry-based science education.

To me, this is highly problematic. It is essentially arguing that because we want to do inquiry, we should change what we are going to teach (4). This is all wrong in my head. We need to decide what we are going to teach before we decide how we are going to teach it. A teaching technique is only useful if it bridges the gap between the student’s brain as it is now, and what I want my student’s brain to look like in five years’ time. To me, that is self-evident (and more here).

Can you build a curriculum without Big Ideas?

I’m really proud of the science curriculum that we’ve built at my place. It’s incredibly powerful, and our students know loads of science which they can apply in other contexts. They are excited, curious, and deeply knowledgeable. But, we have not referenced a single “big idea” in our planning, explicitly or implicitly. Sure, we include things like “everything is made of particles” or that “objects can affect each other at a distance” when we are teaching, and we spend the requisite time on them. But they don’t frame the curriculum: they aren’t the end goal. The end goal is students who know an absolute ton of science and can use it to be creative, inventive and brilliant.

A further problem: specificity

If you view your students as intrepid explorers of the physical world, guided all the while by wise and facilitating science teachers, then it makes sense to leave your curriculum pretty open. You want teachers to personalise instruction to the students in front of them: their interests and personal experiences. Under a model like that, the BIs make sense because of how much they encompass. You can frame a curriculum pretty loosely around them and fulfil your curricular intent.

Obviously though, if you view school science as a specific package of knowledge that you want your students to have, the loose model starts to break down. You start to see your students not as individual exploring mavericks, but as novices progressing through a predetermined body of knowledge and understanding. Teachers must respond to their class, both in their pedagogy and what they choose as their hinterland (all the content they use to embellish and elaborate on the core material), but sensible curricular leadership appreciates that specificity – all teachers knowing exactly what to teach and when to teach it – is a powerful model that, perhaps paradoxically, liberates students to “think the unthinkable and the not yet thought.” Big Ideas might still be relevant in such a model, but they cede primacy in curricular design to careful delineation and sequencing (5).

More useful constructs

If the Big Ideas are going to get people thinking about curriculum I’m on board. But I think there are better lenses (or constructs) through which to think deeply about curriculum. I really recommend the maps and visualisations that Ruth Walker has put together on this – trust me, you won’t regret reading her blogs (6).

Conclusion

I wrote this blog because I’ve seen a lot of people asking about Big Ideas, with the three scientific learned bodies all using it to frame their curricular work over the last year or so (7). Here’s my advice: if someone asks you to write your curriculum around Big Ideas (Harlen’s or otherwise) follow these steps:

  1. Check that it isn’t driven by progressive/social constructivist pedagogy. If yes, challenge.
  2. If no, ask why. Are these really the most helpful constructs for framing our curriculum, or is it just a cute idea that doesn’t really aid our thinking?



(1) There are no serving teachers on the editorial board. It looks like a couple of contributors did teach, but this is underemphasised relative to their academic achievements. That’s not to say that academics and scientists do not have a right to an opinion: they do, and their right and expertise are important. But teachers have that right too, and it seems odd for there not to be any teachers included. It’s also worth noting that part of the distinctiveness of the current focus on curriculum is the number of frontline teachers involved.

(2) Many thanks to Bill Wilkinson for pointing out that Feynman argued that: “If, in some cataclysm, all of scientific knowledge were to be destroyed, and only one sentence passed on to the next generation of creatures, what statement would contain the most information in the fewest words? I believe it is the atomic hypothesis that all things are made of atoms — little particles that move around in perpetual motion, attracting each other when they are a little distance apart, but repelling upon being squeezed into one another. In that one sentence, you will see, there is an enormous amount of information about the world, if just a little imagination and thinking are applied.”

(3) Many thanks to Gethyn Jones for prompting me to think about this.

(4) Though there are many who no longer endorse inquiry methods, I was recently condemned for publicly criticising the December 2018 issue of the SSR for containing 7 articles about inquiry presented completely uncritically. If you are aware of all the evidence against inquiry learning and you still think it’s a good idea, that is your right, of course. But it is completely disingenuous to pretend that there is no evidence against it.

(5) As a side note, it is worth noting that traditional or explicit approaches to teaching are hardly easy or effort-free. I challenge anyone to look at my SLOP booklets and tell me I haven’t worked for them. I just focus my efforts in other directions: instead of desperately trying to manage behaviour in a busy and moving classroom, or directing my energies towards gently coaxing students to discover correct answers for themselves, I spend time thinking about sequencing, practice and quality of explanation.

(6) I’d also recommend her blog for the Curriculum in Science symposium which provides a glossary of important vocabulary and conceptual tools by which to think about curriculum.

(7) Even if they conceive of them as “big questions”, which strikes me as the exact opposite of Harlen’s Big Ideas. BIs are not questions – they are answers.

Planning smarter: rethinking the short, medium and long term

A lot has changed recently. The new emphasis on curriculum and knowledge has led many to think about their teaching in completely different ways. For me, one of the big shifts has been the move from my teaching being resource- or activity-led to being content-led. I used to do group activities, discovery learning and other techniques simply because I thought they were good techniques: not because I thought they suited the content I was due to be teaching. Now, I try to think about the what of my lessons first, and the pedagogy second.

One of the ways in which this change has manifested has been in planning and how to think about progression. My planning used to be in three main layers: the short, medium and long term. Obviously, those terms don’t mean anything beyond the relative time period associated with them: what actually gets done in each of them, as well as the actual amount of time involved, is open to interpretation. Below, I’ve tried to describe what these terms used to mean to me, the problems with how I thought about them, and what I do now, both in my own teaching and in our new KS3 course.

Short term

Here, the lesson plan was king. During my PGCE I was trained to write minute-by-minute guides to what I was going to do in the classroom, with carefully written descriptions of what I would do and what the students would be doing. Even then, I realised that this was silly. Teaching is about responding and adapting to the class that’s in front of you. Putting a time limit on yourself doesn’t make sense: what if you need more than four minutes? What if you need less? You need to be able to adapt.

A further problem is the very arbitrariness of using a lesson as a period of time for useful planning. Some schools have 50-minute lessons, others have 65. Those decisions are made for whatever strategic reason, but they certainly aren’t made because Adam’s lesson on polymers needs 50 minutes but not 65. Learning isn’t so neatly divisible. Some topics take more time, others take less. Trying to stretch or squeeze the time it takes to learn something into an arbitrary period makes little sense and does violence to your subject matter.

I appreciate that the way that novice teachers plan is different to the way that expert teachers plan. But that doesn’t mean that the Lesson Plan is the answer. It’s a relic of a bygone age and without a doubt we need to think a lot harder about how we support novice teachers to plan lessons.

Medium term

This is what medium term planning used to look like to me:

[Image: an example medium-term plan]

In truth, this isn’t so different to how a file would look to me now, but I think the problem here is that it is just kind of an extension of the short-term lesson plan. A medium-term plan has to be coherent and well-sequenced. It needs to really anticipate how the ideas build on each other sequentially to form a bigger picture. I don’t know if I used to do that, to think in those terms, or if I just lurched from one lesson to the next without much thought. Take a look at this unit for example:

[Image: a messy unit folder mixing endothermic and exothermic reactions with ceramics, polymers and composites]

The “messy folder” will probably be familiar to many. Messy folders aren’t just something quirky that only obsessives worry about: they result in teachers not knowing precisely what to teach, and instead just going through the folder to find a resource that is teachable (I wrote about that here). That’s a big problem, but even if we removed it and the folder were neat and tidy, there would still be an issue with the topics here. Endothermic and exothermic reactions have nothing to do with ceramics, polymers and composites. They are completely different topics from completely different areas of chemistry. There is no coherence to putting them in a unit together; they’ve just been shoved together for whatever reason. Sometimes this is done for reasons of time: it doesn’t make sense to have a very short unit, so things get lumped together. But it’s so obviously wrong that it betrays a weakness of curricular thinking. Units need to be logical and coherent: doing otherwise sends completely the wrong message about how your subject is structured.

Long term

This is what the long term used to look like to me:

[Image: a colourful year-long timetable]

I have big problems with this. Sure, it’s both pretty and colourful, so it passes the “would my assistant head be happy with this?” test. But I don’t understand how it can actually work. Any time I’ve worked on a timetable like this I’ve fluffed it up. I’ve either been ahead of the plan, or behind. When ahead, you end up doing momentum-killing revision or research lessons; when behind, you end up rushing material just to catch up. I just don’t get how it works, or how you can predict with that level of accuracy how the year is going to progress. Things happen in school the whole time which can throw off these lovely timetables: visits, trips, drop-down days…they all conspire to completely ruin your plans. The veteran teachers just ignored the plan and got on with it, which was fine for their students there and then, but it meant we got to the end of the year and different students were in different places: hardly a ringing endorsement of your department’s long-term planning.

So…now what?

I’m really lucky to work for a great team. We’ve tried to do a lot of things differently, and one semi-conscious choice has been around planning. Below, I will try to describe what we do, starting with the biggest picture and then zooming in.

Long-term

This is our overall plan of what we want students to know by the end of Key Stage 3. So not the end of the year, but the end of three years. There are a lot of considerations as to what makes it in and what doesn’t, but once set, it’s in stone. All students will know this material by the time they turn up for GCSE in year 10. That’s the goal: that’s the super-long-term.

In theory. In truth, we can’t really know how long things will take. So instead of saying “ah, you must have finished B3 by December the 7th at 12.05pm”, we have a plan that looks like this:

[Image: the topics listed in teaching order, with no dates attached]

The content takes the lead – not how long it will take. These are the topics, this is the order in which we are going to work through them, and they will take however long they take.

Of course, we do still need to keep people roughly in the same place. We can’t have classes finishing the year an entire unit ahead of or behind others. To address this, every six weeks or so we do a checkpoint where everyone writes on a spreadsheet where they are up to with their class. In almost all cases, if a class is “behind” we slow everyone else down to that point. The likelihood is that the teacher is just being more thorough. So the other groups slow down by doing more mini-quizzes, spending more time giving feedback after them, or doing more practice work: things which are good both in a learning sense and in a strategic sense.

Part of the long term plan is also the sequencing of topics. I wrote more about this here, but for example we do C1 first so that we can introduce symbols and equations, then we do P1 so we can apply that to a combustion reaction and then do B1 so students can learn about chloroplasts and mitochondria in terms of energy stores, transfers and chemical equations.

Medium term

This is the unit. How are we going to build knowledge in a way that is coherent and conducive to student understanding? How are we going to relate it to other topics taught? We might do this in chunks like 1. The Solar System, 2. The Earth, 3. The Moon or whatever, but those chunks don’t each represent one lesson. You might spend one and a half lessons on The Solar System, a quarter of a lesson on The Earth and then five lessons on The Moon. It doesn’t matter: what matters is that the material is learnt well, however long it takes. And if you are halfway through teaching The Moon and you realise there is something from The Solar System (or a different unit entirely) that your students don’t understand? Go back, do it again. No other way.

Short term

This is the simple explain → practice → review cycle. Not a Lesson Plan – it’s just planning for learning. So I might want my students to learn that the Moon has phases because of its position relative to Earth and the Sun. I’ll think about the best way to explain that, how to support students to progress towards independent practice and then how to go over their work properly. And again, it takes however long it takes. I spent 45 minutes recently with my year 7s explaining the difference between mass and weight and how we can use that to construct an equation to relate weight, mass and gravitational field strength. It was a further one and a half lessons before they were practising using the equation completely independently. I had no idea in advance it would take that long – it’s just the amount of time that the content required.
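
For anyone who wants the relationship spelled out, this is the equation the class were building towards, with a quick worked example (the numbers are mine, not from the lesson; GCSE courses typically take g as 9.8 N/kg, sometimes rounded to 10):

\[ W = mg \]
\[ W = 60\ \mathrm{kg} \times 9.8\ \mathrm{N/kg} = 588\ \mathrm{N} \]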

That’s all folks

So there we have it. No doubt, as with all my posts, someone will pop up to say “that’s how we’ve always done it” or whatever. That’s cool: I was very careful to couch this as my experience. I’ve got a hunch that I’m not the only one though.

What to do after a mock? Assessment, sampling, inferences and more

A common question in the #CogSciSci email group is what to do after students have done an assessment or a mock. Most commonly, people spend a lesson “going over” the paper, where the teacher goes through each question and students make corrections. There’s often some accompanying document for students (or teachers) to fill in, tallying their errors in particular sections. Highlighting is normally involved. Personally, I don’t think this approach is hugely beneficial for the time spent on it; below I will explain why, and conclude with what I do instead.

Student psychology problems

The first thing to note is what is going through the students’ heads when you go over a test. The likelihood is they aren’t really paying attention to the feedback, and are more focussed on the grade or score they got. In my experience this is because they are one (or more) of the following:

1) just plain upset and demotivated (“I can’t believe David did better than me”)

2) focussed on finding a way to scrounge another mark (“but sir, that’s exactly what I wrote!”) or

3) so smug about their score that they don’t feel the need to pay attention.

This even changes question by question: if half the students got a question right, then as soon as you start going through it they just won’t be listening to you any more (unless they are the best students ever). As such, you will have a lesson where not all students are thinking throughout (1).

Sampling the domain

There is a more significant objection, which starts with a concept from assessment theory: sampling and inferences. Let’s look at this image, which represents all the things we want our students to know about a subject (the domain):

[Image: the domain, drawn as black spots connected by lines]

The black spots are “things you want students to know” and the lines between them are “connections.” So one black spot might represent “sodium has one electron in its outer shell” and a line might take it to a spot which states “when metals form ions they lose electrons from their outer shell.” If I ask a student “what happens when sodium becomes an ion?” I am testing those two black spots and the line between them.

We want to know how many of these spots and lines students have in their heads, but we simply cannot test whether or not a particular student has learnt everything in the domain: the tests would be too long (2). So instead, we pick some specific things from the domain and test those:

[Image: a few spots and lines from the domain picked out in green]

If a question asked “what happens when sodium forms an ion?” we are testing two green spots and a connection between them, as above. The important bit comes next: the inference. If a student can adequately describe what happens when sodium forms an ion, I infer that if I had asked them about, say, what happens when magnesium forms an ion, they would have got that right too. If that inference is justified, then it is valid. An invalid inference might be “this student knows what happens when transition metals form ions,” because the rules are different. The rule that governs sodium is the same one that governs magnesium, but the one that governs transition metals is different, so that inference would be invalid. The image below shows the things I tested in green, and the things I infer the student knows in orange:

[Image: tested spots in green, inferred spots in orange]

So the purpose of the test is not just to figure out whether a student knows the things on the test; it’s to figure out how much of the stuff that wasn’t on the test the student knows.
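
If it helps to see that sampling-and-inference logic written down, here is a toy sketch (mine, not drawn from any assessment literature) using a hypothetical three-fact domain in which each fact is tagged with the rule that governs it; a correct answer only licenses an inference to other facts governed by the same rule:

# Toy model of a domain: each fact is tagged with the rule that governs it.
# (Hypothetical facts and rule names, purely for illustration.)
domain = {
    "sodium forms Na+ by losing its single outer-shell electron": "main-group metal ionisation",
    "magnesium forms Mg2+ by losing its two outer-shell electrons": "main-group metal ionisation",
    "iron can form either Fe2+ or Fe3+": "transition metal ionisation",
}

def inferred_known(correct_answers):
    """Infer every fact governed by a rule the student has demonstrated on the
    test; facts governed by other rules stay untested."""
    demonstrated_rules = {domain[fact] for fact in correct_answers}
    return {fact for fact, rule in domain.items() if rule in demonstrated_rules}

# The test only sampled the sodium fact: the magnesium fact is a valid
# inference (same rule), the transition-metal fact is not.
print(sorted(inferred_known(["sodium forms Na+ by losing its single outer-shell electron"])))

Run it and you get the sodium and magnesium facts back but not the iron one, which is exactly the valid/invalid distinction above.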

Assessment maestro Daniel Koretz puts it like this:

“most achievement tests are only small samples from much larger domains of achievement. For example, responses to the small sample of items on a mathematics test are used as a basis for inferences about mastery of the mathematics learned over one or many years of education.

In this way, a test is analogous to a political poll, in which the responses of a relatively small number of people are used to estimate the preferences of a far larger group of voters. In a poll, one samples individuals, while in a test, one samples behaviors from an individual’s larger repertoire, but the fundamental principle is the same: using the sample to estimate the larger set from which it is drawn. For this reason, scores are only meaningful to the extent that they justify an inference about the larger whole. That is, the validity of inferences depends on the extent to which performance on the tested sample generalizes to the much bigger, largely untested domain of achievement.”

This is also the key to understanding your mock. If a student got a particular question wrong, you don’t want to spend time in class going over that particular question. The whole point of that particular question is to give you an inference about something else – the extent to which the student understands the domain. Going over that question means you are going over the sample, when in reality you want to go over the domain. If 30 students get a question wrong, don’t go over that question. Go back to the part of the domain that the question came from, and go over that.

There’s a simpler objection here too: it takes a long time to learn things. To my shame I only really realised this a couple of years ago, but without regular review and retrieval, things won’t get learned. So if students get something wrong in a test, going over that thing once in class is not a good call. Go back to the domain. Think about what their wrong answer tells you about the stuff that wasn’t in the test. Then go over that. And do it again in a few lessons’ time. Add it into a retrieval roulette – whatever, just make sure you do it again.

Curriculum as a progression model

You may have heard people (including Ofsted) describe the curriculum as the progression model. The meaning of the phrase ties into what we were looking at earlier: as soon as your students start studying your domain, a long-term outline might look like this:

[Image: the domain covered by a neat sequence of regions, taken in turn]

Of course, it’s probably not as neat as that, and the way you have planned to cover your domain might look a bit more like this:

[Image: the same domain covered by a messier, overlapping sequence]
Curriculum planning is messy because domains are messy. This is an over-neatification.

The point is, your students are progressing through the domain. Asking whether or not they are “making progress” right now is really asking: how much of what I wanted them to learn this year have they learnt? As I’ve argued before, you cannot make any sensible decisions about pedagogy, teaching and learning or assessment until you have defined your domain and thought about what progress through it looks like. The test that you give your students measures progress in the sense that you infer from it the extent to which they have progressed through the domain.

GCSE grades are not a progression model

You will often see schools grading students from year 7 by GCSE grades, so they might be a 2 in year 7, a 3 in year 8 and so on, all on a flight path to get a 6 at GCSE. This is wrong, because grades are based on the entire domain: by the end of the GCSE, this is what you are supposed to know; we find out how much of it you know as a percentage, compare you to other students and give you a grade. So in years 7-11 you aren’t progressing up the grades, because the grades only work when applied to the entire domain. Using a GCSE grade on any one of the circles above is ridiculous, because the grade is a measure of the whole thing, not of one of the smaller circles.

Question level analysis (RAGging)

In Koretz’s passage above, we saw that assessments are about inferring a student’s knowledge of the domain based on their performance on a sample of the domain. Whether or not you can “generalize” like this depends on other variables as well as student knowledge: the layout of the question, the amount of text, whether there is a supporting diagram, the stakes associated with the test, the time of day, whether or not the student slept or had breakfast or just got a message from their girlfriend or whatever. Those variables need to colour my inferences, and the fact that I don’t – and can’t – adequately assess their relative impact on the student means that I need to be very hesitant with my inferences. Triangulation with other data helps, longer tests help, tests designed by someone who knows what they are doing help, standardising the conditions helps, and so on and so forth.
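
As a back-of-the-envelope illustration of why longer tests help (my numbers, not Koretz’s, and resting on the crude simplification that questions are independent and equally informative): if a student really knows a proportion p of the domain and we sample n questions from it, the uncertainty in our estimate of p shrinks roughly as

\[ SE \approx \sqrt{\frac{p(1-p)}{n}} \]

so with p = 0.7, a ten-question test leaves a standard error of around 14 percentage points, while a forty-question test brings it down to around 7. Real questions are neither independent nor equally informative, which is exactly why the variables above should make us even more hesitant.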

The effect of these other variables is what makes Question Level Analysis (where you look at the marks students got question by question) a highly dodgy enterprise. If students got a question on electrolysis wrong and one on rates right, that doesn’t mean you can infer they know rates and don’t know electrolysis. It could be that the question on electrolysis was just a harder question, or that the mark scheme was more benevolent, or that its layout on the paper was worse or it had more words in it or whatever. It might be that the question on electrolysis is incredibly similar to one they did in class, but the one on rates was completely new. You just can’t really know, so going crazy at the question level doesn’t strike me as sensible.

A further curve-ball is the “just right” phenomenon. It often happens in science (I don’t know about other subjects) that a student manages to get the marks on a particular question, but only just. They’ve said something which is awardable, but to the expert teacher it’s pretty obvious that their understanding is not as strong as that of another student who also got all the marks on that question. This further destabilises QLA as an enterprise. Still, you get some pretty coloured spreadsheets, and your assistant head for KS4 will be really impressed with how you are going to “target your interventions.” (3)

Grade descriptors are silly

A further ramification of all of the above concerns grade descriptors and worrying about things like AO1, AO2 and AO3. Let’s look at an example first.

In the reaction between zinc and copper sulphate, energy is released to the surroundings and the temperature increases. If you stick them in a beaker together with a thermometer you can record that increase.
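
For anyone who wants the chemistry spelled out, the reaction being described is the displacement of copper by zinc, which is exothermic:

\[ \mathrm{Zn(s)} + \mathrm{CuSO_4(aq)} \rightarrow \mathrm{ZnSO_4(aq)} + \mathrm{Cu(s)} \]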

A common question would be to suggest an improvement to this experiment, with the answer being to put a lid on the beaker. If student A gets it right and student B gets it wrong, what inference can be made about their wider knowledge? Can I assume, as per grade descriptors, that student A is better at this:

“critically evaluate and refine methodologies, and judge the validity of scientific conclusions”

GCSE Grade Descriptors

Obviously not. I can’t infer from their performance on this question that they would be better able to suggest an improvement to an experiment about rates, acids or electrolysis, because that’s just not how it works. Put graphically, if we look at the image below, the area in the red circle represents the topic “acids” and the area in the blue circle represents the topic “energy”:

[Image: two circles, red for “acids” and blue for “energy”, each containing a green spot]

Each topic might contain a spot (given in green) which is about an experiment: an experiment about acids (red circle) and an experiment about energy (blue circle). But if we only test the red experiment, we can’t infer that they know the blue one, because they sit in totally separate areas of the domain. Sure, they have in common that they are “questions about evaluating and refining methodologies”, but they also have in common that they are “questions made up of letters and words” – just because they have one property in common doesn’t mean you can infer knowledge of one from the other.

The same is true of stressing out over whether a question is AO1, 2 or 3. It may be the case that question 5 in the paper is AO2, but student A getting it right doesn’t necessarily make them “good at applying knowledge”, and student B getting it wrong doesn’t necessarily make them “bad at applying their knowledge.” You can’t infer that, because you are essentially saying that if question 6 is also AO2, then student A is more likely to get it right than student B. That would be ridiculous, because it depends on the actual domain – the knowledge that is being tested – not other properties that questions 5 and 6 have in common. If question 5 is about acids and question 6 is about energy, then the students’ knowledge of energy is going to determine how well they can answer question 6, not their “ability to apply their knowledge of acids in unfamiliar contexts.”

[Image: questions 5 and 6 drawn in separate topic circles]
You can’t predict anything about a student’s response to question 6 based on their response to question 5. This is why feedback of “improve AO2 response” or “get better at applying your knowledge” isn’t meaningful.

What I do

As I’m marking student tests, I’ll look for glaring common errors. I address these on three levels:

1. This class, right now

Before I’ve given the papers back, I might spend a little time going over a couple of big issues. But I go back to the domain. So I don’t say “right, there was a question about sodium bonding with chlorine that we got wrong, so we’re going to do that again”; I go right back to when I first taught students about bonding and say “there were some confusions about how electrons are transferred in ionic bonding, so we are going to look at that again.” I teach a bit, give students some practice and then move on.

2. This class, in future

More importantly, I want to make sure this class have the opportunity to come back to this in future. So I might pencil in a retrieval starter based on electron transfer for next week, or build it into a worksheet they are going to be doing next lesson.

3. Future classes

This is probably the most important level. I’m going to be teaching this content to three new classes next year. What am I going to do now to make sure that Future Adam spends a bit more time on this next year, so Future Adam’s students know this domain better this time next year? This is part of the reason why SLOP is powerful: I edit and change the booklets as soon as I’ve taught and assessed a unit, so I can make them bigger and better for the next group (and they double as revision for this group later on).

Some do’s and don’ts

I realise I got a bit carried away on this post, so here’s a summary. Even if you decide to do a don’t or skip a do, I hope it’s a useful framework for thinking about assessments:

Don’t:

  • Spend hours in class looking at the sample rather than the domain
  • Go crazy looking at the marks students got for each individual question
  • Give GCSE grades for individual assessments
  • Use GCSE grades as a progression model
  • Infer anything about students’ general abilities like “evaluative thought”
  • Go crazy over the AO of a question

Do:

  • Think about what your students are thinking about (or not thinking about) when you are going over a test
  • Go back to the domain
  • Plan to revisit the domain
  • Be sensible about what you are inferring about your students’ mastery of the domain
  • Treat the curriculum as the progression model



(1) see here for more on that

(2) According to assessment boss Deep, we’re miles away from even thinking about testing an entire domain. He recommends reading these:

Kingston, N. M., Broaddus, A., & Lao, H. (2015). Some Thoughts on “Using Learning Progressions to Design Vertical Scales that Support Coherent Inferences about Student Growth.” Measurement: Interdisciplinary Research and Perspectives, 13(3–4), 195–199.

Rupp, A. A., Templin, J., & Henson, R. A. (2010). Diagnostic measurement: Theory, methods, and applications. New York, NY: Guilford Press.

Not only would they be too long, but they would also probably not actually be particularly valid. It’s conceivable that a brilliant science student has mastered all the content, but in a different order and structure to how another brilliant science student did. A test which aimed to test each part of their cognitive structure might not account for individual idiosyncrasies in schema construction.

(3) see here for more on this


All the SLOP you need

I have finally finished updating my booklets. The page where I used to keep them got a bit messy, so I’m starting a new one here. In terms of changes, I have fiddled with the sequencing on a bunch of them, fixed a load of typos and added a shed-load more practice. I’ve tried especially hard to add “interleaved problems”, which are really question sequences that tie together a lot of different topics. You can find most of these at the back of each booklet.

The booklets are also organised slightly differently, as we have changed our order of teaching and now teach topics roughly in the same order as they appear in the AQA spec (though the material is common to all boards). I’ve covered basically the whole course; there are a couple of bits and pieces for which I use other resources, but almost everything is there. I’ve put some powerpoints there too. I don’t really use these much anymore, so I can’t remember if they are any good, but they may help.

As ever, they are free to download and I hope you find them useful. All I ask in return is that you let me know if you spot an error or something that I could do differently. If you write answers I’d appreciate those too if you are willing to share. I’m also very interested in knowing how people use them so don’t hesitate to be in touch on that either. If you want to know how I use them, click here.

Please remember: these aren’t just for the students; they are for teachers too. I have tried to show how different units and topics can be broken down and sequenced logically. I’ve tried to show how you can build in techniques like retrieval and interleaving to promote better learning, and if you’re interested in more of that I suggest you check out #CogSciSci.

I do update them quite frequently, so I wouldn’t advise downloading them all in one go; instead, come back and re-download each time you need to teach something.

While you’re here hunting for resources, please feel free to have a look around at the rest of the site and if you are interested in science, teaching, curriculum, cognitive science and educational research do hit subscribe.

Unit 1 Atomic structure and the periodic table

History of atom + elements and compounds mastery answers

History of atom + elements and compounds mastery

History of the atom ppt

Unit 2 Bonding and structure

All bonding booklet answers

All bonding booklet

Covalent mastery

Groups in the periodic table (all)

Ions mastery ppt

Unit 3 Quantitative chemistry

Further quant answers

Further Quantitative Chemistry Mastery Booklet

Foundation quantitative chemistry booklet

Quantitative Chemistry Mastery Booklet answers

Quantitative Chemistry Mastery Booklet v3 with hints

Mixed practice answers courtesy of Claire Frere

Unit 4 Chemical changes

Chemical changes H

Unit 5 Energy changes

Energy changes answers

Energy Changes Mastery Booklet F

Energy Changes Mastery Booklet H

Energy changes mastery ppt

Unit 6 Rate and extent

Rates Mastery booklet answers

Rates of Reaction Mastery Booklet F

Rates of Reaction Mastery Booklet H

Reversible reactions mastery booklet answers

Reversible Reactions Mastery Booklet F

Reversible Reactions Mastery Booklet

Unit 7 Organic chemistry

Further Organic Chemistry Mastery Booklet answers (ongoing)

Further Organic Chemistry Mastery Booklet

Organic chemistry all ppt

Organic Chemistry Mastery Booklet ANS

Organic Chemistry Mastery Booklet F

Organic Chemistry Mastery Booklet

Unit 8 Chemical analysis

Analysis mastery

Chemical Analysis ppt

Further analysis mastery

Unit 9 The atmosphere

This one is a bit different as we have actually moved it into KS3, but the material is more or less the same so should still be helpful at GCSE.

Formation of the atmosphere worksheet

The Earth’s Atmosphere ppt

Unit 10 The Earth’s resources

The Earth’s Resources mastery ppt

Useful materials triple only mastery

Using materials triple only answers

Using resources mastery booklet answers

Using resources mastery booklet F

Using resources mastery booklet H

Required practicals booklet

See here
