A Chemical Orthodoxy

Schools, Science and Education

Planning smarter: rethinking the short, medium and long term

A lot has changed recently. The new emphasis on curriculum and knowledge has led many to think about their teaching in completely different ways. For me, one of the big shifts has been the move from my teaching being resource- or activity-led to being content-led. I used to do group activities, discovery learning and other techniques simply because I thought they were good techniques: not because I thought they suited the content I was due to be teaching. Now, I try to think about the what of my lessons first, and the pedagogy second.

One of the ways in which this change has manifested has been in planning and how to think about progression. My planning used to be in three main layers: the short, medium and long term. Obviously, those terms don’t mean anything beyond the relative time period associated with them: what actually gets done in each of them, as well as the actual amount of time involved, is open to interpretation. Below, I’ve tried to describe what these terms used to mean to me, the problems with how I thought about them, and what I do now, both myself and with our new KS3 course.

Short term

Here, the lesson plan was king. During my PGCE I was trained to write minute-by-minute guides to what I was going to do in the classroom, with carefully written descriptions of what I would do and what the students would be doing. Even then, I realised that this was silly. Teaching is about responding and adapting to the class that’s in front of you. Putting a time limit on yourself doesn’t make sense: what if you need more than 4 minutes? What if you need less? You need to be able to adapt.

A further problem is the very arbitrariness of using a lesson as a period of time for useful planning. Some schools have 50-minute lessons, others have 65. Those decisions are made for whatever strategic reason, but they certainly aren’t made because Adam’s lesson on polymers needs 50 minutes but not 65. Learning isn’t so neatly divisible. Some topics take more time, others take less time. Trying to stretch or squeeze the time it takes to learn something into an arbitrary time period makes little sense and does violence to your subject matter.

I appreciate that the way that novice teachers plan is different to the way that expert teachers plan. But that doesn’t mean that the Lesson Plan is the answer. It’s a relic of a bygone age and without a doubt we need to think a lot harder about how we support novice teachers to plan lessons.

Medium term

This is what medium term planning used to look like to me:

[Image: my old medium-term plan]

In truth, this isn’t so different to how a file would look to me now, but I think the problem here is that it is just kind of an extension of the short term lesson plan. A medium term plan has to be coherent and well-sequenced. It needs to really anticipate how the ideas build on each other sequentially to form a fuller picture. I don’t know if I used to do that, to think in those terms, or if I just lurched from one lesson to the next without much thought. Take a look at this unit for example:

[Image: a messy unit folder]

The “messy folder” will probably be familiar to many. Messy folders aren’t just something quirky that only obsessives worry about: they result in teachers not knowing precisely what to teach, and instead just going through the folder to find a resource that is teachable (I wrote about that here). That’s a big problem, but even if we removed it and the folder were neat and tidy, there would still be an issue with the topics here. Endothermic and exothermic reactions have nothing to do with ceramics, polymers and composites. They are completely different topics from completely different areas of chemistry. There is no coherence to putting them in a unit together: they’ve just been shoved together for whatever reason. Sometimes this is done for reasons of time: it doesn’t make sense to have a very short unit, so things get bundled together. But it’s so obviously wrong that it betrays a weakness of curricular thinking. Units need to be logical and coherent: doing otherwise sends completely the wrong message about how your subject is structured.

Long term

This is what the long term used to look like to me:

[Image: a year-long topic timetable]

I have big problems with this. Sure, it’s both pretty and colourful, so it passes the “would my assistant head be happy with this?” test. But I don’t understand how it can actually work. Any time I’ve worked on a timetable like this I’ve fluffed it up. I’ve either been ahead of the plan, or behind. When ahead, you end up doing momentum-killing revision or research lessons; when behind, you end up rushing material just to catch up. I just don’t get how it works, or how you can predict with that level of accuracy how the year is going to progress. Things happen in school all the time which can throw off these lovely timetables: visits, trips, drop-down days… they all conspire to completely ruin your plans. The veteran teachers just ignored the plan and got on with it, which was fine for their students there and then, but meant we got to the end of the year and different students were in different places: hardly a ringing endorsement of your department’s long-term planning.

So…now what?

I’m really lucky to work for a great team. We’ve tried to do a lot of things differently, and one semi-conscious choice has been around planning. Below, I will try to describe what we do; I’ll start with the biggest picture and then zoom in.

Long term

This is our overall plan of what we want students to know by the end of Key Stage 3. So not the end of the year, but the end of three years. There are a lot of considerations as to what makes it in and what doesn’t, but once decided, it’s set in stone. All students will know this material by the time they turn up for GCSE in year 10. That’s the goal: that’s the super-long-term.

In theory. In truth, we can’t really know how long things will take. So instead of saying “ah, you must have finished B3 by December the 7th at 12.05pm”, we have a plan that looks like this:

[Image: our KS3 long-term plan – topics listed in teaching order, with no dates attached]

The content takes the lead – not how long it will take. These are the topics, this is the order in which we are going to work through them, and they will take however long they take.

Of course, we do still need to keep people roughly in the same place. We can’t have classes finishing the year an entire unit ahead of or behind others. To address this, every six weeks or so we do a checkpoint where everyone writes on a spreadsheet where they are up to with their class. In almost all cases, if a class is “behind” we slow everyone else down to that point. The likelihood is that the teacher is just being more thorough. So the other groups slow down by doing more mini-quizzes, or spending more time giving feedback after them, or doing more practice work: things which are good both in a learning sense and in a strategic sense.

Part of the long term plan is also the sequencing of topics. I wrote more about this here, but for example we do C1 first so that we can introduce symbols and equations, then we do P1 so we can apply that to a combustion reaction and then do B1 so students can learn about chloroplasts and mitochondria in terms of energy stores, transfers and chemical equations.

Medium term

This is the unit. How are we going to build knowledge in a way that is coherent and conducive to student understanding? How are we going to relate it to other topics taught? We might do this in chunks like 1. The Solar System, 2. The Earth, 3. The Moon or whatever, but those chunks don’t represent one lesson each. You might spend one and a half lessons on The Solar System, a quarter of a lesson on The Earth and then five lessons on The Moon. It doesn’t matter: what matters is that the material is learnt well, however long it takes. And if you are halfway through teaching The Moon and you realise there is something from The Solar System (or a different unit entirely) that your students don’t understand? Go back, do it again. No other way.

Short term

This is the simple explain → practice → review cycle. Not a Lesson Plan – it’s just planning for learning. So I might want my students to learn that the Moon has phases because of its position relative to Earth and the Sun. I’ll think about the best way to explain that, how to support students to progress towards independent practice and then how to go over their work properly. And again, it takes however long it takes. I spent 45 minutes recently with my year 7s explaining the difference between mass and weight and how we can use that to construct an equation to relate weight, mass and gravitational field strength. It was a further one and a half lessons before they were practising using the equation completely independently. I had no idea in advance it would take that long – it’s just the amount of time that the content required.
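For the record, the relationship we were building up to is just the standard GCSE one; the worked numbers below are mine, purely for illustration:

$$W = mg \quad\Rightarrow\quad W = 5\ \text{kg} \times 9.8\ \text{N/kg} = 49\ \text{N}$$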

That’s all folks

So there we have it. No doubt, as with all my posts, someone will pop up to say “that’s how we’ve always done it” or whatever. That’s cool, I was very careful to couch this as my experience. I’ve got a hunch that I’m not the only one though.

What to do after a mock? Assessment, sampling, inferences and more

A common question in the #CogSciSci email group is what to do after students have done an assessment or a mock. Most commonly, people spend a lesson “going over” the paper, where the teacher goes through each question and students make corrections. There’s often some accompanying document for students (or teachers) to fill in tallying their errors in particular sections. Highlighting is normally involved. Personally, I don’t think this approach is hugely beneficial (for the time spent on it), and below I will explain why and conclude with what I do instead.

Student psychology problems

The first thing to note is what is going through the students’ heads when you go over a test. The likelihood is that they aren’t really paying attention to the feedback, and are more focussed on the grade or score they got. In my experience this is because they are one (or more) of the following:

1) just plain upset and demotivated (“I can’t believe David did better than me”)

2) focussed on finding a way to scrounge another mark (“but sir, that’s exactly what I wrote!”) or

3) so smug about their score that they don’t feel the need to pay attention.

This even changes question by question: if half the students got a question right then, as soon as you start going through it, unless they are the best students ever they just won’t be listening to you any more. As such, you will have a lesson where not all students are thinking throughout (1).

Sampling the domain

There is a more significant objection, which starts with a concept from assessment theory: sampling and inferences. Let’s look at this image, which represents all the things we want our students to know about a subject (the domain):

[Image: the domain – black spots joined by lines]

The black spots are “things you want students to know” and the lines between them are “connections.” So one black spot might represent “sodium has one electron in its outer shell” and a line might take it to a spot which states “when metals form ions they lose electrons from their outer shell.” If I ask a student “what happens when sodium becomes an ion?” I am testing those two black spots and the line between them.

We want to know how many of these spots and lines students have in their heads, but we simply cannot test whether or not a particular student has learnt everything in the domain: the tests would be too long (2). So instead, we pick some specific things from the domain and test those:

[Image: the sampled spots from the domain, shown in green]

If a question asked: “what happens when sodium forms an ion?” we are testing two green spots and a connection between them, as above. The important bit comes next: the inference. If a student can adequately describe what happens when sodium forms an ion, I infer that if I had asked them about, say, what happens when magnesium forms an ion, they would have got that right too. If that inference is justified, then it is valid. An invalid inference might be “this student knows what happens when transition metals form ions,” because the rules are different. The rule that governs sodium is the same one that governs magnesium, but the one that governs transition metals is different, so that inference would be invalid. The image below shows the things I tested in green, and the things I infer the student knows in orange:

[Image: tested spots in green, inferred spots in orange]
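To make that inference concrete: the rule being generalised from sodium to magnesium is the same loss of outer-shell electrons, something like

$$\text{Na} \rightarrow \text{Na}^{+} + e^{-} \qquad\qquad \text{Mg} \rightarrow \text{Mg}^{2+} + 2e^{-}$$

(The half-equations are just my shorthand for the spots and lines above.)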

So the purpose of the test is not just to figure out whether a student knows the things on the test: it’s to figure out how much of the stuff that wasn’t on the test the student knows.

Assessment maestro Daniel Koretz puts it like this:

“most achievement tests are only small samples from much larger domains of achievement. For example, responses to the small sample of items on a mathematics test are used as a basis for inferences about mastery of the mathematics learned over one or many years of education.

In this way, a test is analogous to a political poll, in which the responses of a relatively small number of people are used to estimate the preferences of a far larger group of voters. In a poll, one samples individuals, while in a test, one samples behaviors from an individual’s larger repertoire, but the fundamental principle is the same: using the sample to estimate the larger set from which it is drawn. For this reason, scores are only meaningful to the extent that they justify an inference about the larger whole. That is, the validity of inferences depends on the extent to which performance on the tested sample generalizes to the much bigger, largely untested domain of achievement.”

This is also the key to understanding your mock. If a student got a particular question wrong, you don’t want to spend time in class going over that particular question. The whole point of that particular question is to give you an inference about something else – the extent to which the student understands the domain. Going over that question means you are going over the sample, when in reality you want to go over the domain. If 30 students get a question wrong, don’t go over that question. Go back to the domain that the question came from, and go over that.

There’s a simpler objection here too: it takes a long time to learn things. To my shame I only really realised this a couple of years ago, but without regular review and retrieval, things won’t get learned. So if students get something wrong in a test, going over that thing once in class is not a good call. Go back to the domain. Think about what their wrong answer tells you about the stuff that wasn’t in the test. Then do that. And do it again in a few lessons time. Add it into a retrieval roulette – whatever, just make sure you do it again.

Curriculum as a progression model

You may have heard people (including Ofsted) describe the curriculum as the progression model. The meaning of the phrase ties into what we were looking at earlier; as soon as your students start studying your domain, a long term outline might look like this:

[Image: planned progression through the domain over time]

Of course, it’s probably not as neat as that, and the way you have planned to cover your domain might look a bit more like this:

[Image: a messier, overlapping coverage of the domain]
Curriculum planning is messy because domains are messy. This is an over-neatification.

The point is, your students are progressing through the domain. Asking whether or not they are “making progress” right now is really asking: how much of what I wanted them to learn this year have they learnt? As I’ve argued before, you cannot make any sensible decisions about pedagogy, teaching and learning or assessment until you have defined your domain and thought about what progress through it looks like. The test that you give your students measures progress in the sense that you infer from it the extent to which they have progressed through the domain.

GCSE grades are not a progression model

You will often see schools grading students from year 7 by GCSE grades, so they might be a 2 in year 7, a 3 in year 8 and so on, all on a flight path to get a 6 at GCSE. This is wrong, because grades are based on the entire domain: you get to the end of GCSE, and this is what you are supposed to know. We find out how much of it you know as a percentage, compare you to other students and give you a grade. So in years 7-11 you aren’t progressing up the grades, because the grades only work when applied to the entire domain. Using a GCSE grade on any one of the circles above is ridiculous, because the grade is a measure of the whole thing, not of one of the smaller circles.

Question level analysis (RAGging)

In Koretz’s passage above, we saw that assessments are about inferring a student’s knowledge of the domain based on their performance on a sample of the domain. Whether or not you can “generalize” like this depends on other variables as well as student knowledge: the layout of the question, the amount of text, whether there is a supporting diagram, the stakes associated with the test, the time of day, whether or not the student slept or had breakfast or just got a message from their girlfriend or whatever. Those variables need to colour my inferences, and the fact that I don’t – and can’t – adequately assess their relative impact on the student means that I need to be very hesitant with my inferences. Triangulation with other data helps, longer tests help, tests designed by someone who knows what they are doing help, standardising the conditions helps, and so on and so forth.

The effect of these other variables is what makes Question Level Analysis (where you look at the marks students got question by question) a highly dodgy enterprise. If students got a question on electrolysis wrong and one on rates right, that doesn’t mean you can infer they know rates and don’t know electrolysis. It could be that the question on electrolysis was just a harder question, or that the mark scheme was more benevolent, or that its layout on the paper was worse or it had more words in it or whatever. It might be that the question on electrolysis is incredibly similar to one they did in class, but the one on rates was completely new. You just can’t really know, so going crazy at the question level doesn’t strike me as sensible.

A further curve-ball is the “just right” phenomenon. It often happens in science (don’t know about other subjects) that a student manages to get marks on a particular question, but only just. They’ve said something which is awardable, but to the expert teacher it’s pretty obvious that their understanding is not as strong as another student who also got all the marks on that question. This further destabilises QLA as an enterprise. Still, you get some pretty coloured spreadsheets, and your assistant head for KS4 will be really impressed with how you are going to “target your interventions.” (3)

Grade descriptors are silly

A further ramification of all of the above is about grade descriptors and worrying about things like AO1, AO2 and AO3. Let’s look at an example first.

In the reaction between zinc and copper sulphate, energy is released to the surroundings and the temperature increases. If you stick them in a beaker together with a thermometer you can record that increase.
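For reference, the reaction I have in mind here is the familiar exothermic displacement:

$$\text{Zn} + \text{CuSO}_4 \rightarrow \text{ZnSO}_4 + \text{Cu}$$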

A common question would be to suggest an improvement to this experiment, with the answer being to put a lid on the beaker. If student A gets it right and student B gets it wrong, what inference can be made about their wider knowledge? Can I assume, as per grade descriptors, that student A is better at this:

“critically evaluate and refine methodologies, and judge the validity of scientific conclusions”

GCSE Grade Descriptors

Obviously not. I can’t infer from their performance on this question that they would be better able to suggest an improvement to an experiment about rates, acids or electrolysis, because that’s just not how it works. Put graphically, if we look at the image below, the area in the red circle represents the topic “acids” and the area in the blue circle represents the topic “energy”:

[Image: two topic circles – acids (red) and energy (blue)]

Each topic might contain a spot (given in green) which is about an experiment: an experiment about acids (red circle) and an experiment about energy (blue circle). But if we only test the red experiment, we can’t infer that they know the blue one, because they are in totally separate domains. Sure, they have in common that they are “questions about evaluating and refining methodologies” but they also have in common that they are “questions made up of letters and words” – just because they have one property in common doesn’t mean you can infer knowledge of one from the other.

The same is true of stressing out over whether a question is AO1, 2 or 3. It may be the case that question 5 in the paper is AO2, but student A getting it right doesn’t necessarily mean they are “good at applying knowledge” and student B getting it wrong doesn’t necessarily mean they are “bad at applying their knowledge.” You can’t infer that, because you are essentially saying that if question 6 is also AO2, then student A is more likely to get it right than student B. That would be ridiculous, because it depends on the actual domain – the knowledge that is being tested – not other properties that questions 5 and 6 have in common. If question 5 is about acids and question 6 is about energy, then the students’ knowledge of energy is going to determine how well they can answer question 6, not their “ability to apply their knowledge of acids in unfamiliar contexts.”

[Image: questions 5 and 6 sitting in separate topic circles]
You can’t predict anything about a student’s response to question 6 based on their response to question 5. This is why feedback of “improve AO2 response” or “get better at applying your knowledge” isn’t meaningful.

What I do

As I’m marking student tests, I’ll look for glaring common errors. I address these on three levels:

1. This class, right now

Before I’ve given the papers back, I might spend a little time going over a couple of big issues. But I go back to the domain, so I don’t say “right, there was a question about sodium bonding with chlorine that we got wrong, so we’re going to do that again.” Instead, I go right back to when I first taught students about bonding and say “there was some confusion about how electrons are transferred in ionic bonding, so we are going to look at that again.” I teach a bit, give students some practice and then move on.

2. This class, in future

More importantly, I want to make sure this class have the opportunity in future to come back to this. So I might pencil in a retrieval starter based on electron transfer for next week, or build it into a worksheet they are going to be doing next lesson.

3. Future classes

This is probably the most important level. I’m going to be teaching this content to three new classes next year. What am I going to do now to make sure that Future Adam spends a bit more time on this next year, so Future Adam’s students know this domain better this time next year? This is part of the reason why SLOP is powerful: I edit and change the booklets as soon as I’ve taught and assessed a unit, so I can make them bigger and better for the next group (as well as being used later as revision for this group).

Some do’s and don’ts

I realise I got a bit carried away in this post, so here’s a summary. Even if you decide to do a don’t or don’t a do, I hope it’s a useful framework for thinking about assessments:

Don’t:

  • Spend hours in class looking at the sample rather than the domain
  • Go crazy looking at the marks students got for each individual question
  • Give GCSE grades for individual assessments
  • Use GCSE grades as a progression model
  • Infer anything about students’ general abilities like “evaluative thought”
  • Go crazy over the AO of a question

Do:

  • Think about what your students are thinking about (or not thinking about) when you are going over a test
  • Go back to the domain
  • Plan to revisit the domain
  • Be sensible about what you are inferring about your students’ mastery of the domain
  • Treat the curriculum as the progression model

 

There is a follow-up to this blog which looks at how you can actually implement these ideas here.


(1) see here for more on that

(2) According to assessment boss Deep, we’re miles away from even thinking about testing an entire domain. He recommends reading these:

Kingston, N. M., Broaddus, A., & Lao, H. (2015). Some Thoughts on “Using Learning Progressions to Design Vertical Scales that Support Coherent Inferences about Student Growth.” Measurement: Interdisciplinary Research and Perspectives, 13(3–4), 195–199.

Rupp, A. A., Templin, J., & Henson, R. A. (2010). Diagnostic measurement: Theory, methods, and applications. New York, NY: Guilford Press

Not only would they be too long, but they would also probably not actually be particularly valid. It’s conceivable that a brilliant science student has mastered all the content, but in a different order and structure to how another brilliant science student did. A test which aimed to test each part of their cognitive structure might not account for individual idiosyncrasies in schema construction.

(3) see here for more on this

 

All the SLOP you need

All the resources below are published absolutely free of charge. If you want to say thank you or support me in some other way, click here

Chemistry *and* physics booklets

I have finally finished updating my booklets. The page where I used to keep them got a bit messy so I’m starting a new one here. In terms of changes, I have fiddled with the sequencing on a bunch of them, fixed a load of typos and added a shed-load more practice. I’ve tried especially hard to add “interleaved problems”, which are really question sequences that tie together a lot of different topics. You can find most of these at the back of each booklet.

The booklets are also organised slightly differently as we have changed our order of teaching, and we now teach topics roughly in the same order as they appear in the AQA spec (though the material is common to all boards). I’ve covered basically the whole course; there are a couple of bits and pieces which I use other resources for, but almost everything is there. I’ve put some powerpoints there too. I don’t really use these much anymore, so can’t remember if they are any good, but they may help.

As ever, they are free to download and I hope you find them useful. All I ask in return is that you let me know if you spot an error or something that I could do differently. If you write answers, I’d appreciate those too if you are willing to share. I’m also very interested in knowing how people use them, so don’t hesitate to get in touch on that either. If you want to know how I use them, click here.

Please remember: these aren’t just for the students: they are for teachers too. I have tried to show how different units and topics can be broken down and sequenced logically. I’ve tried to show how you can build in techniques like retrieval and interleaving to promote better learning, and if you’re interested in more of that I suggest you check out #CogSciSci.

I do update them quite frequently, so I wouldn’t advise downloading them all in one go; instead, come back and re-download each time you need to teach something.

While you’re here hunting for resources, please feel free to have a look around at the rest of the site and if you are interested in science, teaching, curriculum, cognitive science and educational research do hit subscribe.

The fantastic @ChemMcDougall has been writing answers to the booklets and has very generously shared them here. Please make sure to thank her if you can.


Chemistry:

Unit 1 Atomic structure and the periodic table

History of atom + elements and compounds mastery answers

History of atom + elements and compounds mastery

History of the atom ppt

Unit 2 Bonding and structure

All bonding booklet answers

All bonding booklet

Covalent mastery

Groups in the periodic table (all)

Ions mastery ppt

Unit 3 Quantitative chemistry

Further quant answers

Further Quantitative Chemistry Mastery Booklet

Foundation quantitative chemistry booklet

Quantitative Chemistry Mastery Booklet answers

Quantitative Chemistry Mastery Booklet v3 with hints

Mixed practice answers courtesy of Claire Frere

Unit 4 Chemical changes

Chemical changes H

Unit 5 Energy changes

Energy changes answers

Energy Changes Mastery Booklet F

Energy Changes Mastery Booklet H

Energy changes mastery ppt

Unit 6 Rate and extent

Rates Mastery booklet answers

Rates of Reaction Mastery Booklet F

Rates of Reaction Mastery Booklet H

Reversible reactions mastery booklet answers

Reversible Reactions Mastery Booklet F

Reversible Reactions Mastery Booklet

Unit 7 Organic chemistry

Further Organic Chemistry Mastery Booklet answers (ongoing)

Further Organic Chemistry Mastery Booklet

Organic chemistry all ppt

Organic Chemistry Mastery Booklet ANS

Organic Chemistry Mastery Booklet F

Organic Chemistry Mastery Booklet

Unit 8 Chemical analysis

Analysis mastery

Chemical Analysis ppt

Further analysis mastery

Unit 9 The atmosphere

This one is a bit different as we have actually moved it into KS3, but the material is more or less the same so should still be helpful at GCSE.

Formation of the atmosphere worksheet

The Earth’s Atmosphere ppt

Unit 10 The Earth’s resources

The Earth’s Resources mastery ppt

Useful materials triple only mastery

Using materials triple only answers

Using resources mastery booklet answers

Using resources mastery booklet F

Using resources mastery booklet H

Required practicals booklet

See here


Physics:

The booklets below are adapted from Ruth Walker’s booklets, which are here.

6.1 TTA Energy


The generic and the disciplinary: finding a balance

Yesterday, I posted a blog arguing that “teaching and learning” is dead. It generated some really fascinating conversations online, and I wanted to pick up on something a couple of people raised: it may be the case that curriculum comes first, and that it dictates pedagogy. And it may be the case that in the past there have been some fairly questionable outcomes of non-specialists observing specialists. But:

  1. Are there not some things like retrieval practice which we know are generally good ideas which even a non-specialist would be able to identify?
  2. Is there really no value to observing a lesson outside of your subject?

I think these are completely legitimate challenges to my premise, so I wanted to try and express my thoughts here. I’m not going to be able to be exhaustive, but will try and cover what I think are the most important points. I’ve been lucky enough to work with colleagues who are not scientists and below are the things that I try to do and have picked up from others more experienced than I am.

Ground rules

First, I think it is important to establish ground rules, and I have three of these:

1. Show some humility

Observing outside your specialism requires humility. Do not go into that classroom thinking that you are the expert and that your poor beleaguered colleague is just desperate for the pearls of wisdom you have to offer them. You don’t know the subject, you probably don’t know the class, you don’t know what they studied yesterday, last week or last year and you don’t know the departmental priorities. Obviously, it is a good idea to have that conversation first and try and get a better picture, but ultimately you will be in a position of knowledge inferiority relative to your colleague. So show some humility.

2. Lower those stakes

If you are still using one-off lesson observations to give teachers a grade and feed into their performance management, you should stop doing that. Observations should be there for the formative benefit of both observer and observed, and it is worth looking at ways to make sure everyone is perfectly comfortable with the observation.

If you are looking to challenge school leaders around this (and you should) then I have a section here with some materials you may find helpful.

3. Ask the questions

I’ll go into this more below, but when you feed back you need to make sure you get your facts straight first. You might see something you don’t like at first, but then actually it makes perfect sense based on this subject, this teacher and this class – so make sure to ask and have an open conversation.

The general and the disciplinary

There are some things we know work, and that we know expert teachers do. Retrieval practice, explanations in short chunks and plentiful practice all spring to mind. How do these tie in with the approach that says curriculum and disciplinary knowledge come first?

I think the answer is that it might be the case that I could walk into a history lesson and see retrieval practice in action, but I wouldn’t really know if that retrieval practice was any good. I wouldn’t necessarily know if the questions and answers were correct or even relevant to the curriculum being studied or these students’ place within it. I wouldn’t be able to notice if there was some aspect of language that should have been picked up on or if the feedback given was correct. An expert teacher might also like to start making connections between different topics, but they would only start doing that at a certain point in their students’ learning journey. Would I be able to know when the correct point was? Probably not.

I normally try and chunk my explanations into small pieces, but there are some things which I do in a more extended fashion. When I teach the hydrogen fuel cell, my explanation often lasts around 40 minutes. There’s a good reason for that – a disciplinary reason – and most people who aren’t chemists, let alone those who aren’t science teachers at all, might not understand why. Similarly, when I am in my subject’s hinterland, I will often talk for a lot longer and use completely different techniques than when I am in my subject’s core. A non-specialist might not notice the difference, so as above it’s important that there’s a sensible conversation afterwards.

Practice is a really important one too. My fellow science teachers will be familiar with the fact that there are now thousands of resources and worksheets out there of highly variable quality. So it could well be that a non-specialist can observe and see that my students have lots of practice, but would they be able to judge the quality of the actual work I have set, let alone the quality of their responses? To my shame in the past I have given out many a terrible worksheet that at the time I thought was great. But they aren’t always tied to the material, don’t cause students to think about the right thing, don’t tie to previous material etc etc. A non-specialist simply can’t judge the quality of that. We could have a great conversation about it afterwards where I could explain my thinking, but without that the process is deeply flawed.

Sputnik Steve (an English teacher) once watched a video of one of my lessons for me and said that after my explanation I hadn’t properly checked for understanding. He was dead right: I should have done, and it was great feedback. But let’s say I had done, and had sampled a few student responses – would he have been able to tell how good those responses were? Would he have been able to say “actually Adam, the technical language they were using was all wrong and they were full of misconceptions”? I doubt it.

Is there really no value in observing a lesson outside your subject?

No, there is value. But you need to know your stuff, and you need to understand that you are in a position of knowledge deficit. Often, the conversation afterwards – when done properly – can be incredibly illuminating. Whilst I don’t think I benefited much from watching an entire lesson in Spanish, I know that I have benefited from watching many lessons outside of my subject and from being observed by non-specialists.

So there it is. I’m not in the business of seeking balance for the sake of it, and in this case I do think that the weight is clearly over to one side. So perhaps it isn’t balance that is needed, but nuance.

Please keep up the conversation, I’ve benefited immensely from it and I hope others have too.

Teaching and Learning is Dead

We’ve all been there: a formal observation by a non-specialist. Being told that our AfL was sub-par, that our activities weren’t engaging enough, that we hadn’t appropriately differentiated for SEN, EAL, PP, G&T, HPA, LPA etc etc.

It’s incredibly frustrating to be told by someone who doesn’t know your subject that you are teaching it wrong. How can it be that someone who knows nothing about covalent bonding can tell me that my teaching of it is sub-standard because I didn’t progress up Bloom’s taxonomy? How can it be that someone can judge my sixth form marking when they cannot decipher the symbols and equations on the page?

It has somehow become an orthodoxy that such a thing is possible. That a science teacher can assess the quality of an English teacher who can assess the quality of a maths teacher who can assess the quality of a PE teacher and so on and so forth. I remember once being told to go and observe an “outstanding” head of MFL. The entire lesson was in Spanish, a language of which I habla nada (?). But it’s fine, because I was there to observe teaching and learning: a universal set of practices that could be employed in any classroom in any subject to achieve rapid progress.

***

Ofsted has its part to play here. I don’t know the exact history or much care, but a culture has developed around observing teaching and learning. From an accountability perspective it makes sense: you need some way of going into a classroom for a short period of time and saying whether or not the teaching and learning is effective. So you invent some kind of proxy that makes it easy to judge the quality of a teacher in twenty minutes, and the proxy has to be obvious and observable. So you end up with three part lessons, mini-plenaries every 90 seconds with whiteboards, traffic lights and thumbs up and down and students wandering around the class hunting for information so that you can see by their shining faces just how engaged they are.

***

I remember, when I was training, going to visit a local school with a brilliant reputation for teaching and learning. I sat in on a staff briefing where a couple of hip young teachers had this thing where they went to the local pound shop and bought some piece of junk: a tupperware or a furry dice or whatever. They pulled a name out of a hat and that member of staff had to take the junk and use it in their lesson at some point that week. After this, the previous week’s lucky winner had to explain, in front of all the staff, how they had used a purple macintosh as part of their year 12 Latin lesson.

It’s all a good laugh, sure. Staff bonding and all that. But it completely ignores something that’s pretty damn important: the actual subject. It says “right, how am I going to use this resource in my teaching?” instead of “ok which resources are most appropriate to my teaching?”

On the surface it sounds great, and it makes a staff feel like there is a real “buzz” around teaching and learning. But the buzz rings hollow, devoid as it is of any actual substance.

***

Like many others, my thinking on this has changed radically over the past few years. It’s only recently that one of my performance management targets was to embed more group work into my lessons, as if it were some kind of universal Good that would improve any of my lessons, regardless of the actual content. But I’ve been lucky enough to read some inspirational writers on the topic and have my thinking jarringly challenged. I’ve come to believe that the phrase teaching and learning simply won’t do any more. It carries too much generic baggage: too much of the tick box culture which has allowed non-specialists to tell me that I’m not teaching science properly. Most importantly, it starts from the wrong place. It starts with the teaching, not the content. What is the point of talking about teaching unless you are talking about content?

***

Frederick Reif has a great graphic in his book which looks a bit like this:

[Image: Student (initial) and Student (final)]

You start with two states: Student (initial) and Student (final). The change from your student as they are at the beginning (initially) to the student as they are at the end (finally) is called learning:

[Image: learning as the change from Student (initial) to Student (final)]

And, somewhat obviously, that learning is as a result of teaching: from a teacher, from a book, from a life experience – it doesn’t matter:

[Image: teaching driving the learning from Student (initial) to Student (final)]

But here’s the thing: the T bit of the diagram is the least interesting part of the process. Even the “learning” part of the diagram isn’t the most interesting, because that’s just a journey.

The interesting part is not the teaching or the learning, it is the difference between our initial student and our final student. What have they learned? How are they different now to before? To me, that’s what’s really interesting.

***

I talk here of course about curriculum: education buzzword of 2019 and beyond. The substance of what is to be taught. That which inheres in Student (final) but does not in Student (initial). The teaching and the learning are important, but they are tools, processes which are subservient to the curriculum. We do need to talk about them, but only insofar as we optimise them to deliver the curriculum.

***

Ofsted’s draft handbook for September 2019 is a radical document steeped in this kind of thinking. To me, just as important as what is in the document is that which is not in the document. I talk here of the old heading “teaching and learning” or “teaching, learning and assessment;” phrases which have been excised from the 2019 draft. Instead of teaching and learning, we have a broad heading of “quality of education,” which splits into subsections. The one that interests us here is “curriculum implementation.” To me, this phrase rings the death knell of Teaching and Learning, because it’s so much better. Sure, it doesn’t sound as snappy and being Assistant Head with Responsibility for Curriculum Implementation probably won’t get you many followers on twitter, but it’s more correct. Because it puts the curriculum first. It says right, this is our curriculum, this is the difference between Student (initial) and Student (final): how are we going to implement it? How are we going to make sure our students make that learning journey?

The answer to these questions by definition is tied to the curriculum. I am implementing a science curriculum, and the way I do that is different to how my colleagues implementing their history or English or maths curriculum will do it. Because Student (final) is different in my subject to theirs. It’s all in the curriculum, and how I implement it.

You can’t come into my classroom as a Spanish teacher and tell me I’m implementing my curriculum wrong, because you don’t know my curriculum. And Lord knows I don’t know yours, so I’m sure as infierno not going to come into your class and tell you that you’re implementing your curriculum wrong. I wouldn’t have the faintest clue.

Schools will need to change. It isn’t good enough to rename the Assistant Head for T&L as the Assistant Head for Curriculum Implementation. Schools will need to mine the knowledge of their subject experts in a bid to understand what progress looks like in those subjects. To understand what a science, geography or D&T curriculum is, and how its implementation is carried out. To clarify the difference between Student (initial) and Student (final) in each and every subject which that student is immersed in.

***

I think teaching and learning is dead. I suspect that teaching and learning doesn’t know it’s dead, and I suspect that it will stagger on for many years to come. But it’s certainly time for it to be retired, for that generic chapter in our sector’s story to be closed.

Teaching and learning is dead: long live curriculum implementation!

 

You can find a follow-up to this blog here, which deals with what a non-specialist can do in an observation.


My thinking on this has been greatly influenced by Christine Counsell and you can find more of her writing on the topic here. I also have a list of things on curriculum to read here and I recommend Stuart Lock’s blog here. You may also be interested in the recent symposium on curriculum in science here.

Modelling Curricular Thinking: Inspired by Ben Ranson

I was just settling in for a well-earned evening playing video games on my laptop when I saw this thread by Ben Ranson:

The reason Ben’s thread is important is that it models curricular thinking. Most of us (including myself) are not trained to think deeply about curriculum, and in this bright new era of curriculum, curriculum, curriculum, we need models – like the one Ben has provided – to stimulate our own thinking and provoke us to re-evaluate the what, why and how of our subject. The more models we have, the greater the chance that this new era really does shine bright as schools and leaders take ownership of their curriculum instead of outsourcing their thinking to consultants, or drowning it in endless generic bureaucracy.

I therefore wanted to share some of our thinking too. I’ve written a lot about our new Key Stage 3 curriculum, but what I want to do here is give one example of how we have thought about the actual disciplinary substance of our curriculum and how we have thought about the sequencing of core concepts. Throughout I will try to use terms from Ruth’s blog on the language of curriculum, which I recommend you read, and will signal them by putting each term in bold.

The problem: where to start?

Where do you start with year 7? You know you want to teach them biology, chemistry and physics, but which comes first? And which topics from within those broader disciplines do you introduce first?

The easy one: chemistry

Chemistry is probably the easiest subject in which to figure out what to start with. That’s because even though chemistry can branch off pretty rapidly into horizontal areas, everything in chemistry stems vertically from a set of fundamental principles: atoms, elements, compounds, molecules, conservation of mass and equations. Without those, you can’t study anything in chemistry. So even though you could certainly study rates of reaction without studying fractional distillation, you would be hard pressed to study either without those fundamentals.

[Image: chemistry fundamentals at the base, with rates of reaction, organic chemistry and acids/alkalis branching off]
Theoretically, you can teach rates of reaction, organic chemistry and acids/alkalis independently of each other. You wouldn’t actually do this, but you could. But you can’t do any of them before students have the fundamentals pat.

Actually figuring out what to include was the tough part, as many standard textbooks and schemes of work use confusing and self-referential definitions like:

Atom: the smallest part of an element that can exist

Element: a substance made of one type of atom

Which doesn’t really help anyone. You can see how we have built our unit in its entirety here.

Getting tougher: biology

Biology tends to be less structured in that way. Topics obviously do relate to each other, like interdependence and adaptation, but there isn’t really a central pillar that everything rests on: the fundamentals of chemistry can be thought of as the basic grammar of the subject – the way that scientists talk about chemistry is through that language – but biology doesn’t quite have the same thing. We decided to go with cells for a few reasons:

  • The language of structure and function is definitely part of the grammar of biology
  • It’s quite mind-blowing
  • It has a strong hinterland in terms of the history of the microscope and the way we think about life (see here for more)
  • It foreshadows many other topics like organ systems, circulation, digestion etc etc.

The downside is the sheer amount of content in the topic. In order to do it properly, we wanted students to study five specialised cells in detail as well as an entire organ system and how a microscope works. That represents quite a lot of material, so we didn’t really want it to be the first unit year 7 studied: we wanted to get them into good habits of memory like regular retrieval practice before throwing an enormous amount of declarative knowledge at them.

Physics – my head hurts

Physics is even harder to figure out, because even though the topics link to a massive extent, there has to have been some serious epistemic ascent before the pieces really start to come together. This is partly because physicists understand the world through many different lenses and perspectives, depending on the problem at hand. If even undergraduate physicists can struggle to untangle and properly assimilate topics like forces and energy, we need to be very careful about how we sequence the knowledge. Probably the most common plank between topics in physics is mathematisation and the use of formulae, with another contender being the concept of using abstract models to understand the world.

We decided to go with energy stores and transfers first. The reasons for this are:

  • It is an archetypal way to demonstrate how physicists use abstract models to interpret the world
  • Even though the content is abstract we could use a lot of concrete examples
  • We thought we could tie it into the greatest number of future units
  • Even once students have “got” the concept, using the correct language takes time, so we could revisit it regularly
  • It can build up to vitally important topics like renewables and energy economies
  • We can use P=E/t to introduce formulae (a quick worked example follows this list)
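As a sketch of what I mean by introducing formulae this way (the numbers are invented for illustration):

$$P = \frac{E}{t} \quad\Rightarrow\quad P = \frac{3000\ \text{J}}{60\ \text{s}} = 50\ \text{W}$$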

We built the unit to cover the stores and transfers model fully and lead up to a few exemplars: the coal power plant, wind turbines and hydroelectric. And we drilled the stores and transfers so well, that by the time we got to the power plants it wasn’t a confusing mess like they normally are when you teach them, but a really straightforward application of knowledge students were already fluent in.

We also didn’t want this physics unit to come first. We felt it would be a really tough unit for the students, and teachers should be at the point with their classes where they know them well and are more adept at supporting them through it.

So which comes first?

Following the logic above, we have chemistry fundamentals first, and then either cells or energy stores. We decided to do stores first, based on the following logic:

  • When teaching cells, you have to teach organelles
  • “Organelles” includes mitochondria and chloroplasts
  • So you also have to mention respiration and photosynthesis
  • To understand these properly, you need to understand chemical reactions and energy transfers
  • In the past, these had appeared as bounded – definitions that students just had to learn because they had no proper knowledge of reactions or energy
  • If we did the chemistry and physics first, these central planks of biology could be properly understood

An added bonus of doing chemistry then physics was that when we got to the coal power plant, we could talk about combustion properly. We could do proper equations and show students how the chemistry finds itself in the physics. We could then build up to respiration and photosynthesis in the same light: reactions that are either analogous or the complete reverse. This would be crucial for our students’ schema development and understanding of the links between the topics.
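Spelled out (these are the standard equations, not anything exotic to our scheme), the family of reactions looks like this:

$$\text{C} + \text{O}_2 \rightarrow \text{CO}_2 \quad \text{(combustion of the carbon in coal)}$$
$$\text{C}_6\text{H}_{12}\text{O}_6 + 6\text{O}_2 \rightarrow 6\text{CO}_2 + 6\text{H}_2\text{O} \quad \text{(respiration)}$$
$$6\text{CO}_2 + 6\text{H}_2\text{O} \rightarrow \text{C}_6\text{H}_{12}\text{O}_6 + 6\text{O}_2 \quad \text{(photosynthesis, the reverse)}$$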

This leaves us with a sequence that looks like this:

  • Chemistry fundamentals, including proper understanding of equations
  • Energy stores, including proper understanding of combustion in physics
  • Cells, including proper understanding of photosynthesis and respiration

That’s it for now. Would love to hear your thoughts, and would especially love to see other models like this across the subjects.

Core and hinterland: What’s what and why it matters

In 1918, the Nobel Prize for Chemistry was awarded to a war criminal.

In the early years of the 20th century, German scientist Fritz Haber developed a process to artificially synthesise ammonia, a vital component of agricultural fertilisers. A reaction that changed the world, his process drove a ballooning in industrial agriculture and, with the fullness of time, allowed for a population explosion and the pulling of billions of people out of poverty.

But Haber’s oeuvre extended from the globally beneficial to the sinister. A fervent nationalist, in World War I he turned his brilliance to the German war effort and pioneered the use of chemical weaponry on the battlefield, personally supervising the first administration of deadly chlorine gas in the trenches of Flanders.

Despite these contributions to the Fatherland, Haber was forced to leave Germany because he had Jewish ancestry: an ancestry he despised. In a grimly ironic turn of historical events, the laboratory which he had headed went on to be instrumental in the production of the chemical Zyklon B, a chemical used by Hitler’s SS to murder hundreds of thousands of Haber’s own people in the gas chambers at Auschwitz.

***

The Haber Process has been on the GCSE Chemistry curriculum for many years and every year when I teach the process, I tell Haber’s story. As a Jew and as a teacher of science, it’s important to me and it serves as the most extreme of cautionary tales about the role of science in modern society. But I don’t expect my students to remember its details. I don’t have a knowledge organiser chronicling its events, or expanding the discussion to Haber’s tragic family life and the suicides of his wife and son. There are no assessment questions in our end of unit test asking students to evaluate the significance of science’s contribution to World War I. I set no drill questions on the viability of gas as a weapon of war.

***

When discussing curriculum, Christine Counsell presents a paradox at the heart of curricular choice: on the one hand, we all know that there is content which we wish students to remember, and by contrast content we cover in class which we don’t deem necessary for them to remember. This would ordinarily lead us to de-emphasise the latter in favour of the former. The other hand of our paradox though is that without that material, without the “stuff we don’t need our students to remember,” our curriculum becomes denuded of wider meaning and majesty: it ceases to be one thread of the epic story of humanity and becomes a sterile and sanitised exam-ready product.

To aid us in thinking about this paradox, Counsell posits the use of two terms: core and hinterland. Such terms are not carved-in-stone categories, delivered by God to Moses at Sinai. They are intellectual devices which should serve as a prompt for us to reflect on, and clarify, our curricular decision-making.

I think of core as the stuff I want my students to remember and to stick in their long term memories: all the details and propositions that make up the cognitive architecture of a creative and innovative scientist. Hinterland is how I frame that knowledge: the stories I tell and the examples I use. It’s the ground from which the core springs. So in my example, the core includes the equations for the Haber process, the effects of changing conditions on the equilibrium, the idea of a compromise condition and so on and so forth. The tale is how I frame it. Haber’s process is core, Haber’s story is hinterland.
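For completeness, the equation sitting at the centre of that core content is the familiar one:

$$\text{N}_2(g) + 3\text{H}_2(g) \rightleftharpoons 2\text{NH}_3(g)$$

with the forward reaction being exothermic.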

***

This is an incredibly powerful distinction to make when thinking about the science curriculum, and indeed any curriculum. When we are teaching, what is it specifically that we want students to remember, and what is it specifically that we use to make it memorable? I can imagine this question cropping up across the sciences and across the key stages, and I would like to try and explain why ignorance of it may be responsible for some curriculum faux pas.

We know that an exam specification is not a curriculum, but let’s take a fairly common section from the GCSE syllabus: the history of the atom. The spec goes into quite a bit of detail about the different models of the atom proposed by various scientists, and the experiments that moved them from one model to the next. In the AQA syllabus the entire section is prefaced by:

New experimental evidence may lead to a scientific model being changed or replaced.

Ah! To me, that’s core. An incredibly important point, perhaps the most fundamental concept in the disciplinary substance of science. Evidence is king, and we are but servants before it. Beautiful.

But I would argue that using the history of the atom to illustrate it is hinterland. Is it really important that GCSE students know about how alpha scattering proves that the plum pudding model is wrong? Or that they know the names of Niels Bohr and James Chadwick specifically? Why these chemists? Why are these specifically the only named chemists in the entire specification? Are these the best examples to discuss how experimental evidence leads to scientific models being changed or replaced? What about phlogiston and the conservation of mass? What about Grecian Classical Elements? What about the now-excised-from-the-curriculum theories of continental drift or land bridges? What is the core here, and what is the hinterland?

Further, there is perhaps even an incoherence at play here as we compare the following statements:

The results from the alpha particle scattering experiment led to the
conclusion that the mass of an atom was concentrated at the centre
(nucleus) and that the nucleus was charged. This nuclear model
replaced the plum pudding model.

Niels Bohr adapted the nuclear model by suggesting that electrons
orbit the nucleus at specific distances. The theoretical calculations of
Bohr agreed with experimental observations.

It looks to me like students need to know the details of the alpha particle scattering experiment in a way that they don’t for Bohr’s work. But why? Why do they only need to know that his calculations agreed with observations, a fact that is true, but surely not particularly exciting, powerful or far-reaching? The same is true of later developments: students are just expected to know that they occurred, but not why or how those developments came about:

The experimental work of James Chadwick provided the evidence to
show the existence of neutrons within the nucleus. This was about
20 years after the nucleus became an accepted scientific idea.

If we were writing from scratch, and we had core and hinterland in mind, would we make these curricular decisions? I’m not convinced we would.

Let’s look at another contender for hinterland: “real world” applications of scientific principles. The extraction of aluminium is probably a good one: we expect students to learn that, in the industrial electrolysis of aluminium oxide, the graphite anodes need to be replaced by factory owners because they react with the oxygen by-product. Honestly, who cares? Does it really matter? Do GCSE students really need to know it? Is it really core? I can certainly imagine that there is a chemistry teacher out there who used to work in that industry, and probably tells their class stories of how they needed to replace the graphite electrodes. I have no doubt that I would enjoy being in that teacher’s class and that the hinterland they prepared was fertile ground for the planting of more fundamental and further-reaching ideas. But I am not that teacher, and that hinterland is not the land I would choose at the best of times: and now I am being tasked with calling it core.
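For any non-chemists reading, the detail in question amounts to this (the standard GCSE treatment, nothing more exotic than that):

At the cathode: Al³⁺ + 3e⁻ → Al

At the anode: 2O²⁻ → O₂ + 4e⁻

And then C + O₂ → CO₂, which is why the graphite anodes slowly burn away and need replacing.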

To me, this represents a deficit in our curricular thinking – a failure to appreciate a vital distinction. Briefly, I would like to discuss two further ramifications of this distinction: scientific “application”, and how pedagogy changes depending on whether the material to be taught is core or hinterland.

Application

It would be quite easy to think of “application” questions as hinterland. For example, if a student is asked why the mass of magnesium increases when it is reacted with oxygen, it would be straightforward to think that the core knowledge here is “the law of conservation of mass,” with my hinterland being “isn’t this a cool example? The metal’s mass actually goes up when I burn it into this crumbly white powder!”
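To make the arithmetic behind that example concrete (using the usual rounded relative atomic masses of 24 for magnesium and 16 for oxygen):

2Mg + O₂ → 2MgO

48 g of magnesium + 32 g of oxygen → 80 g of magnesium oxide

The solid gets heavier not because mass has been created, but because oxygen atoms from the air are now chemically bound into it.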

I’m sort of in two minds about this. On the one hand, sure: it’s just one example of a principle that works across the whole of chemistry. On the other hand, can you learn that principle without examples? Is the example more than just an embellishment: a concrete scaffold, a fixed point that guides us towards the broader principle? If that scaffold were taken away, would the core concept still be retained? I don’t know, but we decided in our KS3 curriculum that it would be a core idea, and we expect students to memorise how and why the mass of magnesium changes. More than that, we use it as a “canonical example” in our How Science Works unit: because students know the principle well, it allows us to use it as hinterland in our discussion of theories, evidence and scientific conclusions. Yesterday’s core is today’s hinterland.

I’m not sure there’s a ready answer to this, but it underlines the need for us as a disciplinary community to discuss and establish the parameters and justifications for our curricular decisions.

Pedagogy

Looking back, I find that my classroom craft changes depending on if I am teaching core or hinterland. Core is always straightforward: I break a topic up into small pieces, I use a lot of boardwork, ask a lot of questions and students do a lot of practice. I don’t move around the class too much, I try not to be too dramatic and try not to vary my intonation and speech patterns. I use technical but unembellished and prosaic language.

Quite the opposite tends to be the case when I walk in the hinterland. I move around the class more, I become physically animated and visibly excited. I vary my intonation and use poetic and emotive language. I can often talk for a long time without pause, without asking questions and without students taking notes or doing drill questions. I draw on my personal feelings and experiences in a way that I rarely do in my other interactions with students; I give just a little bit more of myself. When discussing Haber I talk about the pain I felt as a chemist when I stood in the gas chambers at Majdanek and saw the vivid blue stains on the walls, knowing them to be “Prussian blue,” a characteristic residue of cyanide-containing compounds. Standing in the hinterland is just…different. I know it’s different, the students know it’s different, and it serves to thoroughly underscore – to weave into the very fabric of our education system – that curriculum must precede pedagogy.

***

Core and hinterland are not fixed, firm categories. As above, what may be hinterland one day may be core another day. Haber’s story might be hinterland in science, but could be core in History, or in the History of Science. They aren’t bounded terms with indisputable meanings and parameters. They are tools for reflection and deeper curricular thinking. If we turn them into non-negotiables in curricular planning, and demand a central document for every faculty detailing which concept is which, we will have missed the point.

I’ve tried above to show how these tools have influenced my thinking, how they have pushed me to consider my content and my teaching in a different light. I think that’s the spirit in which they should be used.

Much of the above is exploratory and, as I have said before, I am just starting out in my journey of thinking deeply about curriculum. But I doubt I’m alone. We need, as a subject community, to discuss this. We need, as a subject community, to utilise the tension and the paradox to grow intellectually, to sharpen our discourse and to reflect more meaningfully on what is indubitably the most important part of our teaching: the curriculum.

***

The above is my contribution to the Curriculum in Science Symposium. It is our biggest symposium yet, and I encourage you (scientist or no) to read the other contributions, links to which can be found here.

I am incredibly grateful to Christine Counsell for providing me with feedback and guidance in writing the above. She is truly a force for good in our education system, and I am glad that the winds are finally beginning to shift in the direction she has advocated for many years.

 

The Molecular Biology of a PGCE Course – Dr Andrew Carroll

Below is Dr Andrew Carroll’s contribution to the Curriculum in Science Symposium. See here for the introduction to the symposium and links to other contributions.

In this brief paper I will attempt to illustrate how, in my role as a PGCE tutor, I have structured a PGCE course around two central ideas:

  • Learning to teach does involve the acquisition of a specific type of knowledge.
  • The knowledge for teaching can be regarded as being organised into ‘schema’ which are adaptable.

I completed my PGCE in 1996, so of course I was introduced to the ‘holy triptych’ of Bruner, Vygotsky & Piaget. I do not regret this experience. Amongst the things I remember most from those heady days is the concept of the ‘schema’.

My belief in the usefulness of the idea has been galvanised quite recently by reading F.C. Bartlett’s book ‘Remembering’ for the first time, a copy of which I found serendipitously abandoned in a skip:

“Circumstances which arouse memory orientations, whether they occur in the laboratory or in everyday life, always set up an attitude that is primarily towards a particular ‘schematic’ organisation.” (Bartlett, 1932, p. 313)

Ghosh and Gilboa (2014) offer a fascinating history of the development of the idea of mental schema, suggesting that any schema must conform to four necessary factors.

To support my argument that the knowledge for teaching can be regarded as schematic, I will consider each of Ghosh and Gilboa’s necessary factors in the context of a PGCE course.

Associative Network Structure.

I would suggest that the knowledge required to teach science is made up of interrelated units: the content, pedagogical and contextual knowledge required to teach effectively. I will go on to present a metaphor I use with PGCE biology students, which also explains the ‘molecular biology’ of my title.

Has a basis of multiple episodes.

The majority of learning on a PGCE course happens through experience. The importance placed upon ‘reflection’ on/in experience in ITE could be interpreted as searching for commonalities across events. As Bartlett (1932) suggests, “the past acts as an organized mass rather than as a group of elements each of which retains its specific character” (cited in Ghosh and Gilboa, 2014).

Lack of unit detail.

If a schema is based on multiple episodes, then it follows that generalisation will be required. It is interesting to consider this in relation to teaching science, where in terms of content knowledge, an awful lot of unit detail is required. I believe this to be the liminal space where content and pedagogy collide.

Adaptability.

I believe there is no doubting that if there is a ‘schema’ for teaching science then it is adaptable. On a timescale ranging from the sub-minute to years, teachers rely on schema which are constantly being adapted. I believe that adaptive schema provide the mechanism for assessment for learning and responsive teaching.

Of course, the notion of ‘schema’ can only ever be a metaphor; we will perhaps never know ‘what is really going on’, as a human forms memories and acquires knowledge and understanding in response to environmental stimuli.

More recently, I think @olicav has provided one of the best visual metaphors for the role of schema in the classroom.

ac1

As Lakoff and Johnson proposed: “…metaphors allow us to understand one domain of experience in terms of another.” (Lakoff and Johnson, 2008, p. 117)

The metaphor is omnipresent in teaching, used constantly in the classroom and in thinking about demystifying how learning happens. The meta-metaphor, perhaps?

I have found the concept of Pedagogical Content Knowledge (PCK) (Shulman 1986) to be a useful framework through which to view the schema metaphor. A recent post by @HFletcherWood, commenting on Loewenberg Ball et al.’s (2008) refinement of Shulman’s original definition, emphasises the importance of teachers having “the most effective representations to teach an idea”, requiring “careful selection from a good stock of metaphors, models and images.”

I believe it is helpful for students to conceptualise their emerging practice as a form of knowledge as opposed to a skill, building their “good stock” of pedagogical content knowledge through practice, feedback and the criticality arising from scholarly activity. It perhaps negates the ‘cult of the personality’, soothing under-confident student teachers in their “I’ll never be like Ms X down the corridor” moments.

It is the teacher’s role to help their pupils acquire a schema: both the pieces of information within it and the connections between them. It goes without saying, then, that the teacher’s own schema must be secure. To help visualise a schema for teaching, consider the elephants (in the room) below as representing how PGCE students might think about the development of their own schema for teaching.

Novice: ac2
Expert: ac3

 

In thinking about a curriculum which, in part, aims to support PGCE students in acquiring PCK schema, it has been helpful to think about the following three intentions.

  • Ensure students know where the ‘dots’, which constitute the body of knowledge, are.
  • Guide them to forge links between the ‘dots’.
  • Assure them that, with time, they will be able to ‘fill in the spaces’.

With Biology PGCE students I have found it helpful to evoke an analogy with the central dogma of molecular biology: DNA to RNA to protein. Students are asked to consider their emerging PCK schema as a eukaryotic cell, where the core CK (DNA) is transported out of its domain to be modified and processed by knowledge from within other ‘domains’ (RNAs), its structure changed to provide the functional knowledge for teaching (protein). I believe the core CK is adapted through knowledge of the pedagogy, philosophy, history and nature of science itself, alongside knowledge of the pupils, how they learn and the context in which they are learning. Students need to know through which domains, and how, their CK will be transcribed and translated.

ac4

For the sake of brevity, and to avoid overstretching a metaphor, what follows is a description of how I have constructed a curriculum based on this model: a curriculum which needs to work for students through lectures, assignments and classroom practice. I am going to concentrate on:

  • The content knowledge
  • The ‘narrative’ of that knowledge

The ‘excavation’ of pre-existing content knowledge schema

It is interesting to note that along the spectrum of theoretical opinion on how science is best taught, from constructivism to explicit instruction, the importance of ‘prior knowledge’ or ‘existing schema’ is prominent. The duration of the PGCE would not allow for the ground-up construction of science knowledge schema. I believe the job of a PGCE curriculum is to gently excavate the pre-existing schema. We also have to recognise that prospective science teachers come from a wide variety of degree backgrounds and experiences. The ‘archaeological’ re-revealing of the long-lost schema for ‘photosynthesis’, for example, is going to be different for every student. This is where a knowledge-based curriculum delivered pedagogically differs from its andragogically delivered relative: we hand the re-learning over to the learner, but not entirely.

As with most PGCE providers, establishing students’ content knowledge happens at the interview stage, which, in my context, involves an MCQ test. Aligned to current specifications, and designed also to diagnose preconceptions and misconceptions, it does end with the inevitable ‘action plan’. In supporting students in this pre-course phase, indirectly fulfilling the requirements of their action plans, I recommend the @Cam_Assessment archive of examination papers. The question below, taken from a 1974 A level paper, has proved to be a very telling one for prospective biology teachers.

Source: http://www.cambridgeassessment.org.uk/Images/1974j-biology-alevel-questionpaper.pdf.

I believe it is important always to focus the discussion after an observed lesson on the emerging PCK and not on the personality of the student teacher. I have found that pre-lesson planning work with students can also be helped by asking them, prior to thinking about activities, to consider and describe the ‘narrative’ of the knowledge and how it will unfold before the class as they teach.

The narrative of the knowledge

I do find this to be the trickiest part of my curriculum design. It requires students to have knowledge and understanding beyond the scientific domain with which they are familiar. They need to have a knowledge of the knowledge as it appears in the national curriculum and beyond. The majority of this learning happens through preparing PGCE students to write at master’s level. Depending on their prior experience, students often struggle with the language of the social sciences, philosophy, and the history of science and education. This is recognised in our curriculum, where, I have to admit, I often slip into a more pedagogical approach: we learn unencountered definitions and then explore the concepts in relation to classroom practice.

As an example, students are encouraged to understand the nature of the knowledge they will be working with as teachers. We consider definitions of different types of knowledge. I have found Winch (2017) useful for students thinking about their ‘applied subject knowledge’, where it is suggested that it is important for a teacher to question the omniscience of their subject knowledge.

“Pictorially the difference is between seeing the subject as a room from the ceiling downwards on the one hand (as a putative expert) and opening a door onto the room on the other (as a novice aspiring to greater expertise)” (Winch, 2017, pp. 80-81).

Introducing PGCE students to this form of philosophical enquiry, I believe, goes some way towards helping them “fill in the gaps”, incorporating their CK into PCK.

To summarise:

As we will probably never really understand what happens when teaching results in learning:

  • The empirically and philosophically endorsed concept of schema is a useful one for PGCE students
  • Although it has limits when considering PCK.
  • I am suggesting a teacher’s ‘knowledge’ should have subject knowledge as its template (the DNA)
  • A template upon which is built a knowledge of teaching and of the pupils being taught.

References:

Bartlett, F.C., 1932. Remembering: A study in experimental and social psychology. Cambridge: Cambridge University Press.

Ghosh, V.E. and Gilboa, A., 2014. What is a memory schema? A historical perspective on current neuroscience literature. Neuropsychologia, 53, pp. 104-114.

Lakoff, G. and Johnson, M., 2008. Metaphors we live by. University of Chicago Press.

Loewenberg Ball, D., Thames, M.H. and Phelps, G., 2008. Content knowledge for teaching: What makes it special? Journal of Teacher Education, 59(5), pp. 389-407.

Shulman, L.S., 1986. Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), pp. 4-14.

Winch, C., 2017. Teachers’ Know-How: A Philosophical Investigation. John Wiley & Sons.


CPD

I would be more than happy to come and join you to deliver CPD. I run sessions either for whole-school audiences or for science departments specifically. I can present on a number of topics including:

  • What makes expert teaching?
  • Cognitive science
  • Evidence-based practice

If you have something specific that you would like training on, I can normally prepare something tailored to your needs.

If you are interested, please get in touch with me at adamboxer1@gmail.com

Feedback from attended sessions:

ASE Conference:

“Utterly brilliant. Packed full of insight.”
“Outstanding food for thought! Great style, engaging presentation. Very useful, will be in touch.”
“Brilliant talk and some excellent ideas put forward!”
“Adam is an excellent speaker so the talk was very engaging.”

TeachFirst Summer Institute:

How participants rated the usefulness of your session: 4.67/5
How participants rated the quality of the facilitation: 4.83/5

Richard Thompson – Trust Director of Science (West Norfolk Academies Trust):

I will always be grateful to Adam for his inspiring talk on Knowledge Rich Curriculum, which acted as the launchpad for a significant change of approach for science across our trust schools. Every single one of the 30+ Science teachers from across the trust enjoyed his presentation but most importantly understood why change was needed.

Frances Cator, Head of Science at the Dover Federation of the Arts:

Amazing to host the South East coast CPD with Adam Boxer at Astor College. Fantastic research-led ideas, strategies and resources. Adam’s ability to make this wealth of research accessible to all is only one of his strengths. An engaging and productive CPD, truly inspiring.

Pixl:

“One of his greatest strengths is his ability to simplify complex ideas and uses classroom examples to illustrate how things should look in the classroom. His session was incredibly tangible and gave me not only strategies to use, but why they should be used and also the resources! I know from recent discussion that his session has not only changed practice in my school but also across our academy trust of over 30 secondary schools”

Hodo Isse, Head of Science, UCL Academy

The expert teaching session provided us with all the tools needed to embed evidence informed practice into our day to day teaching. It also left us feeling inspired and empowered!! It summarises years of research and jargon into simple practical solutions to overcome cognitive overload and  improve student outcomes.  It’s an amazing learning experience for all staff regardless of experience.

 

