“Don’t worry, they always get that question wrong, everyone struggles with this test.”
I recently found myself saying this in conversation with a colleague in my department. People had been making comments like this about this particular test for as many years as students had been sitting it. Reflecting on this conversation in the light of what I’ve been learning about assessment through the Assessment Lead Programme from Evidence Based Education, I realised that this wasn’t a conversation I should be having. If different cohorts of students consistently find an assessment challenging and make the same mistakes year after year, this should not be dismissed as being down to a difficult topic or a challenging test; it should cause me to ask questions. Why are students finding the test so difficult? What is preventing them from answering the questions correctly? Is the test actually assessing what I need it to? Do I need to alter something in the way the material is being taught?
Learning is all about developing the novice schema our students bring into the classroom towards the expert schema that we have developed through our more extensive study of our subject disciplines. We know that learning takes place in the context of what we already know (Willingham, 2010), so the ability to appropriately and accurately assess the development of our students’ schema is crucial, and is something to which I have not given nearly enough thought. I have been spending a lot of time thinking about lesson planning and delivery, including regular assessment of key knowledge through retrieval quizzes, and being more deliberate in recapping prior learning at appropriate points to boost knowledge retention and encourage students to build links between topic areas. But I have blithely continued to use the same types of assessments that I have been using for years. For GCSE students these are tests made up of past exam questions. Is this really the most appropriate form of assessment for these students?
Here I reflect on an example of how I have used assessments which I now consider to be unfit for purpose and how I think they could be improved. The topic in question is Structure, bonding and the properties of matter from GCSE Chemistry. (If you’re not a scientist, bear with me as I hope the principles of what follows will be applicable more widely.) This is a big topic in GCSE Chemistry which focuses on the different types of bonding which occur between atoms (ionic, covalent and metallic) and how the properties of the main structure types (giant ionic, giant covalent, simple molecular and metallic) can be explained by understanding the nature of the bonding between their constituent particles (atoms, ions or molecules). I recently taught this topic in the following sequence:
- Ionic bonding, ionic structures, properties of ionic structures.
- Covalent bonding.
- Giant covalent structures, properties of giant covalent structures.
- Simple covalent molecules, properties of simple covalent molecules.
- Midpoint test – past exam questions.
- Metallic bonding, metallic structures, properties of metallic structures.
- Nanomaterials and their properties.
- End of topic test – past exam questions.
Despite students appearing to have understood the content of the topic during lessons, where they were able to accurately describe and explain the properties of different structure types, when it came to the midpoint test, similar questions were answered poorly. For example, shortly before the test, in a lesson on simple covalent molecules, most students had accurately answered the question, “Describe and explain the properties of chlorine.” A related question in the test which few students answered accurately was, “The bonding in iodine is similar to that in chlorine. Explain why iodine has a low melting point.”
Around the time that I was marking these tests I read an excellent blog post from Adam Boxer (Boxer, 2019) about what to do following a test. In it he described how he was planning to revisit the links between the properties and structure of different states of matter, making more explicit the distinction between properties and structure and how one can be used to explain the other. I decided to take a similar approach in revisiting the structure and properties of substances resulting from the different types of bonding. I spent a significant period of lesson time recapping this knowledge, building up notes under the visualiser together with my class, with lots of questioning and examples.
I then spent time modelling with students how they should approach these questions, giving them a structure to follow in developing an answer as this type of question always requires a similar response:
- Identify the structure type.
- Identify the properties you are explaining.
- Explain why this structure type has these properties.
We worked through a couple of examples together on the board before students wrote further answers independently. Their answers were much better than those in the test and several students commented on how they felt they now understood the topic.
Students seemed to be making progress, so I continued with teaching the rest of the topic, giving plenty of practice at identifying structure types and reviewing prior learning regularly. My class sat the end of topic test a few weeks later. This test covered more content than the midpoint test and there were a few short answer questions, but the style of most questions was similar and we’d done lots of practice over the intervening period. Some typical questions from the test included:
- Some welding blankets are made from silicon dioxide which does not melt when hit by sparks or molten metal. Describe the structure and bonding in silicon dioxide and explain why it is a suitable material for making welding blankets.
- Explain why oxygen is a gas at room temperature.
- Magnesium oxide is a white solid with a high melting point. Explain these properties with reference to the structure and bonding in magnesium oxide.
- Graphite is softer than diamond. Explain why.
- Graphite conducts electricity, but diamond does not. Explain why.
I felt confident that more students would be able to tackle these questions following all the work we’d done since the midpoint test. The reality was that the students who had been able to answer questions well at that time still could, while most of the others were still making the same mistakes.
My class had spent two lessons sitting tests, most of them had made the same mistakes on both occasions, and I felt that I was no better informed as to why. This made me think about the design of the assessment. The tests had revealed to me that a lot of my students were unable to answer exam questions which required them to identify a structure type and explain its properties in relation to the type of bonding involved. However, because most of the questions on the test required students to remember to think about and correctly identify the structure type, recall the properties of that structure, explain how the bonding in the structure related to those particular properties, and use all the correct terminology along the way, I was none the wiser as to the root cause of the poor answers.
The problem could have arisen at many stages. Perhaps they forgot how to structure their answers. Maybe they couldn’t remember how to identify which structure type was involved. Perhaps it was the recall of the properties of each structure type that was missing. Or it could be that they hadn’t understood the difference between ions and molecules, or between covalent bonds and intermolecular forces. The tests my students had sat simply didn’t provide me with this information. I couldn’t use them to unpick where the misconceptions or gaps in their knowledge lay. They do need to be able to answer these exam questions, but not for another 18 months. Right now I don’t need to know whether they can answer an exam question. I do, however, need to know what they don’t know and what they don’t understand so that I can plug the gaps. Knowledge of this topic will be vital in understanding numerous other aspects of chemistry that they will be studying; it needs to be secure.
On reflection, I could easily have made this test more useful. Take the single question: “Some welding blankets are made from silicon dioxide which does not melt when hit by sparks or molten metal. Describe the structure and bonding in silicon dioxide and explain why it is a suitable material for making welding blankets.”
I could have broken this down into parts:
Some welding blankets are made from silicon dioxide which does not melt when hit by sparks or molten metal.
- Identify the type of bonding in silicon dioxide.
- What structure type does silicon dioxide form?
- What property of silicon dioxide makes it suitable for use as a welding blanket?
- Explain this property in terms of the bonding in silicon dioxide.
This would have tested exactly the same material, with several benefits. It would:
- give students more structured practice in tackling this style of question (I could still have left one question as it was, or have faded the structure through the course of the test paper to determine whether the underlying knowledge was insecure or whether the challenge lay in applying it to the more open-ended questions).
- be likely to give all students some level of success, leaving them feeling positive about their ability and more motivated to learn.
- have enabled me to identify at which stage the knowledge and understanding of my students was breaking down, giving me the information I needed to make best use of time in future lessons.
We too often focus on the end game when designing assessments at GCSE. Rather than always using past exam questions (which might well be suitable for an end of year exam or in some topic areas) because that’s what students ultimately need to be able to answer, we should think carefully about the purpose of the assessment. What information about student learning do we wish to gain from the assessment? Is it going to provide us with this information? Is the assessment going to be useful in helping us to plan the next steps to secure student learning? If the answer to the last two questions is ‘no’ then we probably need to think again about using the assessment, whether it’s a big test or a small part of a lesson.
In any functional design process, having a clearly defined purpose is essential to success. Assessment design is no exception.
Willingham, D. T. (2010) Why Don’t Students Like School? San Francisco: Jossey-Bass.
Boxer, A. (2019) ‘What to do after a mock: into the classroom with whole class feedback’. Available at: https://achemicalorthodoxy.wordpress.com/2019/11/21/what-to-do-after-a-mock-into-the-classroom-with-whole-class-feedback/ (accessed 25/1/2020).