
This blog is part of the Great Threads Edubloggathon. We love blogs and we love debate, and we’d especially love it if you can join us in trying to help Threads get off the ground. See here for more! Please read and share any thoughts, comments, questions or ideas via Threads.
As a science department, we have put a lot of thought and effort over the past few years into revamping our KS3 curriculum and becoming more intentional in our use of informal formative assessment in the classroom (increased use of strategies such as cold call and mini whiteboards). However, our more formal assessment model hadn’t really changed – students sat a written test after every two topics, which teachers marked and reviewed in a subsequent lesson. These tests provided data which could be used to generate report levels, but beyond this their purpose was unclear and their usefulness limited. I had some concerns about this assessment model:
- Tests generated a marking burden without providing any particularly useful information to support learning.
- Tests came too late. If misunderstandings were revealed it was hard to address them effectively when teaching had already moved on to another topic.
- Without performing time-consuming question-level analysis (which we didn’t do) there was no means of identifying topic areas in which students were performing poorly across the year group – useful information for curriculum review and for informing department CPD.
- Even if common gaps in knowledge were identified, answers frequently didn’t reveal why these had arisen or what the underlying misunderstandings/misconceptions might be.
Having started to use multiple choice questions to check for understanding in my own classroom, I felt there was great potential for using these across the department. Initially, around two to three years ago, I encouraged my team to use the BEST diagnostic questions in their lessons. We spent some time as a department identifying questions which aligned with the topics in our curriculum and collating these in a PowerPoint for use with each topic. However, review later in the year revealed that, although seen as helpful, these questions were rarely being used because they were not embedded within the curriculum. They were a bolt-on which really wasn’t working.
Time to reconsider assessment
The question this left me with was: could we make use of diagnostic multiple choice questions within our curriculum in place of the more traditional written assessments? Together, we discussed and came up with a new assessment schedule – students complete a multiple choice quiz electronically (we have Chromebooks now, but initially these were completed on phones, or on paper if the teacher preferred) at one or two points during a topic (depending on its length). Quizzes are between 12 and 16 questions long and are set using Google Forms. These quizzes have the following advantages:
- Diagnostic questions are consistently used across the department and allow for gaps/misunderstandings to be identified during the topic whilst teaching can still be adapted to address these.
- Quizzes are self-marking. This has the dual benefit of reducing marking workload, and allowing for instant feedback – the quiz is completed in the first 20 minutes of a lesson, the remainder of the time being used to review common errors and address any gaps/misunderstandings which have been revealed.
- The diagnostic nature of the multiple choice questions means that wrong answers provide valuable information about what students are thinking which supports teachers in addressing these misunderstandings (see examples below).



Of course, these quizzes do not replace the informal formative assessment which is so important in every lesson and which hopefully picks up on many gaps and misunderstandings before students get to the point of taking the quiz, but they do provide another (consistent) opportunity for persistent errors to be noticed and corrected.
These quizzes are not a perfect solution. Because they are frequent, they happen quite soon after teaching and may, in some cases, be checking performance rather than learning, although the multiple choice nature of the questions hopefully mitigates this to some extent – understanding rather than simple recall is required to correctly answer many of the questions. Another objection raised within department discussion was that students need to have experience of sitting written exams, as this is likely to be how any public exams they eventually sit will be administered. It’s also important for students to revise and be tested on a larger body of knowledge than the content of the past six to eight lessons in order to encourage longer-term learning. Thus, the second component of our new assessment model is a pair of written exams – mid-year (in December for Y8 and January for Y7, to spread the marking burden) and end-of-year – which draw on all the content covered in the year to that point. These do require marking by the teacher, but this is still a big reduction compared to the number of topic tests which were previously being marked.
Reflections following implementation
This assessment regime was introduced with Year 7 in the 2022-23 academic year. Feedback from teachers was overwhelmingly positive with the following key points coming out of department discussions:
- Marking workload substantially reduced.
- Teachers felt they had a clearer understanding of what their students were finding challenging and a better knowledge of individual students’ understanding.
- Teachers commented on how the MCQ quizzes helped them to identify areas for improvement in their own teaching.
- Teachers felt that they were better able to address gaps and misunderstandings in a timely way following the quizzes.
- Students know that these quizzes are coming regularly and want to do well in them – they know that doing their regular retrieval home learning will support them in this.
- There were some initial glitches in the practicalities of setting the quizzes online, and in ensuring that all students had access to a device on which to complete them – acquiring Chromebooks for use in lessons has resolved these.
- Concern was also raised that some students might be copying those in front of them, as it’s easy to see which options are being selected on an online multiple choice quiz completed on a Chromebook. This has been mitigated by randomising the order of the questions in the quizzes so that students answer them in a different order to those around them.
It’s important to also note negative outcomes from any changes we make. One which has recently come to light, following Y8 mid-year tests, is that a couple of teachers noticed weak literacy in some students’ written answers which they feel they would have picked up earlier with our previous topic tests. This is something for us to be aware of in future years: ensuring we pass any important information about students on to new teachers at the start of the year, and being more conscious of looking out for similar concerns when circulating the classroom and looking at students’ work. Perhaps we should consider setting a brief written task for teachers to check early in the year.
Using evidence to refine curriculum
This year, as we continue the roll-out to Year 8, we have also started to look in more detail at what the quiz responses can tell us about our curriculum and/or teaching as a department. As quizzes are completed, we are taking time in department meetings to review responses and identify the questions where errors are common across the year group. This is very easy to do for multiple choice quizzes, and if a particular wrong answer is commonly selected, this may indicate an area in our curriculum resources or teaching practice which we need to work on improving collectively.
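For anyone wanting to automate this kind of review, the response spreadsheet that Google Forms produces can be summarised with a short script. The sketch below, in Python with pandas, is one possible approach rather than the tool we actually use: it assumes the responses have been exported as a CSV with one column per question, and the file name, question wording and answer key are all illustrative.

```python
# Minimal sketch of question-level analysis of a Google Forms CSV export.
# Assumptions: one column per question, one row per student response;
# the file name, question text and answer key below are hypothetical.
import pandas as pd

# Hypothetical answer key: question column -> correct option
ANSWER_KEY = {
    "Q1. Which section of the curve shows melting?": "B",
    "Q2. What state is represented at point X?": "C",
    # ... remaining questions
}

responses = pd.read_csv("year7_particle_model_quiz.csv")  # assumed export name

rows = []
for question, correct in ANSWER_KEY.items():
    answers = responses[question].dropna()
    if answers.empty:
        continue
    wrong = answers[answers != correct]
    rows.append({
        "question": question,
        "percent_correct": round(100 * (1 - len(wrong) / len(answers)), 1),
        # The most frequently chosen wrong option often points to a shared misconception
        "most_common_wrong": wrong.mode().iat[0] if not wrong.empty else None,
    })

# Questions with the lowest success rate appear first, ready for department discussion
summary = pd.DataFrame(rows).sort_values("percent_correct")
print(summary.to_string(index=False))
```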
I will outline one example from our initial review of quiz responses from before half term. Year 7 have learned about heating and cooling curves as part of the Particle Model topic. They have studied changes of state, heating and cooling, and have obtained data and plotted heating and cooling curves, as well as learning about the theory underpinning the shape of the curve. Teachers have checked for understanding of this during lessons, and students have completed questions from their booklets to consolidate their understanding. However, the multiple choice quiz identified their understanding of heating and cooling curves as being quite insecure, particularly in identifying the state or change of state represented by particular sections of the curve.


In discussion, we shared how different teachers approach teaching this aspect of the topic and considered how we might provide students with additional practice to support their consolidation of these concepts. As a result, we plan to add, to the Carousel Learning quizzes which students complete for their regular home learning, some questions featuring images of heating and cooling curves for students to interpret. A video of an example model explanation of this process has also been shared on the department drive to support teachers who may feel less confident in explaining these concepts. Next year, we will review whether these questions are answered more successfully following the alterations to the curriculum.

Conclusion
Hopefully this post has given some food for thought about how and why we use different forms of assessment, as well as providing two things:
- An overview of an assessment model which combines some key aspects of good assessment – clear purpose, the ability to surprise and therefore inform future teaching (both in the short term with a particular class or student, and in the longer term as a department), a manageable workload, and encouraging students to review learning more than once throughout the year.
- An example of implementing a change at a department level which has taken significant time, collaborative effort, acknowledgement of mistakes and refinement of processes, but which has resulted in a better system for workload and, hopefully, learning, and which is informing on-going curriculum refinement and the use of department time for subject specific CPD – a big focus of mine for this year.
I’d be really interested to hear any questions, comments, or critique of our approach. To engage in discussion, please comment below or share via Threads where you can find me as drhskelton.