Design and Technology
This video looks at how general assessment principles are applied in Design and Technology. Specifically, it explores the different question types used in the subject, the use of language, context and imagery in questions, and aspects of mark scheme design.
- Transcript
Hi, my name is Louise Atwood, and I'm Head of Curriculum for AQA in Design and Technology. Hopefully, you will have seen our video on the general principles of assessment. Today we're going to talk through those general principles and show how they're applied in design and technology. Specifically, we're going to talk through question types, use of language, context and imagery and the mark scheme.
In assessment, we talk about validity. All of these aspects of assessment contribute to the validity of the assessment. The assessment needs to measure what it intends to measure, and only that. And in this case, it's a student's ability in GCSE or A-level Design and Technology. So, we're going to talk through the structure of the exam papers now. At GCSE, then, we're splitting the exam paper into three different sections, the first being the core technical principles. And in this section of the exam paper, we use multiple-choice questions, or MCQs, and short answer questions. This allows us to test a wide sample of the core. We cover all material areas and it's worth 20% of the exam. Moving on to Section B then, this tests the specialist technical principles, and we use short answer questions and extended response. This includes an 8-mark extended response question, which is the longest question in the exam paper. Here, we have a more open and unconstrained approach to the question, allowing students to access higher marks. What we want here is for students to show their higher-level thinking skills – for example, analysing and evaluating.
In Section C, we've got the designing and making principles, which consists mainly of 4- to 6-mark questions. These assess application of knowledge and understanding. At A-level, both qualifications include two examination papers. These papers take place at the end of the two-year course. Paper One covers the technical principles and is a combination of short answer and extended response questions. Paper Two again uses short answer and extended response questions, but this time to assess the designing and making principles. Section A covers product analysis – this makes use of images within the questions to prompt responses. We'll discuss this in more detail later on in the presentation. And then Section B – this assesses the students' understanding of commercial manufacture.
For all our design and technology qualifications, maths constitutes 15% of the overall qualification. And across all exams, we use a range of question types to fully test the students' ability and allow them to showcase what they know and can do. There are three types of question used in design and technology exams. Do you know when or how they would be used in assessments? Look at the marks that are available. Look at the command words that we use in the stem of the question. These are both clues to the type, and sometimes depth, of response. Here we have multiple-choice questions, or MCQs, short answer and extended response questions.
We're going to start with multiple-choice questions, or MCQs. We use MCQs in GCSE Design and Technology. They are notorious for being difficult to write, but when they're designed well, they can cover a really wide topic area in a short amount of time. They aren't just used for testing recall, but can validly test application of knowledge. As an aside, we have a limit of 20 marks for knowledge in isolation. This ensures that depth and rigour are built into questions, and students need to apply what they've learnt, as well as recall it. MCQs are a great way to start the GCSE Design and Technology paper, as they also feel more accessible to students. This builds confidence as they progress through their exam.
Here, we can see that there are several parts to a multiple-choice question. We've got the stem – this presents the problem, or an incomplete statement to be completed. The stem should be written as succinctly as possible, to allow students to understand what is being asked of them. As with all questions, any unnecessary information should be removed and the wording should remain minimal. We’ve then got the key – this is the correct answer to the question. And then the distractors – these are the incorrect answers. So, using MCQs in the classroom then. You can use them to test and revise a wide array of topics really quickly. They can be built into starters and plenaries to assess knowledge and understanding or identify misconceptions. Imagine you've taught papers and boards. We can use MCQs to assess knowledge of all students really quickly, preparing students for this style of question in the exam. They can be a tool for starting discussions around this topic, and a quick revision exercise.
So, let's have a look at this multiple-choice question. How could this question be improved? Pause the video and have a go at answering this multiple-choice question. It doesn't currently assess what we want it to as accurately as it could. So, firstly, the distractors aren't plausible. There are only two types of structure here, which means that we end up with a 50/50 guess between the remaining two potentially correct answers. The answer is in the question as well. You can see 'frame' is written here in the question, and 'frame' is one of the answers, giving the question very little validity. Normally, we use alphabetical order to remove any sense of bias or pattern in the distractors, and you can see that that hasn't been done here. You may also notice that there are no command words in the stem of the question. For short answer and extended response questions, this would pose more of a problem, as students would have no signposting as to the type and depth of response that's required from the question. However, in MCQs, command words are not strictly necessary. You might legitimately use 'which' or 'what', for example, because the format of the MCQ already constrains the way in which a student can respond. So, taking all of that into account, a more effective question could be the one shown here in red. Getting the language of the question correct can be difficult. The next few slides take a look at how language can impact on all types of questions in assessment.
So, let's start with command words. Command words are really key in signposting to students the type, and sometimes depth, of response required for a given question. In design and technology, we have a list of command words available for each specification on our website. When assessment materials are written at AQA, they are checked to see that only command words from the published list are used. Consistency across papers is really important to us, and we need to avoid confusing candidates with unfamiliar terminology. It's important that both candidates and teachers know what to expect from the assessment. We need teachers to prepare students fully by using these command words on the website. What we want is for students to be fairly assessed on the knowledge and skills of design and technology, and not on their ability to interpret the question.
Look at this exam question – 'Describe a finish used on timbers' – worth one mark. What would your answer be? Pause the video and have a go at answering this question. The command word 'describe' would suggest that detail is wanted from the answer, but how do you describe a finish? A response could be 'paint comes in a tin and can be brightly coloured and quite thick in texture'. You could describe how a finish is applied to timber, but that isn't really what the question wants – not for only one mark. What we actually want to know is that a student knows what finish to use on timbers. So, changing the command word to 'name' is more suitable for this question. In this case the answer would be varnish, paint or stain, for example. This is much more straightforward, and the candidate isn't left guessing if they've described the finish correctly for one mark. Students need to be familiar with the list of D&T command words on the website. They need to have a clear definition for those common command words, and they need to be used to them. So, when you write your own assessment materials, use the correct command words so that students can practise them.
So, good assessment material avoids ambiguity. The use of language is a key area that can be problematic for students. It can be a barrier to success, despite their understanding of the subject content. A question needs to be explicit in what it's asking and what knowledge and skills it's aiming to assess. This is particularly true when it comes to writing MCQs. It's important that there can be no other possible right answers. If more than one answer could be plausible, then marking that question becomes problematic. Have a look at this question, for example. Which is the correct answer to the MCQ? It's ambiguous, as any of the adhesives listed in the question are technically feasible, even though the correct answer would be D. As we've used the wording 'Which of the following can be used to join timber?', it could be argued that all should be allowed a mark. It would be much better if we said something like 'Which one of the following adhesives is most suitable for joining two pieces of timber together?'. Adding 'most suitable', and adding more specific information such as 'two pieces of timber together', removes some of the ambiguity.
It's just as important to ensure that all the distractors avoid ambiguity, too. As an example, it could be argued that in model- or prototype-making, hot glue could be used as the correct adhesive. A student may interpret this question in a way the assessment writer hadn't intended, as it's very easy to make assumptions when writing a question. It should not be assumed that all students are thinking the same way. In design and technology assessment, there must only be one correct answer for MCQs, so 'hot glue gun' could be replaced with a definitively incorrect answer to improve this question.
Here's another example. The question says: 'Add to the diagram below to show how you would strengthen the shape to prevent it from collapsing.' Whilst the question seems straightforward, marking this question would become impossible. The question asks how you would strengthen the shape. Personalising this question adds ambiguity, as we could all strengthen this structure in a different way and still be correct. You could, for example, add two cross members, maybe think about triangulation, or even add two struts at the back of the shape to prop it up. A better way might be: 'Complete the diagram by adding the minimum number of structural members to prevent it collapsing.' This gives the students more direction, without giving them the answer.
When writing assessment materials, several checks are done by numerous people. It's easy to become focused on an element of the question and view it only from one perspective. It's important that we see many people's perspectives on the same question. We don't want to assume that the way we want the question to be received is, in fact, the way that it will be. Having several people look at the question raises potential problems that the original writer may not have considered because the answer, or process required in attempting the question, was so obvious to them.
Pause the video and have a go at answering this multiple-choice question. Re-read the question and check your answer. Did you pick A? Sometimes, when we read a question really quickly and under time pressure, we tend to skim-read the information, and it would be easy to miss the 'not' in this question. You might read 'conductor' and 'heat' and jump to a quick answer, especially in a multiple-choice question, where there are many to answer in a short space of time. Misreading the question like this doesn't mean that a student lacks the correct knowledge or skills. The problem is even more significant when a question still reads as grammatically correct if the 'not' is overlooked, as in the case of the example on the slide. A question needs to be easy to understand. For this reason, we try to avoid using negatives such as 'not', so that students are not being tested on their ability to interpret the question before they have a go at answering it. Another example written on this slide is: 'Which statement about thermoplastics is incorrect?' This could easily be misread under time pressure as: 'Which statement about thermoplastics is correct?' Sometimes, you will see in our exam papers that we embolden negatives when they are necessary.
So, we're going to simplify this question and make it more appropriate. Maybe pause the video again and write down your thoughts. We want to take phrasing and vocabulary directly from the specification, and we need to do this to ensure all candidates are getting fair access to the question. In this example, we've used the word 'pre-made', or already made, and that needs to be replaced with ‘standard components’. And ‘product’ is referred to as a 'prototype' in the specification. Again, we need to stick to this. This is the wording taken directly from the specification, and allows access for all students.
The example on the right shows where key terminology has been lifted from the specification and used in a question. This highlights the need for students to be familiar with the wording, as opposed to other words used to describe the same thing that may be part of an individual teacher's vocabulary. For example, in the GCSE specification, the term 'order' is used for the types of levers. However, it's common to use the term 'class' when describing levers.
Accessibility to the written text is crucial. The classic example to illustrate accessibility is in the words 'utilise' versus 'use'. So, for example, 'utilise the data', or 'use the data'. We would always try to use the simplest language. So, in this instance, we would choose 'use' over 'utilise'. Where there's a more easily understood alternative, we will always try to use it. At AQA we write questions succinctly by removing unnecessary language, and we limit the number of ideas expressed. Words that are fundamental to the meaning of the question are sometimes emboldened.
It's not just language that needs to be considered in design and technology to ensure assessment is accessible to all. Why might this question, for example, be difficult for students to access? We need to consider whether all students can access the question, not just some. We could penalise students without meaning to. So, in this example, think about how many 16-year-olds would know how to use a product like this, or would have access to it. How could they therefore be expected to analyse the functionality of this coffee machine? A mobile phone, however, is a product that students would be more familiar with. It's fair to say that it's almost impossible to find a product that all students will know equally well, but consideration must be given to finding something that's as accessible as possible to students.
Assessment in design and technology makes use of contexts as a method of testing a student's ability to apply their knowledge in different situations. Contexts can also make a question more accessible to students, as it gives them a real-world scenario through which to apply their understanding, and it may act as a prompt to help them start their response. One way of applying a context to a question may be through the use of imagery. In D&T, seeing a product can assist the student in analysing the key features of that product. For example, seeing a colour image of a product may ensure a student analyses the aesthetics of the product in more depth. In this example here from an A-level paper, Paper Two, two products are shown. Students can evaluate certain features of them both, and they can also draw comparisons. Students can use these images to compare the same products, giving consistency to the assessments. Students can visualise the types of materials, or see the form of the object, so that they can decide how it's best manufactured, for example. But for this to be effective, the images used need to be clear and not contain irrelevant information which could lead students down the wrong path.
Here, students are not being asked what I-beams are, but how they provide effective support. Showing them in use may prompt students to think of a more detailed answer, because they can visualise how they're used. It also helps when we consider: Will all students know where and how I-beams are used?
This example is taken from a GCSE paper. It shows a diagram which has been used in a multiple-choice question. Here, the student is not being assessed on remembering what a cam mechanism is; it's the change in types of motion that is the focus in this question. Showing the mechanism may help to clarify the question for all students, as they can visualise the mechanism that they've been taught. Students still need to be able to understand and apply the names of the types of motion, so the diagram doesn't offer the answer. Questions that use images, however, should be treated with caution. They can introduce problems. Information that is provided, but is not required, can confuse and mislead students. Students may forfeit time and marks by trying to decipher material that's not relevant to answering the question. It's also important not to use a context that students would be unfamiliar with, such as the coffee maker that we discussed earlier, because they may feel that they don't have the required knowledge to answer the question when they actually do. It's really tempting to add an image into the assessment to break up what can be a very text-heavy paper, but if not chosen correctly, it can cause confusion that might lead to a student going down the wrong path.
Have a look at these two adapted questions from existing GCSE and A-level papers. Neither example originally included imagery. They don't need it. The first example, here – this is misleading, as the image is showing the light being manually switched on. Example Two shows a generic image of packaging that just isn't necessary. This type of secondary packaging wouldn't include a barcode, so the student might be trying to interpret the question in the wrong way, having been misled.
Other examples of where contexts are used in D&T assessment can be seen in the maths questions. Maths is an Ofqual requirement in design and technology papers, so we have to provide questions in all of our design and technology exams that are maths-focused, but are in the context of design and technology. This is really challenging for students, as applying their knowledge and understanding of maths to a real-world context adds to the overall demand of the question. They not only need to know the maths, but they also need to know the design and technology context, so that they can choose what maths to use. We can help students with this – we can provide scaffolding for them to break the task down into smaller steps. This can mean splitting up the information in a question stem, in the way that you can see in one of our maths examples here, or splitting up larger maths questions across several question parts, as you often see in our papers. In these examples that you can see now, we've also used imagery to further explain the question. In this case, the writer gives several pieces of information about the dimensions of a product. Having it shown visually will ensure students can more readily access the information and visually understand what is being required from the question.
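As a purely hypothetical illustration of that kind of scaffolding (the figures below are invented for this page, not taken from an AQA paper), a percentage-waste question about cutting 110 mm × 70 mm blanks from a 600 mm × 400 mm sheet might be broken down into steps like these:

\[
\begin{aligned}
\text{Blanks cut from one sheet} &= \left\lfloor \tfrac{600}{110} \right\rfloor \times \left\lfloor \tfrac{400}{70} \right\rfloor = 5 \times 5 = 25 \\
\text{Material used} &= 25 \times (110 \times 70)\ \text{mm}^2 = 192\,500\ \text{mm}^2 \\
\text{Sheet area} &= 600 \times 400\ \text{mm}^2 = 240\,000\ \text{mm}^2 \\
\text{Percentage waste} &= \frac{240\,000 - 192\,500}{240\,000} \times 100 \approx 19.8\%
\end{aligned}
\]

Each line could sit as a separate question part, so that a student who slips on one step can still pick up marks on the later steps.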
Alongside the exam paper, the mark scheme is crucial in ensuring the validity of our assessments. The mark schemes are developed alongside the question paper, and before the final wording of the questions is decided, the mark scheme is structured to ensure that the focus of the assessment is rewarded correctly. The wording is really, really critical in our mark schemes. It needs to allow for any relevant correct response, even if it is not one of the answers that were anticipated. It goes without saying that the mark scheme and question paper should match up. It's important that it's clear in the question what marks are being awarded for, as students should not be penalised if a question has not explicitly asked for something but the mark scheme requires it. For example, how does this mark scheme penalise students answering this question on design companies? A student wouldn't be able to access the 5–6 level of response in the mark scheme, as they wouldn't have known that examples of specific products were needed in their answer.
We're going to look at two examples of mark schemes that we use in design and technology, at both GCSE and A-level. Here, we have a points-based mark scheme. Marks are given for correct points made by a student. Correct answers are clearly defined, and each correct answer directly corresponds to a mark in the mark scheme. Next, we have a levels of response mark scheme. These are broken down into different levels, each of which has a descriptor. The descriptor for the level shows the average performance for that level. And then we've got indicative content, which is supporting guidance only. This content is not exhaustive, and is just an idea of the kind of responses that we might expect. This type of mark scheme is better for unconstrained questions. Students have freedom in how they can respond. In design and technology, we typically use them for AO3 extended response questions. Examiners can mark them holistically, crediting a wider range of responses. We treat them as a best fit, identifying the appropriate level and then the mark within that level. The focus in these mark schemes is assessing the quality of the student response, rather than assigning a mark per correct point. They can be difficult to word. The mark scheme needs to be detailed enough to guide an examiner and ensure that all correct responses, including those predicted and those that were not, can be assessed fairly, but not so detailed that it becomes unwieldy and difficult to apply. Students need to be familiar with levels of response mark schemes. We need to familiarise them with how they can judge the level of their own work and understand how to improve their answers. Self-assessment remains a highly useful tool within the classroom to prepare students for their exams.
Thanks for watching. I hope you’ve found the session useful, and that it's given you some ideas for creating your own quality assessment materials for use with your students. Make sure you watch the general principles of assessment training alongside this one in order to gain a general understanding, and then apply this to the specifics of design and technology. If you have any further queries, please feel free to contact us at dandt@aqa.org.uk.
Questions you may want to think about
- How can you use these insights to prepare your learners for exams?
- Do your internal assessments reflect the approach of the exam? To what extent do you want them to?
- What’s the most important or surprising thing that you’ve learned? How might it influence your teaching?
Mark scheme guidance and application
Find mark scheme guidance courses
Our online courses will give you the tools you need to mark with confidence. You’ll learn how to apply mark schemes for specific qualifications.
GCSE Design and Technology: Mark scheme guidance and application
Location: eLearning
Reference: DATGOE3
A-level D&T Product Design: Mark scheme guidance and application
Location: eLearning
Reference: DTPAOE2
A-level D&T Fashion & Textiles: Mark scheme guidance and application
Location: eLearning
Reference: FASAOE2
Good assessment made easy with Exampro
Find past paper questions so you can make customised assessments for revision, homework and topic tests for GCSE, AS and A-level.