Environmental Science
This video looks at how the general assessment principles are applied in A-level Environmental Science. It considers the importance of assessment objectives and command words and how they are used in questions. The video also looks at the use of resources in our question papers and considers the different types of mark schemes.
- Transcript
Hi there, I am Martin Parham and I'm the chief examiner for A-level Environmental Science. Hopefully you'll have already seen our 'what makes good assessment' videos about the principles of assessment, which cover concepts such as validity and reliability and what those terms mean. In this video, I'm going to talk about how those principles apply in environmental science.
My aim in this session is to give you a better understanding of the stages of assessment, i.e., how we construct an exam paper using our general principles. We will have a look at examples of how we apply these assessment principles to both the question paper and the mark schemes. I will also have a look at how we review papers and their outcomes and build this knowledge into our future planning so that we can improve the experience for any future students taking our exams. This video offers both explanation and examples that you can use in the classroom.
First of all, let's remind ourselves of the format of the A-level Environmental Science exam. We have two papers, Paper 1 and Paper 2, each representing 50% of the A-level. Each exam is three hours long and worth 120 marks. Paper 1 covers the physical environment, energy resources and pollution, while Paper 2 covers the living environment, biological resources (agriculture, fishing) and sustainability. Both exams include questions on research methods, which cover practical skills and mathematical skills. Both papers comprise multiple choice, short answer and extended writing questions.
So, what are the key principles for assessment in both of our papers?

Firstly, validity. The concept of validity links to the extent to which a question is measuring what it was designed for. This can be affected by multiple things, such as poor wording, use of complex terminology and failure to address the assessment objectives. In environmental science we also use recent research so that our questions are based in real science.

Next, reliability. Reliability helps us understand whether we are being consistent and accurate in our measurement of learning. It is important for us when setting questions to follow the same principles year on year so that we are being fair to the candidates.

Thirdly, discrimination. The concept of discrimination links to both validity and reliability, as we are required to discriminate between students so that those with the greatest knowledge and ability achieve the greatest proportion of marks. A paper where everyone attains 100% wouldn't achieve this, so we aim for an increase in difficulty through each question and across the papers to differentiate between the students.

Finally, accessibility. The most important element of accessibility is whether students can answer the questions set. After two years of study, we want to test a student's knowledge and understanding of environmental science rather than their ability to understand the question. Accessibility can also take the form of clear formatting on the paper and familiarity with how a question is set. For example, we try to organise the paper in a way that is similar to past papers and offers the student a clear framework for answering.
We are now going to look at how we apply the assessment principles to A-level Environmental Science question papers. Command words are an incredibly important tool for ensuring that the paper is accessible to students. The incorrect use of command words could mean that we're asking students to complete something that they haven't learnt. Consistent use of command words in successive examination series means that centres can be confident that the command words practised in mock exam papers will be the same as those in the live exam. Command words are also important in terms of validity, because they help determine exactly what the question is asking for and what is being tested.
AQA has a list of command words which are used in environmental science papers, all shown on screen now. One of the more common command words used is 'describe', which often requires an understanding of a concept, saying what something does, without explaining it. 'Explain' is another common command, which requires an identification of a concept, factor or idea and an associated development to give further detail on how, or why, something happens. These two commands are particularly visible in short answer questions. When a figure or data are used in a question, they will relate to content on the specification but may not be something the student has seen before. In these circumstances, the command word 'suggest' is often used. I will show an example of this on the next slide. For extended writing questions worth more than four marks, other commands such as 'compare', 'evaluate' and 'discuss' are commonly used. These commands often require greater depth and focus on a range of concepts, and require students to demonstrate different skills.
Looking at the example of a two-mark question on this slide, we can see that the command word 'suggest' has been used. This is because, although understanding the features of energy density and biofuels is included within the specification, the question focuses on suggesting why one set of data would be more useful than the other. As this is not directly referenced in the specification, we ask students to 'suggest', rather than 'explain', why.
To achieve accessibility and consistency in exam papers, we often use commands and formatting that reflect the understanding required in the specification. In this example, the question asks students to describe similarities and differences between carbon sequestration and carbon capture and storage. This type of question could come earlier in the paper, as shown in the example, or as a later question requiring a more detailed comparison, so that we can achieve an increase in challenge across the paper and within each question.
In this example, students are asked about variables linked to a study. The specification indicates that students should develop the skill of identifying variables; however, this question also asks students to explain variables. The reason we can ask students to explain in this question is that this is listed as a 'practical skill' within the specification. If it were not listed as a practical skill, then we would have to ask for a suggestion. In this question, two marks are available for the identification and two marks are available for the explanation. To help students access this question, the question paper includes a structured approach, so students understand how marks are distributed and what is required. As this question requires both identification and explanation, it would probably appear either later in the paper or as a later item in an earlier question.
Here is another example of an 'explain' question, which links to a figure about change caused by El Niño in the Pacific Ocean. To support students, we related the content of changing ocean currents and El Niño to the Pacific Ocean, which is the most commonly taught application of this idea. This question does not have the structure included in the previous example. This format for an 'explain' question is common towards the end of the paper, as we consider this topic to be more challenging.
For extended questions which are commonly found later in the paper, we tend to use command words that give students the opportunity to explore concepts in depth or with a range of ideas. In this example, the use of the word ‘evaluate’ when looking at the success or failure of a particular approach or policy is common. In this case we ask students to evaluate methods and strategies to reduce global climate change.
In this example, the command word 'compare' requires students to look at two examples of pollutants and their environmental effects. The word 'compare' often refers to the differences between a small range of ideas or concepts. In this case it would not have been appropriate to evaluate the environmental effects of these two pollutants; the term 'evaluate' is normally used to look at the positives and negatives of a scheme, policy or action. Rather, in this example, the idea of comparison gives the candidate the opportunity to show their understanding of the effects of each pollutant and show the differences.
On both papers, question 11 is an optional essay question allowing students to select their preferred question. To make these two available questions comparable, the questions use similar wording and/or entail similar requirements to achieve the same level of demand. In the example on screen, we have used an identical command word and similar wording in each of the questions, as students are required to understand the management of either water or mineral resources, and how this meets the demands of society whilst minimising the impact upon the environment. In this second example, the command word remains consistent; however, the ensuing wording is not as similar as in the previous example. These questions are still entirely valid, though, as the requirement in each remains consistent: both ask how an understanding of an environmental concept can improve human activity.
Now, time for an activity. On screen is a question which was drafted and subsequently rejected for one of the environmental science exams. In this question students are asked to describe the advantages and disadvantages of the methods and sampling techniques used by the students and suggest how they could extend the study to determine the impact of light availability on forest ground flora. Are you able to pause the video and work out the three reasons that all contributed to this rejection?
I'll now discuss the three separate reasons that contributed to this question being rejected. Firstly, this question was rejected due to its lack of accessibility. It has multiple parts, drawing from a range of different sections of the specification. It's also a particularly wordy sentence; therefore, students may not be clear on how many methods or techniques, advantages or disadvantages, or suggestions they should make to extend the study. Due to all these factors, this question was not considered suitably accessible. Secondly, it's uncommon for the examining team to ask a question with multiple commands, so this question would have been inconsistent with previous examinations. Finally, it would be difficult to create a mark scheme which accurately reflected what students were asked to complete in this question, and as this is a vital component of any question, it could not be considered a valid question.
Assessment objectives are an important tool which help us to further achieve both reliability and validity on our exam papers. Each paper has a percentage allocation of marks at a particular assessment objective. This helps ensure that, year on year, our papers follow a similar approach and do not focus more on one element of assessment over another. There are three assessment objectives in environmental science, and it is worth knowing the difference between them and how they apply to questions. The first assessment objective, AO1, links to questions which show knowledge or understanding of a concept, a process or a procedure, which links directly to content in the specification. So, if a question asked students to explain the process of El Niño, this would be an AO1 question. The second assessment objective, AO2, measures how students apply their knowledge and understanding from the specification. Ability in AO2 is assessed within questions where students are asked to apply knowledge to a particular situation which may not be familiar to them. This unfamiliarity could be included in the context of the question, in a resource, or through the request for students to link their understanding of different areas of the specification content. The third assessment objective, AO3, requires students to analyse data, draw conclusions or evaluate.
On screen, you can see a question asking students to 'Explain how five environmental impacts may be reduced'. As this question links to reducing the environmental impacts of mining, which is within the specification, it is only assessing what students should know and understand. Therefore, it is considered AO1.
In this second example, students are asked to apply their knowledge of carbon and the stores of carbon, which is included in the specification, to the changes in three different stores which may have occurred since 1850, which is not explicit in the specification. Therefore, it is considered AO2.
This final example is focused upon the analysis of data to formulate an answer, so it is considered AO3.
Another consideration when producing papers is resourcing. The aim of the examining team is to use published resources based on real data and science; however, this may present a challenge if the information is not accessible. Therefore, we adapt our sources so that they are accessible to students. We aim to present only the information that is relevant to the question. Question writers also try to match wording from a resource to both the stem of the question and the question itself.
As an example, this question asks students to compare the periods 1964 to 2012 and 2013 to 2020. For this reason, only these dates appear on the figure, which helps students focus on the question.
This second example is a more complex resource from paper 2 looking at flight behaviour of bats over different areas of land use. Although this is a complicated resource including information on speed and height, it gives students plenty of opportunity to state three differences in the flight behaviour of the male and female bats, as dictated by the question.
The process of writing question papers often starts with the creation of a mark scheme. In environmental science we encourage the practice of creating the answer content we want and then adapting the question to match. This approach helps us create questions which are valid. We have also tried to create a standardised approach to our mark schemes so that both examiners and centres can use them consistently.
For short answer questions that are worth between 1 and 3 marks, we will either provide a definitive list of answers and ascribe marks or we will provide a range of possible answers and ask the examiners to refer to these when applying the marks. While the answer in the mark scheme is the preferred answer, we understand that students in exam circumstances are under a great deal of pressure and therefore we offer latitude if they do not use our exact wording.
In this second example, you can see how mark schemes are applied to questions which have different commands. For example, here students were asked to identify two functions of carbon dioxide and how carbon dioxide aided survival. As two marks were available for the functions and two marks were available for the explanations, these appear on the mark scheme as couplets, with each function linked to its explanation. In some cases where there are multiple answers, we will offer examples of the most likely answers but also allow valid responses that are not included on the mark scheme.
In maths mark schemes, we format the scheme to show the answer in the marking guidance and the calculations in the comment area. This is designed to help examiners mark the responses accurately and consistently and it also supports teachers in aiding students to organise their answers effectively.
Although most of the questions in environmental science papers are point marked, there is always at least one 9-mark question and one 25-mark extended question which are marked using levels of response mark schemes. The example shown here is a levels of response grid for a 9-mark question. This mark scheme will be accompanied by indicative content, which gives the examiner an idea of the typical answers we expect, although students are not expected to include all the indicative content in order to attain full marks. Use of levels of response mark schemes helps us mark students' responses where different, equally valid, approaches are taken.
I hope you have found the session useful and that it has given you some ideas for creating your own quality assessment materials for use with your students. If you have any further queries, please feel free to contact us on the email onscreen and if interested, we’d encourage you to apply to become a member of our examining team. Thank you.
Questions you may want to think about
- How can you use these insights to prepare your learners for exams?
- Do your internal assessments reflect the approach of the exam? To what extent do you want them to?
- What’s the most important or surprising thing that you’ve learned? How might it influence your teaching?