Unit Award Scheme
116061 COMPUTATIONAL THINKING: UNDERSTANDING NUMBER BASES
| In successfully completing this unit, the Learner will have | | Evidence needed |
|---|---|---|
| acquired an understanding of | | |
| 1 | different number bases, ie binary, denary (decimal) and hexadecimal, and why they are used in computers | Student completed work |
| demonstrated the ability to | | |
| 2 | convert between binary and denary, binary and hexadecimal, and denary and hexadecimal | Student completed work |
| 3 | add together three binary numbers up to a maximum of 8 bits | Student completed work |
| shown knowledge of | | |
| 4 | how the binary, denary (decimal) and hexadecimal number systems can be used to represent whole numbers | Summary sheet |
| 5 | the principles of binary addition | Student completed work |
| acquired an understanding of | | |
| 6 | the purpose of data and instructions in computer systems being represented in binary form, and why hexadecimal is often used in computer science | Summary sheet |
| 7 | how a computer recognises data as a series of 1s and 0s | Summary sheet |
| 8 | what a bit is when working with binary in a computational thinking context | Summary sheet |
| 9 | how the binary, denary (decimal) and hexadecimal number systems work. | Summary sheet |
All outcomes recorded on an AQA Summary Sheet
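The conversions and binary addition named in outcomes 2 and 3 can be sketched in code. The Python below is illustrative only: the function names are assumptions, not part of the unit specification, and the 8-bit wrap-around in the adder is one possible convention for handling results that overflow 8 bits.

```python
def denary_to_binary(n: int) -> str:
    """Convert a denary (decimal) whole number to a binary string."""
    return format(n, "b")

def denary_to_hex(n: int) -> str:
    """Convert a denary whole number to an upper-case hexadecimal string."""
    return format(n, "X")

def binary_to_denary(bits: str) -> int:
    """Convert a binary string back to a denary whole number."""
    return int(bits, 2)

def add_binary(*operands: str) -> str:
    """Add binary numbers (outcome 3 uses three of up to 8 bits each).

    The result is kept to 8 bits here; a carry out of the top bit
    wraps, which is an assumption about how overflow is treated.
    """
    total = sum(int(b, 2) for b in operands)
    return format(total & 0xFF, "08b")
```

For example, `binary_to_denary("1010")` gives 10, `denary_to_hex(255)` gives `"FF"`, and adding `00010101`, `00100110` and `00000111` (21 + 38 + 7 = 66) gives `01000010`.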
Approved 9 August 2021
Level - Level Two