Author Archives: Drew

Perimeter Institute – EinsteinPlus 2016 – Day 3⤴

from @ stuckwithphysics.co.uk

Day three began (after breakfast) with a session on Quantum Mechanics. The session was based around the 'Investigating the Nature of the Electron' activity from the Perimeter Institute's materials on 'The Challenge of Quantum Reality'.

[Image: IMAG1130]

The first task, 'Classical Particle Behaviour', uses very simple apparatus - sand and a paper coffee cup, to model the behaviour of particles passing through two narrow slits - Young's slits experiment. The task asks students to make a prediction of what they will see, encouraging them to explain their reasoning before continuing with the procedure of passing a small amount of sand through two narrow slits cut into the base of the cup.

As expected, two small piles of sand are obtained...


Perimeter Institute – EinsteinPlus 2016 – Day 2⤴

from @ stuckwithphysics.co.uk

Day 2 of EinsteinPlus 2016 saw the group formally welcomed to the spectacular Perimeter Institute building after an equally spectacular breakfast. (There are two excellent bistros at PI, which provided the group with a fabulous range of meals over the week long visit. I'd say more, but there'd be a real danger of this becoming a food blog...)

The morning session was split into two -

  • Cosmology - this used an existing PI activity, 'The Signature of the Stars', from their educational resource 'The Expanding Universe'. Using diffraction glasses, observations were made of line spectra from a variety of gas discharge lamps. These spectra are used to identify the elements present in stars, both in the Milky Way and in distant galaxies. The spectra of light from distant galaxies show the same spectral lines as stars in our galaxy, but the lines appear in slightly different positions, at longer wavelengths. This effect, known as red shift, occurs because the galaxies are moving away from us, and from each other, at high speeds. Measuring the red shift of a galaxy gives its speed, which relates in turn to its distance from us. This effect was first observed in the early 20th century and used to formulate Hubble's Law, which states that not only is the universe expanding, but the further away from us a galaxy is, the faster it is moving. The activity includes data allowing the red shift of a range of galaxies at known distances to be used to find their speeds. This data is then plotted to give a graph representing Hubble's Law, which yields an approximation of the Hubble constant and can in turn be used to find the age of the universe.

  • Gravitational waves - this used a newly developed activity based around the recent detection of gravitational waves at the Laser Interferometer Gravitational-Wave Observatory (LIGO) facilities in the USA. The facilities use extremely large scale (~4 km) laser interferometers to measure the incredibly small expansions or contractions (~10⁻¹⁹ m, around 10,000 times smaller than the diameter of a proton) of the devices which occur when gravitational waves pass. There are many areas of physics and engineering involved in the development and operation of the LIGO detectors, from the solutions to Einstein's General Relativity which predicted the existence of gravitational waves, to the intricate suspension of the mirrors used to improve the sensitivity of the detectors - developed at the University of Glasgow. The activity centres on the properties of waves and their behaviour when they undergo reflection, beginning with demonstrations of mechanical waves using a slinky. Observations of phase change upon reflection were built upon and related to the operation of the interferometers at LIGO. These ideas were then used in a hands-on activity to simulate the paths of the laser light at LIGO, in order to find the nature of the light detected when the device is unstretched (no gravitational wave) and stretched. This task offers an excellent opportunity to link this part of the Advanced Higher physics unit on waves to a context involving real, cutting edge physics.
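The Hubble's Law plot at the heart of the cosmology activity can be sketched in a few lines of code. The galaxy data here are purely illustrative (not the values from the PI materials), and the v = cz conversion assumes small redshifts:

```python
# Sketch of the Hubble's Law analysis: recession speed v = c*z (valid for
# small redshifts), then a straight-line fit of v against distance d gives
# the Hubble constant H0, and 1/H0 gives a crude age for the universe.
# The galaxy data below are illustrative, not the PI activity's values.

C = 3.0e5  # speed of light in km/s

# (distance in Mpc, measured redshift z) - hypothetical example values
galaxies = [(50, 0.0117), (100, 0.0233), (200, 0.0467), (400, 0.0933)]

speeds = [(d, C * z) for d, z in galaxies]  # v = c*z, in km/s

# least-squares gradient through the origin: H0 = sum(d*v) / sum(d^2)
H0 = sum(d * v for d, v in speeds) / sum(d * d for d, _ in speeds)
print(f"H0 ≈ {H0:.0f} km/s per Mpc")

# crude age estimate: t = 1/H0, after converting Mpc to km
MPC_KM = 3.086e19                 # kilometres in a megaparsec
age_s = MPC_KM / H0               # seconds
age_yr = age_s / 3.156e7          # seconds in a year
print(f"age of universe ≈ {age_yr:.2e} years")
```

With these made-up numbers the fit gives roughly 70 km/s per Mpc and an age of about 14 billion years, which is the kind of result the activity's real data set is designed to produce.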

[Image: LIGO unstretched] [Image: LIGO stretched]

After lunch, there followed two more sessions -

  • Neutrino Detection - another new activity, this is based on the Nobel Prize-winning work of Professor Art McDonald and his team at the Sudbury Neutrino Observatory (SNOLAB). The session began with an overview of the production of neutrinos in the sun and the difficulty in detecting these particles - the 'Solar Neutrino Problem'. The session continued with a description of the facility at SNOLAB and a hands-on task modelling the detector using marbles, cardboard boxes and a baking tray. There was a great deal of discussion about this task and the nature of the model used to describe and explain neutrino detection. Consequently there was a shortage of time for the remaining tasks, dealing with real data from SNOLAB and the theory of 'neutrino oscillation'.

  • Dark Matter - this session used the 'Dark Matter Within a Galaxy' activity from 'The Mystery of Dark Matter' materials. The activity begins with a revision of the basic rules for circular motion and gravitation, using a range of data to find and plot the orbital speed of a star against its radius from the centre of its galaxy. These values, calculated from classical theory, do not compare well with observational data - implying that there must be more mass in these systems than we can detect - Dark Matter. Whilst part of the underlying physics of this task, circular motion, is beyond the scope of the Higher physics course in Scotland, it would make a fair data handling exercise to exemplify and reinforce the very brief mention of Dark Matter in the 'Our Dynamic Universe' unit.
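The classical prediction at the centre of the Dark Matter activity can be sketched as follows. The enclosed mass and radius below are rough illustrative values of my own, not figures from the PI data set:

```python
import math

# Classical prediction used in the 'Dark Matter Within a Galaxy' activity:
# for a star in a circular orbit of radius r, gravity supplies the
# centripetal force,
#   G*M*m / r^2 = m*v^2 / r   =>   v = sqrt(G * M(r) / r)
# where M(r) is the mass enclosed within the orbit.

G = 6.67e-11  # gravitational constant, N m^2 kg^-2

def orbital_speed(enclosed_mass_kg, radius_m):
    """Orbital speed predicted by circular motion + gravitation."""
    return math.sqrt(G * enclosed_mass_kg / radius_m)

# Rough illustrative values for a Sun-like orbit in the Milky Way:
M_enclosed = 2.0e41   # ~1e11 solar masses enclosed, in kg (assumed)
r = 2.5e20            # ~8 kpc, in metres (assumed)

v = orbital_speed(M_enclosed, r)
print(f"v ≈ {v / 1000:.0f} km/s")
```

Observed orbital speeds stay roughly this high even at much larger radii, instead of falling off as the visible mass would predict - the discrepancy that implies dark matter.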

The final session of the day was a keynote presentation delivered by Professor Avery Broderick from the University of Waterloo on the Event Horizon Telescope (EHT). This program uses nine existing telescopes across the globe and applies a technique known as Very Long Baseline Interferometry (VLBI) to improve the resolution at which images of very small objects can be made.

It is hoped that by improving the resolution for existing telescopes and including planned future telescopes in the gathering and processing of data, the EHT will obtain the first direct images of the event horizon of a black hole in our galaxy. Recent observations in the constellation of Sagittarius are thought to indicate the presence of a black hole with a mass around 4 million times that of our sun. This black hole is of the right size and at the right distance for the EHT to be able to make observations that will allow an image to be obtained in the next few years.

The data gathered and images obtained by the EHT will allow for further testing of Einstein's theory of General Relativity, and provide a greater understanding of phenomena such as black hole accretion and plasma jets.

After this presentation and another excellent meal the group was given a tour of the Perimeter Institute building, offering an insight into how the facilities have been designed and developed in order to attract and facilitate the work of some of the world's foremost theoretical physicists (not to mention a very large number of teachers and students).

A selection of images of the building will be included in a gallery as soon as I figure out how to make it work...

 

Perimeter Institute – EinsteinPlus 2016 – Day 1⤴

from @ stuckwithphysics.co.uk

Earlier in the year I was delighted to receive an invitation to the annual Einstein Plus teachers' summer school at the Perimeter Institute in Waterloo, Ontario.

A total of forty two teachers from across Canada, USA, Europe and Asia attended the event, comprising workshops, lectures and visits from the 6th to the 12th of July.

Having registered and settled into our accommodation for the week, the first evening was given over to a delicious meal followed by a session of group 'ice breaker' tasks at the Hawk's Nest, Wilfrid Laurier University - one of two universities in the city of Waterloo.

One of the tasks was on the 'Process of Science' and involved the use of some small wooden cubes. Five of the six faces each carried a name and two numbers, as shown in the images below.

[Image: Cube 1] [Image: Cube 2]

Groups were asked to make observations, try to identify relationships and then predict what could appear on the blank face of the cube. Each group was able to complete all of these steps, but interestingly no two groups came up with exactly the same solution. All of the suggestions given were equally valid, given the evidence on which they had been based. This task made an excellent introduction to the idea of scientific modelling, a theme which would be returned to throughout the week.

Newton’s G-ball⤴

from @ stuckwithphysics.co.uk

'Newton's G-ball', marketed by Swedish company Mollic, is a simple electronic timing device which can be used to measure the freefall time from its point of release to impact on a surface below.

It is available from a number of third party suppliers, including djb microtech and Better Equipped in the UK and Arbor Scientific in the US.

[Image: G-ball]

The ball has an integral centisecond timer, which is primed by pressing and holding the button on the face of the timer. Releasing the ball starts the timer, which stops when the ball impacts upon a surface below.

If the height, h, through which the ball falls is known, and the time for the ball to fall, t, is measured, then g can be calculated using the formula -

g = 2h/t²

Taking multiple measurements of the freefall time, t, over a range of heights, h, allows a range of values to be obtained for g.

The results below were obtained by my Higher Physics class on 9th June 2016.

h (m)  t1 (s)  t2 (s)  t3 (s)  mean t (s)  g (m s⁻²)
0.2 0.23 0.22 0.27 0.240 6.94
0.4 0.30 0.27 0.30 0.290 9.51
0.6 0.39 0.35 0.38 0.373 8.61
0.8 0.42 0.39 0.41 0.407 9.67
1.0 0.46 0.48 0.46 0.467 9.18

The results obtained are reasonably good, giving a mean value of g = 8.79 m s⁻². Whilst this is in reasonably close agreement with the value of 9.8 m s⁻² quoted in the SQA data tables, discounting the obviously low value obtained for h = 0.2 m gives an improved mean of g = 9.24 m s⁻².
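For anyone wanting to check the arithmetic, the table above can be reproduced in a few lines. This is just a sketch using the class data as given:

```python
# Reproducing the table above: g = 2h/t^2 for each height, using the mean
# of the three timings, then averaging the five values of g.

data = {  # h (m): [t1, t2, t3] in seconds, as measured by the class
    0.2: [0.23, 0.22, 0.27],
    0.4: [0.30, 0.27, 0.30],
    0.6: [0.39, 0.35, 0.38],
    0.8: [0.42, 0.39, 0.41],
    1.0: [0.46, 0.48, 0.46],
}

g_values = []
for h, times in data.items():
    t_mean = sum(times) / len(times)
    g_values.append(2 * h / t_mean ** 2)   # g = 2h/t^2

mean_g = sum(g_values) / len(g_values)
print(f"mean g = {mean_g:.2f} m s^-2")     # ≈ 8.79, dragged down by the h = 0.2 m run
```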

A quick analysis of the uncertainties in this data gives the following -

Uncertainties in height, h (approximate reading/position uncertainty = ± 0.02 m)

h (m) uncertainty in h (m) % uncertainty in h 
0.2 0.02 10%
0.4 0.02 5%
0.6 0.02 3%
0.8 0.02 3%
1.0 0.02 2%

Uncertainties in time, t -

h (m)  t1 (s)  t2 (s)  t3 (s)  mean t (s)  random uncertainty in t (s)  % uncertainty in t
0.2 0.23 0.22 0.27 0.240 0.017 7%
0.4 0.30 0.27 0.30 0.290 0.010 3%
0.6 0.39 0.35 0.38 0.373 0.013 4%
0.8 0.42 0.39 0.41 0.407 0.010 2%
1.0 0.46 0.48 0.46 0.467 0.007 1%

Uncertainties in g -

g (m s⁻²): 6.94, 9.51, 8.61, 9.67, 9.18
mean g (m s⁻²): 8.79
random uncertainty in g (m s⁻²): 0.55
% uncertainty in g: 8%
absolute uncertainty in g (m s⁻²): 0.70

This gives a final value for g using this procedure as -

g = (8.79 ± 0.70) m s⁻²
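As a cross-check, the uncertainty treatment can be scripted. The random uncertainty in repeated readings is taken as (max − min)/n, the standard SQA treatment; combining it in quadrature with a representative ~5% height uncertainty is my assumption about how the overall 8% figure was reached:

```python
# Estimating the uncertainty in the mean value of g from the five results.

g_values = [6.94, 9.51, 8.61, 9.67, 9.18]

mean_g = sum(g_values) / len(g_values)                        # ≈ 8.78

# random uncertainty = (max - min) / number of readings
random_unc = (max(g_values) - min(g_values)) / len(g_values)  # ≈ 0.55

pct_random = 100 * random_unc / mean_g                        # ≈ 6%
pct_height = 5.0  # ASSUMED representative % uncertainty in h, from the table

# combine the percentage uncertainties in quadrature (my assumption)
pct_total = (pct_random ** 2 + pct_height ** 2) ** 0.5        # ≈ 8%

abs_unc = pct_total / 100 * mean_g                            # ≈ 0.70
print(f"g = ({mean_g:.2f} ± {abs_unc:.2f}) m s^-2")
```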

However, an alternative graphical analysis allows an improved result to be obtained from the same data.

For this approach, the formula above was rearranged for h, giving -

h = ½gt²

A graph was plotted of h against t², giving a good approximation of a straight line through the origin, as expected.

t² (s²)  h (m)
0.0576 0.2
0.0841 0.4
0.1394 0.6
0.1654 0.8
0.2178 1.0

[Graph: h plotted against t², with linear trendline]

Using the trendline function in Excel, a best fit line was added with its equation displayed. The gradient of this straight line, which is equal to ½g, is 4.91, giving a value of g from this graph of 9.82 m s⁻².

Further analysis of the graph, using the LINEST function in Excel, gave the following uncertainties -

gradient: 4.91
uncertainty in gradient: 0.33
% uncertainty in gradient: 7%
g (m s⁻²): 9.82
absolute uncertainty in g (m s⁻²): 0.69

This graphical treatment of the data gives a final value for g using this procedure as -

g = (9.82 ± 0.69) m s⁻²
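Excel's trendline is an ordinary least-squares fit, so the gradient can be verified directly from the tabulated data. A minimal sketch:

```python
# Least-squares fit of h against t^2 (what Excel's trendline does).
# Since h = (1/2) g t^2, the gradient of the line is g/2.

t_sq = [0.0576, 0.0841, 0.1394, 0.1654, 0.2178]  # t^2 (s^2)
h = [0.2, 0.4, 0.6, 0.8, 1.0]                     # h (m)

n = len(h)
mean_x = sum(t_sq) / n
mean_y = sum(h) / n

# standard least-squares gradient: S_xy / S_xx
s_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(t_sq, h))
s_xx = sum((x - mean_x) ** 2 for x in t_sq)

gradient = s_xy / s_xx   # ≈ 4.91
g = 2 * gradient         # gradient = g/2, so g = 2 * gradient
print(f"gradient = {gradient:.2f}, g = {g:.2f} m s^-2")
```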

I have included the raw data, graphical treatment and uncertainties in the Excel file below.

g ball

Model Linear Accelerator⤴

from @ stuckwithphysics.co.uk

Last June, I made a model particle accelerator using a plastic salad bowl. This was courtesy of a great 'cook-along' online CPD via Google Hangouts hosted by a fellow IOP Network Coordinator, Dan Cottle (@blendedlearner). Dan's video can be seen here - https://www.youtube.com/watch?v=4yU7rzgrg6A

This tweet from @ArdAcadPhysics shows the accelerator in action -

After building my own I was left with a huge surplus of aluminium tape, graphite paint and polystyrene balls, so I decided to run some CPD sessions to build it with other physics teachers at my own IOP CPD events.

The last of these sessions was at the ASE Scotland Conference in Aberdeen at the beginning of March. At my session, one of the attendees asked if it would be possible to make a linear accelerator along the same lines. At the time the idea hadn't occurred to me, but I mulled it over during the following weeks until I found a bit of time (and the need to try it out with my own Higher physics class).

Using a bit of guttering my technician had in the stores and a lot of my surplus aluminium tape (I still have half a roll), my colleague Kenny Bell (@ArdAcadPhysics) and I, through a process of trial and error, managed to put together a working model of a linear accelerator.

Like the salad bowl accelerator, this uses alternate strips of aluminium tape connected to a Van de Graaff generator - one side to the negative dome, the other side to the earth terminal.

[Image: IMAG0848]

A thin strip of aluminium tape was run along each edge of the guttering to provide common 'rail' connections from each of the terminals on the Van de Graaff. The dome is connected to the side furthest from the camera in the image above. This in turn connects to the end from which the polystyrene balls start their acceleration, which is capped and has aluminium tape over the end cap and the base of the gutter. (Care was taken to avoid having the tape touch the earth rail, which would cause a short circuit.)

 

[Image: IMAG0845] [Image: IMAG0846]

Alternate strips of aluminium tape were connected to each of the edge rails to give alternately negatively charged (connected to the dome) and neutral (earth) electrodes. Initially electrodes of different widths were tried, but this arrangement was unsuccessful.

[Image: linac]

With the connections made, a polystyrene ball coated in conductive graphite paint was introduced at the capped end of the accelerator and the Van de Graaff generator was switched on.

A bit of tinkering was required to get the ball to accelerate along the warped guttering - achieved mainly by clamping it down to the desk. Further tinkering allowed it to be used to compare the acceleration of larger polystyrene balls with the smaller ones.

More discussion led to the idea of using the linear accelerator to inject particles into the circular accelerator. Again, Kenny Bell offered invaluable assistance in achieving this.

If you want to use my example as a starting point to build your own, please feel free to do so. If you have any questions, please get in touch via the comments below, or tweet @PhysicsDrew.

Response to SQA Consultation on Assessment⤴

from @ stuckwithphysics.co.uk

The SQA have released a consultation on the assessment arrangements for the new qualifications, which can be found here - 

Having blogged recently on the subject of assessment, I have decided to publish my own response.

[SQA questions shown in bold, my responses below]

1. Which subject(s) do you deliver?

Physics

 2. It was intended that Units in new National Courses should have both fewer Outcomes and Assessment Standards and that those Outcomes should be expressed in broader terms than the Units in previous National Courses. This was to give practitioners the freedom to decide how to assess the Units. 

How has this worked in your subject(s)?

Not at all

3. In SQA-produced Unit assessment support packs, three approaches to assessment have been suggested — Combined, Unit-by-Unit and Portfolio.

[detail of approaches omitted here]

What has been the most common approach in your subject/s and why?

Unit-by-Unit - staff are incredibly overworked and do not have the time to develop assessment material from scratch, especially when the assessment standards are so opaque. Preparing such materials and ensuring they meet the pre-verification standards is not generally considered an easy process, so the most sensible decision is to use the materials prepared and provided by the SQA.

What are the challenges in using the other approaches and why?

The recording and administration of the outcomes and assessment standards achieved for every pupil in every certificate class, sometimes at two levels, creates an incredible burden under the unit-by-unit approach. This would not become any easier by breaking it up into a larger number of smaller assessment tasks.

4. Unit and Course assessment have separate and different purposes in new National Courses.

Is there duplication of assessment across Unit and Course assessment in your subject(s)?

Yes

If yes, please give details:

The UASP materials assess pupils with items that are significantly different in style from the final exam, using entirely different marking instructions that punish any and all errors, with no credit given for partially correct answers. This gives candidates no useful information about their progress and allows for no constructive feedback other than 'the SQA say your answer is incorrect'. This has a huge negative impact on the student. Unit A/B tests use exam-style questions and the same marking instructions as the final exam, allowing students to get an idea of their progress judged against the same criteria as their final grade. This also allows students to receive constructive feedback to help them improve.

5. How might any opportunities to use evidence from one assessment to meet one or more of the requirements in another assessment in your subject(s) be achieved?

This already happens in the problem solving component of physics unit assessments. Each of the four strands of PS needs to be achieved only once, across any one of the three unit assessments. This may mean a student answering only one such question correctly throughout the whole course, so it is not necessarily a useful approach.

[I'm not convinced I have understood your question correctly - if it doesn't mean what I think it did, I apologise for not having deciphered it correctly]

6. What implications does the requirement to meet all Assessment Standards in a Unit have for assessment and also for re-assessment in your subject(s)?

In physics, only the Knowledge and Understanding (KU) assessment standards have to be met in all units - the problem solving (PS) can be met at any point across any of the unit assessments.

The marking instructions allow no flexibility or partial credit (responses are either correct or incorrect) with the necessity for particular details often making it difficult for candidates to answer correctly, though the essence of their answer is sound. The issue is not the tasks, rather it is the manner in which they are judged.

The difficulty of meeting the requirements is further compounded by the insistence that candidates be given only two attempts. If they are unsuccessful on the second attempt they cannot be allowed to continue and be given a third attempt unless in 'exceptional circumstances'.

In general, the assessments are less of a 'hoop to jump through' than they were in the old courses, and more of an 'obstacle to negotiate'. Nor are they easy obstacles for many candidates.

7. To what extent have you developed your own Unit assessments?

i) Why did you adopt this approach?

None - I have only corrected the many mistakes and reformatted them into usable, write-on papers. The process of preparing and presenting our own materials for prior-verification presented too great a workload for staff, especially when there was very little guidance given and no guarantee that multiple redrafts and resubmissions would not be required. There was no telling what timescale this might involve, and assessments were needed by candidates during their progress through the courses.

8. Have you used digital evidence or e-assessment in the internal assessment of Units in your subject(s)?

No

9. Are there any other ways we could approach the internal assessment of Units in the future?

Yes

If yes, please give details:

Provide e-Assessment that meets your standards, gives credit for partially correct responses, automatically logs elements that are 'passed' to be logged against an SQA candidate number, and is dynamic enough to allow reassessment to be tailored to only the key areas that need to be reassessed for each candidate. Not 'writing-off' candidates after two attempts would be fairer, too. Basically an approach that allows teachers to do their job of teaching, whilst shifting the burden of assessment onto the assessment body.

 

 

 

SQA Consultation on Reforming Assessment⤴

from @ stuckwithphysics.co.uk

In an acknowledgement that there are 'issues' with assessment, the Scottish Qualifications Authority (SQA) have opened an online consultation.

Their survey can be found here - https://www.surveymonkey.com/r/TFJQCX2

Having produced a few posts recently on the subject of assessment I have been keen to respond to the survey. Once I have checked to make sure it won't cause any trouble, I shall publish my response in another post.

I would urge every teacher in the country who has ever expressed any concern over the assessment arrangements for the new qualifications to take the time (it's not quick) to make as full a response to the survey as they are able.

How we could reform assessment and certification⤴

from @ stuckwithphysics.co.uk

In my recent post 'Why we need to reform assessment', I outlined a number of issues which give me concern over the assessment of SQA National 3-5, Higher and Advanced Higher courses, introduced as part of the delivery of Curriculum for Excellence.

Whilst there may be many teachers who would wish for a return to the simpler assessment arrangements of the Intermediate 1 & 2, Higher and Advanced Higher qualifications of the 'Higher Still' era, which the CfE courses have replaced, I feel that one of the major shortcomings of CfE is its failure to do anything to fundamentally change the nature or purpose of assessment.

The new system, as with all of its predecessors, places almost the entire value of the certified qualifications on the terminal summative assessment of the course - the exam. Though some courses have significant elements of coursework, and many include an extended project or research task, still the majority of what our students, schools and staff are judged upon happens in a narrow time frame of a few hours at the end of almost an entire year of study. The results of these high stakes assessments supersede those for the individual unit assessments completed during the courses studied, rendering them effectively worthless.

In addition to this, there is the issue of the complexity involved in marking and recording the results of the unit assessments (which I outlined in 'Why we need to reform assessment') which makes it difficult for students to understand whether they have passed units or not and increases the administrative burden on teachers.

In order to overcome these issues I propose the following changes to the methods of assessment and certification.

Unit Assessment via online e-Assessment

The SQA currently makes use of its own system for online e-Assessment, SQA Solar, for a range of courses across Nationals, Higher, HNC etc. Centres and candidates have unique, secure logins ensuring security of the assessment, and the system allows assessments to be scheduled at a time when the student is ready to be assessed.

This system could be expanded to incorporate all unit assessments in all subjects at all levels, and could be set up so that students' performance could be recorded against the many individual criteria necessary to achieve a pass in a given unit. Any reassessment required could automatically be tailored to the specific areas not achieved at the first attempt. Given a sufficiently large bank of assessment items, or a sufficiently adaptable format allowing numerical data to change for calculation-based questions (as it does on Heriot-Watt University's Scholar VLE), it might be possible for students to make multiple attempts at assessments until the required standard is reached.

As the system is fully automated, this would free up teachers' time for teaching and supporting their students' learning, rather than using it for the bureaucratic administration of data. It would also reduce the 'data chase' required to ensure that SQA data is kept up to date on school MIS systems for transfer to SQA systems.

'Points' allocation and certification for internally assessed components -

Most courses have individual unit assessments which must be passed by students in order for them to achieve a grade in the final examination. Although these unit passes are included on students' certificates, there is no explicit value placed upon them in comparison to the exam grade achieved. By allocating all components of all courses a number of points at the relevant SCQF level, students could potentially build up points across a number of courses whilst being able to choose whether or not to sit the final examination. This would reduce the 'high stakes' nature of the final examination, and allow for students, departments and schools to be judged and compared over the full range of their performance.

Points allocation for units could be based on the 'size' of the units, whilst exam grades could be allocated points determined by the band of pass. In my own subject, Physics, for example -

N5 - points awarded at SCQF level 5

3 x units, each with 10 points = 30 points,

Exam grade bands - A1 = 30 points, A2 = 25 points, B3 = 20 points, etc

Higher - points awarded at SCQF level 6

2 x full unit, each with 10 points + 2 x half unit, each with 5 points = 30 points,

Exam grade bands as for N5
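The proposed scheme could be computed very simply. In this sketch, the grade bands beyond those quoted above (A1 = 30, A2 = 25, B3 = 20) are my own illustrative extrapolation of the "etc", not part of the proposal:

```python
# Sketch of the proposed points allocation, using the Physics examples in
# the post: each passed unit carries points at its SCQF level, and sitting
# the exam adds points according to the grade band achieved.
# Bands below B3 are ASSUMED values, extrapolating the quoted pattern.

GRADE_POINTS = {"A1": 30, "A2": 25, "B3": 20, "B4": 15, "C5": 10, "C6": 5}

def course_points(unit_points, exam_band=None):
    """Total points: all passed units, plus the exam band if the exam was sat."""
    total = sum(unit_points)
    if exam_band is not None:
        total += GRADE_POINTS[exam_band]
    return total

# N5 Physics: 3 units of 10 points each, exam passed at A1
print(course_points([10, 10, 10], "A1"))   # 60

# Higher Physics: 2 full units (10) + 2 half units (5), exam not sat
print(course_points([10, 10, 5, 5]))       # 30
```

The second example illustrates the point of the proposal: a student who chooses not to sit the final exam still banks the unit points, rather than the units being rendered effectively worthless.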

Revision and separate certification of assessed course 'Added Value' units and 'Assignments' - 

Many courses have an internally assessed 'Added Value' unit, which at N4 has to meet every one of a significant number of individual criteria. Teachers are allowed to provide feedback to students in order to modify their submissions so that these criteria can be met.

The equivalent component of most N5 courses is an externally assessed 'Assignment', a formal report which is completed 'under close supervision' after a period of research which may include practical experimental work. Though guidance is given to students from their teachers, no feedback may be given on the report produced which is sent to the SQA to be assessed. The final mark for the assignment, given out of 20, forms a small proportion of the final score and hence the final grade.

These arrangements are much more demanding for an N4 student, who may well find the task more challenging than most N5 students would. A poorly completed N4 AVU would not meet all of the criteria, resulting in the student failing the unit and consequently receiving no overall award for the course. A poorly completed N5 Assignment carries no such penalty: it simply gives the student a lower final score, without denying them an overall pass.

Revisions should be made to the assessment of AVU tasks to make them fairer on the students. Perhaps an AVU could be considered to have been passed if a significant proportion of the criteria for the task, say 10 out of 15, were met by the student.

In addition to the significant differences in the assessments of these equivalent tasks across SCQF levels, AVU and assignment tasks are often very similar in related subject areas. This results in significant duplication of effort and repeated assessment of skills across a number of a student's subjects.

By assessing these tasks on a skills basis, rather than within subjects, a single AVU or assignment could be completed by a student studying more than one science, or social subject. Students could choose which subject or subjects their assignments could cover, potentially allowing more meaningful, challenging, inter-disciplinary work to be undertaken. Though this might make the assessment of students' reports more complicated, it might offer an opportunity to make the assessment criteria more flexible, as they are for the Baccalaureate qualifications undertaken by some students in S6. If nothing else, a reduction in the number of these tasks would significantly reduce the workload on students, and ease the burden on the SQA, which has found it increasingly difficult to recruit sufficient markers for these tasks since their introduction.

I recognise that these proposals would require significant change to our current systems of assessment and certification, and that the Scottish teaching profession has experienced unprecedented change throughout the development and delivery of Curriculum for Excellence. I further accept that one of the main reasons for avoiding radical change in the exam system has been concern that parents, employers, colleges and universities might not fully understand the significance of new qualifications. In reality, it could be argued that these groups don't fully understand the significance of the current qualifications system, and haven't done so for a long time, if they ever have at all.

On a superficial level, it is easily understood that a student with an 'A' grade in a qualification is in some way 'better qualified' than another with a 'C' grade in the same subject, and that a student with five Higher passes is 'better qualified' than another with three Highers and two National 5s. But unless one has recently studied a course, or taught it, there is little chance of understanding what knowledge and skills are really involved in gaining such a qualification, let alone how that qualification compares with other subjects or other levels.

It is often argued that we need these qualifications to allow universities to choose between applicants for places on their undergraduate courses. Without wishing to belittle this assertion, it does bear comparison to the 'Sorting Hat' in the Harry Potter novels - e.g. 'AAAAB' at Higher being the minimum requirement for a Law degree (Slytherin?). Increasingly, however, universities apply their own assessment requirements (BMAT, UKCAT exams), conduct entrance interviews, or consider applicants on the broader indicators of their personal statements, reducing their reliance on the crude measurement of 'ability' given by exam results alone.

In many ways the awarding of badges by organisations such as the Boys' Brigade or Scouts to indicate the achievements of their members is a much more understandable form of accreditation. Indeed many professional and vocational qualifications are already 'badgified' in this way using industry standards, against which 'badges' are referenced and accredited. Mozilla, the organisation behind the Firefox web browser, supports such a system for teachers to award 'Open Badges' to their students using 'open standards' - where the criteria for which the badge is awarded are embedded as metadata and awarded digitally. These badges can be electronically attached to a student's digital profile via their blog, Google or other online account, and shared with prospective employers, colleges and universities.

Some work has already been undertaken by the SQA to develop this approach to accreditation, outlined in this press release from 2013, with small-scale projects adopted by some FE colleges, including Borders College, to accredit the work of students and the CPD of staff.

Open badges may not solve all of the shortcomings of our current system; indeed, other, better systems may be in use elsewhere or currently under development. But such a system, if combined with a student's unique Glow account, could stay with them throughout their schooling, through further and higher education, and into employment. The development of such a 'Scottish Learner's Account', integrating assessment, certification and the accreditation of skills, could form the foundation of a truly radical approach to these issues, upon which students at all stages could build throughout the 'Lifelong Learning' that lies at the heart of the Scottish Government's ambitions for the future of education.

Why we need to reform assessment⤴

from @ stuckwithphysics.co.uk

Following on from my post back in May 'Do Exams Pass Under CfE?', I have given the issues of assessment and certification some further consideration, which I outlined in my presentation at this year's Teachmeet SLF 'Breakout' event held at CitizenM, Glasgow back in September. This post is an attempt to summarise and explain the issues which cause me, and many other people in education, huge concern and why I believe assessment must be reformed.

As I outlined in 'Do Exams Pass Under CfE?', the system of assessment and certification has remained largely unchanged despite the significant changes brought to the Scottish education system by Curriculum for Excellence. Course content may have been reworked in most subjects, with many now including an extended research and presentation task (assignment) which contributes a proportion of the final grade, but the framework of unit tests and final exam remains at the heart of how students are assessed.

In many ways what has been put into place for the new CfE National 3-5, Higher and Advanced Higher courses is more demanding, with the unit tests becoming higher-stakes than the NABs they replace: candidates receive only two opportunities to 'pass' these tests unless under 'exceptional circumstances', and cannot receive a grade for the final exam unless all course units have been passed.

In my own subject the old NAB unit assessments, where pupils had to achieve a score of 60% to pass, have been replaced by assessments which are broken down into two main parts -

  • 2.1 Knowledge & Understanding (KU) - which is broken down into individual Key Areas described in the SQA arrangements documentation. To 'pass' this component students must respond correctly to at least half of the questions - i.e. if there are 14 questions, 7 must be answered correctly. If a student doesn't meet this requirement they can be reassessed, but they need only attempt questions from Key Areas that they did not 'pass' in their first attempt. If they do not succeed at a second attempt, they have not met the minimum standard and cannot progress unless there are 'exceptional circumstances' which would allow a third attempt.
  • 2.2 Problem Solving (PS) - which is further broken down into four skills - Predicting, Selecting, Processing and Analysing. In these tasks students must correctly respond to at least half of each type of question in order to 'pass' that problem solving skill - i.e. if there are 6 processing questions, 3 must be answered correctly. Students who don't meet this requirement for each of the problem solving skills do not need to be reassessed, as other unit assessments will provide opportunities to demonstrate the same skills. Each skill need only be 'passed' on one occasion across the three unit assessments.

Teachers giving these assessments must record each student's performance in terms of 'pass' or 'fail' not just for each unit, but for KU and each of the four PS skills within each unit. This applies to courses at all levels from National 3 to Advanced Higher. The collating and recording of students' progress through these assessments is both complex and time consuming. However, still more is required of both students and teachers.
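To make the record-keeping burden concrete, the rules above - the 'at least half' threshold, KU passed in every unit, each PS skill passed at least once across the three units - can be sketched as a small piece of code. This is a minimal illustration under my own assumptions: the function names, data layout and the scores shown are invented for the example, not taken from any SQA documentation.

```python
def passes(correct, total):
    """The 'at least half' rule: e.g. 7 of 14 KU questions,
    or 3 of 6 processing questions, must be answered correctly."""
    return correct * 2 >= total

PS_SKILLS = ["Predicting", "Selecting", "Processing", "Analysing"]

# Hypothetical record for one student across a three-unit course,
# stored as (questions correct, questions asked) per component.
results = {
    "Unit 1": {"KU": (9, 14), "Predicting": (2, 4), "Selecting": (3, 4),
               "Processing": (2, 6), "Analysing": (3, 5)},
    "Unit 2": {"KU": (8, 14), "Predicting": (1, 4), "Selecting": (2, 4),
               "Processing": (4, 6), "Analysing": (2, 5)},
    "Unit 3": {"KU": (11, 14), "Predicting": (3, 4), "Selecting": (2, 4),
               "Processing": (3, 6), "Analysing": (4, 5)},
}

# KU must be passed in every unit...
ku_all_passed = all(passes(*unit["KU"]) for unit in results.values())

# ...but each problem-solving skill need only be passed once
# across the three unit assessments.
skills_passed = {skill: any(passes(*unit[skill]) for unit in results.values())
                 for skill in PS_SKILLS}

print("KU passed in every unit:", ku_all_passed)
print("PS skills passed at least once:", skills_passed)
```

Even this toy version needs fifteen separate pass/fail records for one student in one subject; multiply that by a class of thirty and the clerical load the post describes becomes obvious.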

In all courses except N3, students achieving passes in KU across each unit, and across each of the four PS skills must also complete two further tasks before they can sit the final exam -

  • Outcome 1 - practical experimental report. This task is broadly similar to the LO3 task in the old Higher course, where students perform an experiment and write up a detailed report meeting criteria set by the SQA. This task is broken down into a number of individual outcomes, each of which can be achieved in any number of different activities. Students need only achieve each individual outcome once across the whole course - these too must be recorded by the teacher.
  • Research task - The detailed requirements vary between courses, but in general this is an extended research task which is conducted by all students.
    • At N4, the 'Added Value Unit' (AVU), which is internally assessed, contains a number of individual criteria all of which must be met in order for the student to 'pass' the task and achieve a course award. Students may receive feedback from teachers to ensure all the criteria are met.
    • At N5, students conduct an 'Assignment'. This research task, which may or may not include experimental work, requires them to collate information as they progress through the task. At the end of the 'research phase' of the task, students are required to compile a report, including items demonstrating a variety of information processing and presentation skills 'under a strict degree of supervision'. The student cannot be given any feedback on their report, which is sent to the SQA for external assessment. The assignment report is given a mark out of 20 which counts towards the final grade.
    • At Higher, students complete the 'Researching Physics' half unit within the course. This is assessed internally by teachers against criteria set by SQA and must include evidence of both research and practical work conducted by the students. The Researching Physics unit can be used as the basis for the students' remaining assessment task - the 'Assignment'. As for the N5 assignment, students must compile a report 'under a degree of strict supervision' demonstrating a number of information processing and presentation skills, and no feedback can be given. The completed report is sent to the SQA for external assessment with the mark out of 20 counting towards the final grade.
    • At Advanced Higher the arrangements are similar to those for Higher, though pupils conduct extended practical work as part of their 'Investigation'. This is assessed both internally as a half unit, and externally through their investigation report, which is compiled by the student throughout the task. Students are allowed to be given feedback at all stages of this task.

Only when a student has successfully completed all of the internally assessed components of their course are they allowed to sit the final examination. At the end of all of this detailed and highly involved assessment, the final grade awarded to the student will depend mostly on their performance during the two to two-and-a-half hours spent in the examination hall, with no recognition at all of the tasks that have been successfully completed on the way.

Bearing in mind that students may be following as many as seven N5 courses, in which various other combinations of assessment tasks and arrangements may be in place, there is no doubt that the new CfE courses have significantly increased the burden of assessment on both students and teachers. This is clearly unsustainable and an alternative must be found.

In my next post, I will detail my proposals for reforming the process of assessment to reduce some of this burden, and the certification of courses to allow greater recognition of students' achievements throughout their courses.