Monthly Archives: May 2018

Postcard doodles⤴


It’s been a busy month, and inevitably I’ve not done everything I planned to. But one thing that I have done over the last couple of weeks is to start doing my daily doodles on postcards. Of course, because I sometimes find time to doodle at work and other times at home, I only have half my stack here. But here are some of the ones I like from this month – now safely stashed away to send in the future.

Minister comments on 2016-17 widening access statistics⤴

from @ Engage for Education

Commenting on the publication of the Scottish Funding Council’s Report on Widening Access 2016-17, Further and Higher Education Minister Shirley-Anne Somerville said:

“These figures are a stark reminder of why this Government was right to take the action we did on widening access.

“They show that in the four years up to 2016-17 nothing had changed and, on their own, universities were not making improvement in increasing the percentage of students from the 20% most deprived areas.

“The Commission on Widening Access reported in March 2016, by which time the vast majority of students had already applied for the 2016-17 academic year. So these figures provide a baseline from which to judge how successfully the Commission’s recommendations will be implemented – at a national level but also at an individual institution level.

“More recent figures from UCAS show a 13% increase in the number of Scots from disadvantaged areas getting a place to study at a Scottish university in 2017-18. So we expect to see demonstrable progress next year and beyond.”


  • The Scottish Funding Council report on Widening Access 2016-17 can be viewed on the Scottish Funding Council website.
  • The data show that, in 2016-17, 13.8% of full-time first degree entrants were from SIMD20. For all higher education entrants (including sub-degree and college), 17.7% were from SIMD20.
  • The Commission on Widening Access target, accepted by the Scottish Government, is that, by 2030, students from the 20% most deprived backgrounds should represent 20% of entrants to higher education, with equality of access seen in both the college and university sectors.
  • To drive forward progress towards the 2030 goal, the interim targets are:
    – 16% of full-time first degree entrants to Scottish universities from the 20% most deprived areas (SIMD20) by 2021;
    – 18% of full-time first degree entrants to Scottish universities from the 20% most deprived areas (SIMD20) by 2026;
    – By 2021, an individual institutional target for universities of 10%.
  • The Commission’s final report was published in March 2016. The main UCAS deadline for the 2016-17 academic year was January 2016.
  • The UCAS 2017 End of Cycle report (published December/January) showed a 13% increase in the number of Scots from the most deprived communities getting places to study at a Scottish university in 2017 (4,565 in 2016 to 5,170 in 2017).
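For readers checking the arithmetic, the quoted 13% follows directly from the entrant numbers in the last bullet. A minimal sketch (purely illustrative):

```python
# Sanity check of the quoted increase, using the figures given above:
# 4,565 places in 2016 rising to 5,170 in 2017.
before, after = 4565, 5170
increase_pct = (after - before) / before * 100
print(round(increase_pct, 1))  # ~13.3, consistent with the quoted "13%"
```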

The post Minister comments on 2016-17 widening access statistics appeared first on Engage for Education.

Episode 7 – Unlocking Leadership Potential⤴


In this episode of EduBlether, we interview @joycematthews_ about Unlocking Leadership Potential. We also have our usual features: In the News, We Recommend, and Inspired By. Please check out and rate us on iTunes.

Listen on iTunes or via SoundCloud.

Osiris Educational⤴


Osiris Educational are a professional learning body offering some fantastic ‘cutting edge’ programmes. Check them out here:

RE: 💬 Three Ways to Keep Track of Students’ Blog Entries⤴

from @ wwwd – John's World Wide Wall Display

Three Ways to Keep Track of Students’ Blog Entries by Aaron Davis
This is one of the big challenges with student blogging. When I used Edublogs in the classroom, I would moderate everything, therefore I would know what is being posted that way. However, I have been wondering lately about the idea of creating a formula in Google Sheets using IMPORTFEED where each n...

In Glow Blogs, we have the Glow Blogs Reader (Follow Blogs)

The Glow Blogs Reader allows you to ‘follow’ a number of Glow Blogs. By following blogs, you can see in your dashboard which of them have been updated, rather than having to visit each site to check for updates.

This is useful because: 1. it allows you to follow private blogs, which an RSS reader will not; and 2. for teachers unfamiliar with RSS and feed readers, it is a lot simpler.

It doesn’t, however, have the facility to mark off or record posts that you have commented on, which is the feature of interest to Aaron.
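Under the hood, what a feed reader (or Aaron’s Google Sheets IMPORTFEED idea) does is fetch each followed blog’s RSS feed and list its recent items. Here is a minimal sketch in Python using only the standard library; the sample feed and URLs are invented for illustration, and this is not how Glow Blogs or Google Sheets actually implement it:

```python
import xml.etree.ElementTree as ET

def latest_entries(rss_xml, limit=5):
    """Return (title, link) pairs for the newest items in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    # RSS 2.0 puts items under <rss><channel><item>...
    items = root.findall("./channel/item")
    return [(item.findtext("title"), item.findtext("link"))
            for item in items[:limit]]

# A tiny sample feed, invented for illustration:
sample = """<rss version="2.0"><channel>
  <title>Class Blog</title>
  <item><title>My holiday</title><link>http://example.com/1</link></item>
  <item><title>Our trip</title><link>http://example.com/2</link></item>
</channel></rss>"""

print(latest_entries(sample))
```

In a real reader you would fetch each feed URL on a schedule and compare against the items seen last time; marking off posts you have commented on would need an extra store of your own, which is exactly the gap noted above.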


Testing Times for Scotland⤴

from @ School Leadership - A Scottish Perspective

'These are not high stakes tests; there will be no 'pass or fail' and no additional workload for children or teachers.' John Swinney 25/11/16

I start this look at the introduction of the Scottish National Standardised Assessments (SNSAs) with the statement above from John Swinney, the Deputy First Minister and Cabinet Secretary for Education and Skills, made when he announced that the contract for our new standardised testing had been awarded to ACER International UK Ltd. This organisation is a subsidiary of the Australian Council for Educational Research (ACER), which has been responsible for the development of the National Assessment Program – Literacy and Numeracy (NAPLAN) regime of high-stakes testing in the Australian system since 2008. I also believe they were one of a very short list of providers who tendered a bid for this contract.

I was drawn to this statement as I reflected on many of the responses I received after I put out a request on Twitter asking people to get in touch about their experiences with the new standardised tests as they are introduced across our schools. I sit on the board of Connect (formerly the Scottish Parent Teacher Council), and the issue of the new tests had been raised at a recent board meeting. I said I would gather more information for Connect, so that we would be able to offer advice to parents on the new testing regime and hopefully allay some of their fears.

What quickly emerged was a very mixed picture in how the tests were being used across Scotland, but there was a commonality in the types of experiences children, teachers and schools were having, and it very much flew in the face of Mr Swinney's assurances given at the outset of their development.

'regarding SNSAs…Where do I start? I have had 3 children I have spent all year working with to build self-worth and self-belief, comment that they are ‘no good’, ‘useless’ and then cry. I have had one child who decided to guess most of the numeracy questions, and got them correct! (Lies, damned lies and statistics!) Most frustratingly, I am a class teacher administering the tests in class using 2 ipads and a desk top. Class of 27=81 tests. Huge impact on learning and teaching as you can imagine. With so many children suffering from low self-esteem and an increase in mental health issues, why is this happening? I truly despair.’

This was a response from a primary school class teacher, one of many who got in touch, expressing concern about the impact on learners, on the learning going on in their classrooms whilst testing was taking place, and on the implications for their workload. I had a number of similar responses from teachers, school leaders and members of senior management teams.

 ‘ I took some of our P1s for their assessments today. We have 3 P1 teachers, who stayed with their classes, while 3 class teachers 2 learning support teachers and 3 PSAs spent all day doing the assessments 1-1, roughly 20-30 mins per child per literacy test, plus 15-20 mins for numeracy. Aside from the straight salary cost there, imagine the opportunity cost! The tests themselves are (obviously) far too narrow to give a decent picture of a child’s learning, but also seem generic rather than based on the taught P1 curriculum (despite the Scottish accent). The (now legendary) passage on hummingbirds is just ridiculous, I had one wee girl who was becoming so visibly crushed by it that I told her we would just leave it – I couldn’t let her suffer for something so unrealistic. Most of the children were exhausted by them, especially literacy, and certainly schools shouldn’t  have P1 children doing both in one sitting. I have 2 primary age daughters and if they were still in P1 I’d be withdrawing them from these. My opinion is that all the planned primary tests are at best unnecessary and possibly detrimental, but the P1 test seems to be actively harmful and a phenomenal drain on resources to no obvious benefit to the learners.’

This, from another class teacher, backed up what many colleagues were saying about the impacts for learners and teachers, as well as wider school workloads. It was also the first response to query wider system issues with the new testing, such as the cost, the appropriateness of the content and the emotional impact on very young learners. One or two indicated that they felt most of their children weren’t unduly stressed by the tests, as they were able to present them in a fun way, as a quiz or similar, but they still queried some of the content, the usefulness of the outcomes, and the disruption and impact being caused for teachers, children and schools.

 ‘Highlights of the P1 SNSA reading test included a passage on hummingbirds! Hummingbirds??? Vocabulary included hover and perch (and backwards). It also included a question asking what an alternative word for ‘beak’ was. So testing general knowledge then? It is impossible to do with a class of P1. SMT now doing individually, with all 70 plus P1s!!! Aaarghhh!!’

was a reflection of some of the frustrations felt by one headteacher. She went on to add,

‘seems to be the only game in town. I really question the validity of the ‘standardisation’ too. Even within my cluster we have some folks using iPads, some PCs, some testing all day, some only mornings, some individually and some whole class, some folk reading to their p1s instead of using the voice and doing the clicking because their mouse skills are not sophisticated enough. And don’t get me started on the IT and wifi capacity!!’

‘Who does my work while I collect meaningless data for HQ/Scot Govt?’

It would seem that many schools had resorted to senior management teams, Support for Learning teachers and other support staff, where there were any left, to carry out the testing, recognising the impossibility of teachers being able to deliver these tests, especially the P1 ones, whilst still teaching a class. The lack of equipment, and poor ICT systems were cited by many as a frustration and cause of more stress for teachers and young learners.
Another headteacher sent me the following,

‘The torture continues. P7 writing assessment (which in fact is assessing punctuation, grammar and spelling so therefore just the tools of writing) has questions where children asked to correct the spelling of a word. One of my enterprising P7s worked out that if you right click on the answer, the computer will tell you if its correct! Brilliant!’

This story caused a flurry of tweets and incredulity on Twitter and beyond, and also pointed to a concern raised by many: that these tests of ‘literacy’ and ‘numeracy’ did neither. What they assess is some of the skills required to demonstrate literacy and numeracy, but they are not a test of either.

The sense of frustration felt by one Support for Learning teacher is palpable in her response.

‘ SNSA aaaaahhhhhh! As you can imagine this is an extra to what we are all doing. Local authority has decided to do them in May, which is probably a good time of year.
Getting them all logged on, finding the website (the long name) and saving it in favourites takes time in itself. Logging onto the website is laborious for P1 as adult needs to do it as they are so long. OK for most P4 and P7.
P1 pupils need good competent keyboard/generic skills to complete assessments (click and drag, do not double click, etc.) Our screens do not show the ‘Next’ key unless pupils scroll down to find it.
P1 pupils have a lot of pointer movements to make every time they go to the next screen (go to top left to read out instructions then read out questions and possible answers, now find the ‘next’ button etc.)
The guidance says give pupils the same support they would get in class – this is quite subjective. Do you give them the support they DO get or what you would like them to get if there were more staff?? As a teacher I am unsure what is being assessed in some areas. For example is the reading assessing comprehension or decoding?
Teachers cannot do sample assessments.
No text to speech option for P4 and P7 pupils – for pupils who are still developing skills in decoding (only parts of the P1 have speech option)
Font is very small on P4 and P7 assessments – we are all having to peer at the screen.
P1 reading requires them to read or hear about 4 sections of a story before they answer questions – lots of memory rather than find the answer in the text.
Lots of words and names used in P1 assessments that are not decodable using Alphabetic Codes taught in P1.
P1 pupils need lots of support to get through the practice and 2 assessments. We do not have time to do 1 to 1 support so independent working through them digitally may not give correct measure against benchmarks.
‘I was demented this morning. Getting P1s set up. Broken headphones, notebooks with no audio! Eventually got them all working independently and keeping them happy. No idea how they have done. What a palaver! Glad I am retiring early after next session.’

She raises more issues about the validity and content of the tests, all of which had supposedly been tested and piloted extensively before their introduction, and about the technical issues that teachers and schools are having to deal with. Since the introduction of such online testing was first mooted, many of these concerns had been raised by teachers and schools, but it would seem that not a lot of heed was taken of them.

Another class teacher pointed out yet another technical issue that surely could have been resolved before the tests went ‘live’.
‘One of the problems we faced is that the usernames include the child’s middle names, so some of our kids are taking a long time to log in. One pupil has 5 middle names, time was up and he was still trying to log in.' 
Whilst another articulated a question many were asking,
‘How much is this costing? I have no jotters or whiteboard-pens, general basics to do my job …Ah, priorities. Hang them out to dry!’

It is clear that many local authorities are asking/telling their schools to administer the tests towards the end of the school year, i.e. May/June, which is a very busy time in schools anyway and does not allow teachers to use them in a properly diagnostic way, but some have taken a different approach.

‘In our small cluster, we have analysed the SNSAs our P7 pupils sat in October. Teachers used the results diagnostically to aid planning, but we have looked at what the trends for cluster mean for secondary. Many of the results haven’t changed judgements about achievement of a level but some clear trends have emerged, which we will address for next session.’

However, this has allowed some to question the validity of the 'standardisation' claimed for the tests by the government and its supplier. What is clear is there are a range of approaches and experiences happening across Scotland, some of which bring into question the validity of outcomes produced by the testing software.

A DHT wrote,

‘Looked at P1 results with CT. Children are ranked Low, Medium or High. All exactly where CT put them at beginning of the week. A week of quality teaching time lost and stressed pupils and teachers … not to mention the cost of it all!’
which really does bring into question the added value to teachers' professional judgement from these assessments. If they are not telling teachers or schools anything they do not already know about learners, what then is their purpose, and at what cost? This was reflected in the latest comment I have received from a teacher.

‘Have just attended the phase B SNSA training. All about the data. We were told that the Scot Gvmt will not have access to the data. It belongs to the school and their LA. We were told again it is NOT high stakes, but there to inform the teachers. However she then kept telling us that HMIe will ask SMT what are they doing about areas flagged up as low. Kept referring to how it will show how PEF interventions are closing the gap and raising attainment. We pointed out that SNSA is done at P1, 4, 7 and S3 only. Unless you have data before and after a PEF intervention how can you possibly say what the impact is from SNSAs? The reports/graphs were so busy I defy anyone to have the time to fully interrogate them for each pupil as we were being shown. It also does not produce block graphs for year groups less than ten pupils, which means that many small schools cannot get them. We also said we do not see how they can be standardised assessments if LAs can do them at whatever time of year they choose.’

On the last point, it would also seem that schools are administering the tests in a myriad of ways, and with varying levels of support for learners. All this brings into question the validity of the ‘results’ across schools, local authorities and further afield. Observing from the outside now, it seems to me that the Government rhetoric about the tests ‘not being high stakes’ is being ignored by local authorities, who are making them, alongside the benchmarks, very much high-stakes measures by which schools are judged. This is exactly the scenario that played out in Australia with NAPLAN, in England with SATs, and in other countries that have gone down similar routes. In all these countries, the early talk was of the tests supporting ‘teacher professional judgement’, but they soon mutated into high-stakes accountability measures. Scotland is heading the same way.

Some of the tweets I received from teachers included the following selection:
‘Accountability. Pure and simple. In no way will this benefit our learners.’
‘If we can’t clearly decide the nature of the question it shouldn’t be used – a reading passage should have all the answers. Anyway the whole set-up is simply ScotGovt data trawling not promoting best practice.’
‘The maths question about how many Tuesdays in a particular calendar month made my heart sink. Far too difficult and not reflective of Early level,’
‘This is for P1!! Its not reflective of early level literacy curriculum. The hummingbird passage is beyond the expected usual level by the end of P1. That question in particular totally relies on children’s own prior knowledge of birds, there were no contextual clues.’
‘AND it was in the norming study completed in march when I know that HTs specifically said that that particular passage was not appropriate for P1, when asked for feedback re the assessment.’
‘And all so a gorgeous and very bright P1 could say, ‘I am not good, am I’ after trying really hard to work out the words in the ‘hummingbird’ passage. Well done the system – a curious and excited learner demoralised!’
‘I have just had a flash back to the Counting Rhymes in an African Village paper from 5-14 test bank. Is the purpose of spending all this money to help teachers know how chn are progressing? That will be a great help because how would teachers ever know otherwise??? ‘
‘Can parents ask for their child not to do this?’
As things stand, I have hundreds of responses to this request for information about the tests, and whilst I recognise this is anything but a scientific examination of SNSAs, I do think there is already enough for the profession and parents to be concerned about. Regarding that last question in a tweet: the tests are not compulsory or mandatory, and the Government’s own advice recognises this. However, some schools and local authorities are presenting them as ‘mandatory’ to parents. I would argue that, even were they designated ‘mandatory’, parents would still have the right to withdraw their children. After all, they are their children, and if they think the impacts of such testing are harmful to their wellbeing, then they should withdraw them.

Just like the tests themselves, my request for thoughts around them provides us with a snapshot in time, and quite early in the timeframe of their introduction. However, I think there are indications of significant issues that need to be addressed by the Scottish Government, local authorities and schools. I have summarised these as follows:
  • The assessments aren’t really assessing literacy and numeracy, just some of the skills required to be literate and numerate
  • The tests are not assessing the taught curriculum in Scotland, especially at Early Level
  • They don’t reflect the principles and practice of CfE
  • Technical problems within the tests themselves
  • Workload for teachers and schools, and time being swallowed up in their administration
  • Lack of, or poor, hardware and infrastructure in schools to administer the tests
  • Lack of ‘standardisation’ in how they are being applied, used and supported – a very mixed approach across the country
  • Stresses for children, especially P1s, and staff
  • When and how tests are being delivered is being heavily dictated by LAs
  • Are the tests actually telling teachers anything they don’t already know, and at what cost?
  • Headteachers telling parents the tests are mandatory, or not even informing parents they are taking place
  • The validity of the tests, how they will be interpreted, and how they will be used by schools, LAs and Government
  • Does categorising learners as ‘Low’, ‘Medium’ and ‘High’ promote setting, labelling and further disadvantage?
I think there are big questions for everyone in the Scottish system to ask and seek answers to. The cost of the introduction of the SNSAs is huge, running into millions of pounds, much of which is ‘hidden’ and being absorbed by schools and local authorities. The big question is: is it worth it? The EIS said it would oppose the carrying out of tests if they began to skew the curriculum and put undue extra pressure on their members. I would suggest both of those are already beginning to happen. Teachers and school leaders need to be asking, as suggested by Mr Swinney himself, whether they have more freedom to focus on learning and teaching with the introduction of the tests. In 2017 he said, 'When Scotland set out to reform our school curriculum, a critical question was how we break free of the top-down diktats that dominated Scottish school education.' He gave teachers and schools 'permission' to challenge anything that took them away from the core business of learning and teaching. Perhaps it is now time to make some of those challenges!
If you don't think it is worth it, just read this tweet again,
‘And all so a gorgeous and very bright P1 could say, ‘I am not good, am I’ after trying really hard to work out the words in the ‘hummingbird’ passage. Well done the system – a curious and excited learner demoralised!’
Is that really what we want for our very youngest learners? I hope not! Perhaps we are all being tested?

GDPR – a change to data protection law⤴

from @ Engage for Education

The General Data Protection Regulation (GDPR) comes into force on May 25, 2018, giving individuals more control over their personal data and ensuring that organisations collect and process that information properly and securely. To find out more about GDPR, visit the Information Commissioner’s website.

We collect personal data on this blog when you post a comment or subscribe to email updates.

For comments we collect your name (or username) and email address. If you subscribe to email updates, we only collect your email address. You can unsubscribe from email alerts by clicking the unsubscribe link in the email. This will immediately delete your personal details.

To find out more about how we handle your personal data, please read our updated privacy policy.

The post GDPR – a change to data protection law appeared first on Engage for Education.

Progress report for Educational and Occupational Credentials in schema.org⤴

from @ Sharing and learning

[This is cross-posted from the Educational and Occupational Credentials in schema.org W3C community group; if you are interested, please direct your comments there.]

Over the past few months we have been working systematically through the 30-or-so outline use cases for describing Educational and Occupational Credentials in schema.org, suggesting how they can be met with existing terms or, failing that, working on proposals for new terms to add. Here I want to summarize the progress against these use cases, inviting review of our solutions and closure of any outstanding issues.

Use cases enabled

The list below summarizes information from the community group wiki for those use cases that we have addressed, with links to the outline use case description, the wiki page showing how we met the requirements arising from that use case, and proposed new terms on a test instance of schema.org (may be slow to load).

1.1 Identify subtypes of credential

1.2 Name search for credential

1.3 Identify the educational level of a credential

1.4 Desired/required competencies

1.6 Name search for credentialing organization

1.8 Labor market value

1.11 Recognize current competencies

1.13 Language of Credential

2.1 Coverage

2.2 Quality assurance

2.5 Renewal/maintenance requirements

2.6 Cost

3.1 Find related courses, assessments or learning materials

3.3 Relate credentials to competencies

3.4 Find credentialing organization

4.2 Compare credentials

  • Credentials can be compared in terms of any of the factors above, notably cost, compentencies required, recognition and validity.

4.3 Build directories

1.5 Industry and occupation analysis

1.7 Career and education goal

1.10 Job vacancy

3.2 Job seeking

Use cases that have been ‘parked’

The following use cases have not been addressed; either they were identified as low priority or there was insufficient consensus as to how to enable them:

1.9 Assessment (see issue 5: no way to represent assessments in schema.org)

1.12 Transfer value: recognizing current credentials (a complex issue, relating to “stackable” credentials, recognition, and learning pathways)

2.3 Onward transfer value (as previous)

2.4 Eligibility requirements (discussed, but no consensus)

3.5 Find a service to verify a credential (not discussed, low priority)

4.1 Awarding a Credential to a Person (not discussed, solution may be related to personal self-promotion)

4.4 Personal Self-promotion (pending discussion)

4.5 Replace and retire credentials (not discussed, low priority)

Summary of issues

As well as the unaddressed use cases above, there are some caveats about the way other use cases have been addressed. I have tried to be inclusive / exhaustive in what I have called out as an issue – I hope many of them can be acknowledged and left for future contributions to schema.org; we just need to clarify that they have been.

  • Issue 1: whether EducationalOccupationalCredential is a subtype of CreativeWork or Intangible.
  • Issue 2: competenceRequired only addresses the simplest case of individual required competencies.
  • Issue 3: whether accreditation is a form of recognition.
  • Issue 4: the actual renewal / maintenance requirements aren’t specified.
  • Issue 5: there is no way to represent Assessments in schema.org.
  • Issue 6: there is no explicit guidance on how to show required learning materials for a Course in schema.org.

There is an issues page on the wiki for tracking progress in disposing of these issues.

Summary of proposed changes to schema.org

Many of the use cases were addressed using terms that already exist in schema.org. The changes we currently propose are:

Addition of a new type EducationalOccupationalCredential

Addition of four properties with domain EducationalOccupationalCredential:

Addition of EducationalOccupationalCredential to the domain of two existing properties (with changes to their definition to reflect this):

Addition of EducationalOccupationalCredential to the range of three existing properties:
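To make the proposal concrete, here is a minimal JSON-LD sketch of a credential description using the proposed EducationalOccupationalCredential type, built and serialized in Python. The specific property names and values used here (credentialCategory, educationalLevel, competencyRequired, recognizedBy, and the example organisation) are my own illustrative assumptions, not the definitive list of terms proposed by the group:

```python
import json

# Illustrative JSON-LD for a hypothetical credential. Property names are
# assumptions for the sake of example, not the group's agreed vocabulary.
credential = {
    "@context": "http://schema.org/",
    "@type": "EducationalOccupationalCredential",
    "name": "Example Certificate in Widget Assembly",
    "credentialCategory": "certificate",
    "educationalLevel": "beginner",
    "competencyRequired": "Assembling widgets safely",
    "recognizedBy": {
        "@type": "Organization",
        "name": "Example Accreditation Body",
    },
}

# Serialize as the JSON-LD that would be embedded in a web page.
print(json.dumps(credential, indent=2))
```

Markup along these lines, embedded in a credentialing organisation’s pages, is what would let search and comparison services address use cases such as 1.2 (name search) and 4.2 (compare credentials).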

The post Progress report for Educational and Occupational Credentials in schema.org appeared first on Sharing and learning.

Using digital portfolios to share learning experiences and skills progression⤴

from @ Education Scotland's Learning Blog

Tarbolton Primary School in South Ayrshire uses ‘Seesaw’, a student-driven digital portfolio, to let learners instantaneously share and record learning experiences and achievements from within and outwith school.

Children and young people use mobile devices to evidence their learning, upload personal targets and reflect on their progress. This is then continuously shared with their teachers and parents in order to review their learning.

Seesaw is also used for uploading homework, sharing letters or information as well as daily communications.

Lynsey Bradford, PT at Tarbolton Primary School says:

“Seesaw has radically changed how we log our pupils’ learning journeys. It is instant and accessible from all devices and ticks the digital platform buttons for all children. They want to see and share their learning and achievements now, and this app allows them to do that in a safe and secure way.”

Read more about how the school has implemented the tool and how it plans to extend its use into early learning and childcare: Interesting Practice in Skills DYW – Profiling_Seesaw tool