Tag Archives: Feedback

Feedback and more with Forms⤴

from @ ICT for Teaching & Learning in Falkirk Primary Schools

Gathering feedback, running quizzes to reinforce learning, surveying views, signing up or registering for an activity – these are just some of the ways schools can use forms. And now there is the option to use Microsoft Forms – a free online tool which uses a Microsoft Office 365 account (available to all Glow users). Set up a form either by going to https://forms.office.com or, if already logged into Office 365, via the Forms tile in the Office 365 navigation tiles (the "waffle"). Office Forms can be created by either learners or educators.

Forms work nicely on smartphones, tablets and PCs. Creating a form requires the creator to be logged in to Office 365, but the creator chooses how the form can be completed: by anyone, without any kind of log-in; anonymously; or, to restrict responses to a class and confirm respondents' identities, only by users who sign in with their Office 365 details. So the form creator gets the choice to suit the purpose and audience of their form.

Feedback reaches the form creator immediately, in real time, and the results can be displayed in different ways to suit the creator's needs.

Sway users can embed a form created with Office Forms live in a Sway presentation, so information about a topic being studied can be shared with a quiz included alongside the content.

Creating your form

  1. Either go to https://forms.office.com and log in with your Office 365 account (for Scottish schools that will be your Glow account) or, if already logged into Office 365, choose the Forms tile in the Office 365 navigation tiles (the "waffle").
  2. Click on + New to start creating your new form. (You can click on the title of any previously created form to edit it, and if you wish to base a new form on an existing one, click on the … ellipsis to the right of the form title and choose Copy – then edit the copy to create a new version.)
  3. Click on "Untitled form" to edit the name of your form, and click on "Enter a description" to add explanatory text, such as the purpose of the form and perhaps its intended audience. Then click "+ Add question".
  4. Choose the type of question. There are five answer formats:
    • multiple choice questions (where you can choose to accept only one answer or multiple responses)
    • free-text (and you can choose either short or long text)
    • ratings (you can choose number or star rating)
    • quiz questions (where you can provide immediate feedback to anyone filling in the form on whether they gave the correct answer – click on the tick icon to indicate which answer is correct, and click on the speech-bubble icon to add a comment to any response choice, such as encouragement, a suggestion for what to do next, or any other feedback you wish to display when that choice is selected)
    • date-input
  5. You can choose whether to accept multiple responses or only one, and you can require that specific questions be answered before a user can submit the form. By clicking on the … ellipsis you can choose whether a subtitle (which could provide explanatory text for each question) is displayed, and whether to shuffle the questions so that they appear in a random order each time someone opens the form.
  6. Add as many further questions as you wish. You can re-order the questions by clicking on the upward or downward facing arrows above each question, and you can copy an existing question (and edit that copy), or delete an existing question.
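For readers who think in code, the form built up in the steps above can be sketched as a simple data structure. This is purely an illustration – Microsoft Forms does not expose its forms this way, and all the Python names below are hypothetical – but it captures the five answer formats and the per-question settings described in steps 4 and 5:

```python
import random

# A hypothetical sketch of a form, mirroring the five question types:
# choice, free text, rating, quiz (with per-answer feedback) and date.
form = {
    "title": "How did you get on with your learning this week?",
    "description": "Tell your teacher how the week went.",
    "shuffle_questions": True,   # the 'shuffle' setting from step 5
    "questions": [
        {"type": "choice", "text": "Which topics did you enjoy?",
         "options": ["Reading", "Maths", "Science"],
         "multiple": True, "required": True},
        {"type": "text", "text": "What would you like more help with?",
         "long": True, "required": False},
        {"type": "rating", "text": "Rate your week",
         "scale": 5, "style": "star", "required": True},
        {"type": "quiz", "text": "What is 6 x 7?",
         "options": ["42", "36"], "correct": "42",
         # per-choice feedback, as added via the speech-bubble icon
         "feedback": {"42": "Well done!", "36": "Close - check again."},
         "required": True},
        {"type": "date", "text": "When did you finish your project?",
         "required": False},
    ],
}

def questions_for_display(f):
    """Return the questions, shuffled if the creator chose that setting."""
    qs = list(f["questions"])
    if f["shuffle_questions"]:
        random.shuffle(qs)
    return qs

def missing_required(f, answers):
    """List required questions the respondent has not yet answered."""
    return [q["text"] for q in f["questions"]
            if q["required"] and q["text"] not in answers]
```

For example, a respondent who has only answered the first question would still be told that the rating and quiz questions must be completed before the form can be submitted.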

Previewing your form

To see what the form will look like for people about to fill it in, click on "Preview" in the top navigation bar. You can see how the questions will be laid out on a computer, and you can also choose to see how the form will look on a mobile device.

Sharing your form

Once the form is complete, click on "Send form" – this opens a side panel with various choices. It provides a link to share with those you wish to respond to the form, creates a QR code for quick scanning by users on a mobile device, and provides HTML embed code if you wish to embed the form within a website page or blogpost. This screen also lets you choose who will be able to fill out the form: only people within your organisation (for Scottish schools using Glow, that would be Glow users only) – and, within that, whether or not to record the names of respondents in the results – or anyone with the link (in which case no sign-in is required to respond).

If you click on "See all settings" at the foot of this side panel you will get further choices.

Looking at the results of your form


When you wish to look at the responses to a form you have shared, simply open the form and click on the Responses tab along the top of the screen. You will get an overview of the number of respondents, the average time taken to complete the form, and whether the form is still active or has expired (if you'd set it to have a deadline). There is also the option to download the responses to a Microsoft Excel spreadsheet (which comes complete with auto-filter drop-downs to easily sort the information generated to suit your needs).

Example forms

How did you get on with your learning this week? – a mock form showing how a teacher might gather feedback from learners in their class to better support them. It is based on a form created by Fiona Johnson, headteacher at Kilmartin Primary School in Argyll and Bute, but this link is purely an example so anyone can try it. Here is a similar mock form (also based on Fiona Johnson's): "How did you get on with your learning today?" – feel free to give it a try.

So what have people said about Office Forms?

Steven Payne, an educator in Western Australia, shared the results of a mock use of Microsoft Forms – showing the results, and the ways in which they can be displayed, which the creator of the form can see once respondents have completed the survey.

Jim Federico commented in a tweet that Microsoft Forms being built into Office 365 for Education means no add-ins are required, and includes question types which auto-grade.

Kurt Söser, an educator in Austria, has provided a step-by-step guide to his experience setting up a quiz with Microsoft Forms and using it with his learners.

Vicent Gadea, an educator in Spain, described co-assessment using Microsoft Forms: "1st time was complicated then was very powerful for us."


Koen Timmers, an educator in Belgium, has described in a step-by-step guide, illustrated with screenshots, how to set up a form using Office Forms, and shared what the responses look like for a form he created.

Making use of Forms in the classroom

There is a range of online form tools available, each of which can generally be used in similar ways, so it can be helpful to look at how others have used these tools when thinking about how online forms can support classroom activity.

Chad Raid wrote about the use of forms on David Andrade's Educational Technology Guy blog – some of which may be applicable in different educational scenarios. Obviously, in any use of forms the issue of data security is paramount, so guidance from the school or local education authority as to what can, and what must not, be requested via a form is clearly essential.

 

#100wordTandL Irresistible Feedback⤴

from @ Pedagoo.org

Research says a fraction of our feedback to students has impact on learning. Knowing this ought to make us look up from our marking labours and try to work out where we might be wasting time. Using a ‘feedback wall’ is immediate and irresistible. Set a challenging discussion-based activity for groups. While they talk, you […]

The power of the red pen!⤴

from @ Pedagoo.org

As a teacher I value pupil voice and understand the importance of quality feedback which needs to be more of a conversation than a statement. In practice though it can be difficult to achieve this without it becoming unmanageable. One change to my teaching practice this week has really made a difference to the quality […]

QAA Scotland Focus On Assessment and Feedback Workshop⤴

from @ Sharing and learning


Today was spent at a QAA Scotland event which aimed to identify and share good practice in assessment and feedback, and to gather suggestions for feeding into a policy summit for senior institutional managers to be held on 14 May. I've never had much to do with technology for assessment, though I've worked with good specialists in that area, so this was a useful event for catching up with what is going on.

“True Humility” by George du Maurier, originally published in Punch, 9 November 1895. (Via Wikipedia)

The first presentation was from Gill Ferrell on electronic management of assessment. She started by summarising the JISC assessment and feedback programmes of 2011–2014. An initial baseline survey for this programme had identified practice that could at best be described as "excellent in parts" but with causes for concern in other areas. There were wide variations in practice for no clear reason, programmes in which assessment was fragmentary rather than building a coherent picture of a student's capabilities and progress, not much evidence of formative assessment, not much student involvement in deciding how assessment was carried out, assessments that did not reflect how people would work after they graduate, policies that were more about procedures than educational aims, and so on.

Gill identified some of the excellent parts that had served as starting points for the programme – for example the REAP project from CAPLE, formerly at Strathclyde University – and she explained how the programme proceeded from there with ideas such as: projects agreeing on basic principles of what they were trying to do (the challenge was to do this in a way that allowed scope to change and improve practice); projects involving students in setting learning objectives; encouraging discussion around feedback; changing the timing of assessment to avoid over-compartmentalised learning; shifting from summative to formative assessment; and making assessment ipsative, i.e. focusing on comparison with the student's past performance to show what each individual was learning.

A lifecycle model for assessment from Manchester Metropolitan helped locate some of the points where progress can be made.

Assessment lifecycle developed at Manchester Metropolitan University. Source: Open course on Assessment in HE.

Steps 5, "marking and production of feedback", and 8, "reflecting", were those where most help seemed to be needed (Gill has a blog post with more details).

The challenges were all pedagogic rather than technical; there was a clear message from the programme that the electronic management of assessment and feedback was effective and efficient. So, Jisc started scoping work on the Electronic Management of Assessment. A second baseline review in August 2014 showed trends in the use of technology that have also been seen in similar surveys by the Heads of eLearning Forum: e-submission (e.g. use of TurnItIn) is the most embedded use of technology in managing assessment, followed by some use of technology for feedback. Marking and exams were the areas where least was happening. The main pain points were around systems integration: systems were found to be inflexible, many were based on US assumptions about assessment practice and processes, and assessment systems, VLEs and student record systems often just didn't talk to each other. Staff resistance to the use of technology for assessment was also reported to be a problem; students were felt to be much more accepting. There was something of an urban myth that the QAA wouldn't permit certain practices, which enshrined policy and existing procedure so that innovation happened "in the gaps between policy".

The problems Gill identified all sounded quite familiar to me, particularly the fragmentary practice and the lack of systems integration. What surprised me most was the low uptake of computer-marked assessments and computer-set exams. My background is in the mathematical sciences, so I've seen innovative (i.e. going beyond MCQs) computer-marked assessments since about 1995 (see SToMP and CALM). I know it's not appropriate for all subjects, but I was surprised it's not used more where it is appropriate (more on that later). On computer-set exams, it's now nearly 10 years since school pupils first sat online exams, so why is HE so far behind?

We then split into parallel sessions for some short case-study style presentations. I heard from:

Katrin Uhilg and Anna Rolinska from the University of Glasgow spoke about the use of wikis (or other collaborative authoring environments such as Google Docs) for learning-oriented assessment in translation. The tutor sets a text to be translated; students work on this in groups, but can see and provide feedback on each other's work. They need to make informed decisions about how to provide and how to respond to feedback. I wish there had been more time to go into some of the practicalities around this.

Jane Guiller of Glasgow Caledonian had students creating interactive learning resources using Xerte. They provide support for the use of Xerte and for issues such as copyright. The resources were peer-assessed using a rubric. Students really appreciate demonstrating a deep understanding of a topic by creating something different to an essay. The approach also builds and demonstrates students' digital literacy skills. There was a mention at the end that the resources created are released as OERs.

Lucy Golden and Shona Robertson of the University of Dundee spoke about using WordPress blogs in a distance learning course on teaching in FE. Learners were encouraged to keep a reflective blog on their progress; Lucy and Shona described how they encouraged (OK, required) the keeping of this blog through a five-step induction, and how they and the students provided feedback. These are challenges that I can relate to from asking students on one of my own courses to keep a reflective blog.

Jamie McDermott and Lori Stevenson of Glasgow Caledonian University presented on using rubrics in Grademark (in TurnItIn). The suggestion came from their learning technologist John Smith, who clearly deserves a bonus, who pointed out that they had access to a facility that would speed up marking and the provision of feedback and would help clarify the criteria for various grades. After Jamie used Grademark rubrics successfully in one module, they were implemented across a programme. Lori described the thoroughness with which they had been developed: drafting, feedback from other staff, feedback from students, and reflection. A lot of effort, but with the collateral benefits of better coherency across the programme and better understanding by the students of what was required of them.

Each one of these four case studies contained something that I hope to use with my students.

The final plenary was Sally Jordan, who teaches physics at the Open University, talking about computer-marked assessment. Sally demonstrated some of the features of the OU's assessment system, for example the use of a computer algebra system to make sure that mathematically equivalent answers are marked appropriately (e.g. y = (x + 2)/2 and y = x/2 + 1 may both be correct). Also the use of text analysis to mark short textual answers, allowing "it decreases" to be marked as partially right and "it halves" to be marked as fully correct when the model answer is "it decreases by 50%". This isn't simple keyword matching: you have to be able to distinguish between "kinetic energy converts to potential energy" and "potential energy converts to kinetic energy" as right and entirely wrong, even though they contain the same words. These questions are useful for testing a student's conceptual understanding of physics, and can be placed "close to the learning activity" so that they provide feedback at the right time.
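Both of Sally's examples can be illustrated with a small sketch. This is not the OU's actual system – just a standard-library Python approximation – but it shows the two ideas: algebraically equivalent answers can be detected by comparing expressions at random sample points, and the word-order example shows why a simple bag-of-words check cannot grade the short physics answers:

```python
import random

def equivalent(f, g, trials=20, tol=1e-9):
    """Heuristically test whether two single-variable expressions agree,
    by evaluating both at random sample points. A real computer algebra
    system would prove this symbolically instead."""
    for _ in range(trials):
        x = random.uniform(-100, 100)
        if abs(f(x) - g(x)) > tol:
            return False
    return True

def model(x):    return (x + 2) / 2   # the model answer y = (x + 2)/2
def student(x):  return x / 2 + 1     # algebraically identical form
def wrong(x):    return x / 2 - 1     # differs by 2 everywhere

# The equivalent forms are accepted; the wrong one is rejected.
print(equivalent(model, student))   # True
print(equivalent(model, wrong))     # False

# Why keyword matching fails on the short-answer physics example:
# the right and wrong answers contain exactly the same words.
right = "kinetic energy converts to potential energy"
backwards = "potential energy converts to kinetic energy"
print(sorted(right.split()) == sorted(backwards.split()))   # True
```

The last line is the point: a bag of words cannot tell the two sentences apart, so marking short answers well needs analysis that is sensitive to word order, not just word presence.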

Here was the innovative automatic marking I had expected to be commonly used for appropriate subjects. But Sally also said that an analysis of computer-marked assessments in Moodle showed that 75% of the questions were plain old multiple choice questions, and probably as much as 90% were some variety of selected-response question. These lack authenticity (no patient ever says "Doctor, I've got one of the following four things wrong with me…") and can be badly set so as to be guessable without prior knowledge. So why? Well, Sally had made clear that the OU is exceptional: huge numbers of students learning at a distance mean that there are few more cost-effective options for marking and providing feedback, even when a large amount of effort is required. The numbers of students also allowed for the piloting of questions and the use of assessment analytics to sort out the most useful questions and feedback. For the rest of us, Sally suggested we could do two things:
A) run MOOCs, with peer marking, and use machine learning to infer the rules for marking automatically, or
B) talk to each other: share the load of developing questions, and share the questions (making them editable for different contexts).

So, although I haven’t worked much in assessment, I ended up feeling on familiar ground, with an argument being made for one form of Open Education or another.


Lawthorn primary school⤴

from @ Glow Gallery

We visited Lawthorn primary school to find out how they have been using Glow

Listen to teacher, Mr English, sharing what he has been doing and which of the Glow services he has been finding useful.

And now we have some pupils talking about what Glow has meant for them.

Is self-assessment effective? by @Mroberts90Matt⤴

from

This post answers the 36th question from my TeacherToolkit Thinking page of Thunks. Thunk 36: Is self-assessment effective? by @Mroberts90Matt Is self-assessment effective? Lesson completed, objectives identified, learning activity tackled, five minutes to go. As a trainee teacher, for some reason I always seemed to have the desire instilled in me to conduct a self-assessment opportunity. … Continue reading

Using @IPEVO in the classroom by @TeacherToolkit⤴

from

As a result of using Twitter, I have developed some very important online relationships with companies that have an impact in the classroom. This particular blog is about @IPEVO and the difference they have made to teaching and learning in classrooms throughout my school. This is a 'thank you' blog. Context: At Greig City Academy, … Continue reading

How would you like to be observed? by @TeacherToolkit⤴

from

As Ofsted continue to face yet more challenges over the validity of lesson observations, I discuss how best we can develop as teachers and ask the reader, 'how would you like to be observed?' Context: An article I wrote was published in The Guardian on 27th May 2014. Disappointingly, I wrote this some time ago … Continue reading

Peer-to-peer ‘fear or hear’? by @TeacherToolkit⤴

from

I attended my first ever school TeachMeet last year and during and after the event, the experience posed many, many questions to me as a school CPD leader. “critical peer reviews [academic papers] clearly cross the line between a vigorous critique and an unprofessional attack.” Context: My definition of a ‘school TeachMeet’ is this: A … Continue reading