Tag Archives: CPD

Introduction to Text Analysis⤴


I booked onto the CDCS Introduction to Text Analysis course and installed NLTK in preparation. This two-hour “silent disco” online event is an intermediate Python course on using the NLTK package.

Introduction to Text Analysis

The workshop uses Noteable but I ran the notebook in Visual Studio (because it’s a good IDE). The notebook is written in Markdown text with code blocks interspersed in “cells” throughout the document.

The workshop begins with some exercises in the IDE to get used to running Python code within the cells to do something with text strings, using variables, e.g.

first_name = "ada"
last_name = "lovelace"

full_name = first_name + ' ' + last_name # combine strings with a space in between

print(full_name)

File handling

Basic file actions are rehearsed including opening and reading a file, and using text manipulation functions:

file = open("origin-intro.txt")

txt = file.read()
txt = txt.lower()

file.close() # close the file once we're done with it
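A tidier idiom for the open/read steps uses a context manager, which closes the file automatically even if an error occurs while reading. A minimal sketch, using a stand-in file since origin-intro.txt only exists in the workshop materials:

```python
# Create a small stand-in file so the example is self-contained
sample = "On the Origin of Species BY MEANS OF NATURAL SELECTION"
with open("origin-intro-sample.txt", "w") as f:
    f.write(sample)

# 'with' closes the file for us when the block ends
with open("origin-intro-sample.txt") as f:
    txt = f.read().lower()  # read and lower-case in one step
```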


The Python package NLTK provides a set of natural language algorithms, e.g. tokenizing, part-of-speech tagging, stemming, sentiment analysis, topic segmentation, and named entity recognition. We played with some of these using the text extracts we had been given.


Text analysis begins with breaking down a block of text into smaller chunks such as words or sentences. This is called Tokenization.

import nltk
from nltk.tokenize import sent_tokenize # or word_tokenize for words

nltk.download('punkt') # fetch the tokenizer models (needed once)

f = open("origin-intro.txt") # open file

txt = f.read()          # add file contents to variable

sentences = sent_tokenize(txt) # split the text into sentences



Cleaning text

The initial steps of this process in preparing text for analysis are:

  • Open the file and assign it to a variable
  • Convert to lower case
  • Split into tokens
  • Remove stop words

“Stop” words are sometimes called filler words: they are the common words that don’t add much meaning to a sentence.

import re
import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

nltk.download('stopwords') # fetch the stop word lists (needed once)

f = open("origin-intro.txt") # open file
contents = f.read()          # add file contents to variable
contents = contents.lower()  # lower case text
contents = re.sub(r'[^\w\s]','',contents)  # remove punctuation (note use of regex)

tokenized_word = word_tokenize(contents)     # split into word tokens
stop_words = set(stopwords.words("english")) # common English stop words

filtered_word = []
for w in tokenized_word:
    if w not in stop_words:
        filtered_word.append(w)
print("Filtered words:", filtered_word)

Further functions and libraries

Working through the notebook, running code and downloading libraries as required, I was able to easily clean blocks of text; tokenize and count words; plot frequency distributions; tag parts of speech; identify common bigrams and n-grams (word pairs and n-word groups).
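The word counting and bigram counting can be sketched with just the standard library — a minimal illustration of the idea, not the NLTK functions (FreqDist, ngrams) the workshop actually used:

```python
from collections import Counter

tokens = "make america great again make america great again".split()

# Word frequencies: what NLTK's FreqDist computes
freq = Counter(tokens)

# Bigrams: pair each token with its successor; n-grams generalise this
bigrams = list(zip(tokens, tokens[1:]))
bigram_freq = Counter(bigrams)

print(freq.most_common(2))
print(bigram_freq.most_common(1))
```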

Fun with Trump

A delightful bit of fun was had analysing President Trump’s Tweets using this code to fetch the text:

import urllib.request

url = 'https://learn.edina.ac.uk/intro-ta/files/trump-tweet-archive.csv' # the file to download
data = urllib.request.urlopen(url).read() # assign the contents of the file to a variable
with open('tweets.csv', 'wb') as file: # create a new file and save the contents to it
    file.write(data)
print('CSV file created')

What do you think was the top 4-gram in all of this data?

make america great again! 390

Of course it was.


This was a very nice way to run through a tutorial notebook and learn a few tricks in python for text analysis, with support running in the background if I needed it. The format is powerful; it focuses you on the task within the time window allocated and helps you avoid distractions. Although it’s self-study (and as such relies on good quality materials in the first place), it’s OK to play a bit too, because there’s help at hand if you come unstuck. Well done, CDCS.

LibSmart I⤴


This introductory course provided by the University of Edinburgh’s library is structured in five modules.

Module 1: Getting Started with the Library

Accessing and using online library resources for your studies. This begins with the task of setting up a PebblePad blog (the reason for which isn’t apparent), which itself begins with a series of “helpful” videos.

I find video an extremely frustrating and patronising format and will always prefer to read a few well-chosen words that cut to the chase. Almost every instructional video I have ever seen seems to have been designed to meet a requirement to have video as part of the instruction tools. I have never seen one that was better than words on a page. I include the videos I have made in that.

How to access online library resources provides basic access information, including how to set up the VPN and how to navigate the menus in the MyEd university landing pages to find library search tools like DiscoverEd. The video for this section is available as a pdf: it’s as if someone’s listening! Understanding and Accessing Your Reading List isn’t at first sight relevant to a PhD – there are no reading lists for PhD study – but it’s possible to look in on other courses or browse the Resource Lists. Start Searching DiscoverEd offers some practice with the university library search interface, as well as again providing stats on how much stuff there is in the library. The module finishes with Making the Most of Library Resources, which shows how to find help.

It’s been helpful to return to basics in this module and check that I’m not missing anything fundamentally useful in accessing the library. I have found the library interfaces generally quite self-explanatory and intuitive to use, especially if you take the time to read the interface carefully.


  • Study Skills Guides - a resource list for all students, covering general study skills, academic writing, referencing, critical thinking, reflective writing, literature reviews and so on.

Module 2: Your Information Landscape

Explore and use key library resources appropriate to your discipline. Useful areas here include the subject guide for Education and ITE and a video on literature search.

Modules 3 – 5

These address finding and retrieval, managing information, and referencing. I went through them quickly to see what’s new or useful. From these…

Identifying source categories

Information sources can be broadly categorised into three types:

  • Primary or original sources
  • Secondary sources provide interpretation, commentary or analysis
  • Tertiary or reference sources are dictionaries, encyclopaedias or indexes of primary and secondary information

Evaluating information

A very useful stick-on-the-wall guide to selecting literature was published by the Meriam Library at California State University (Evaluating Information – Applying the CRAAP Test, 2015).


  1. Meriam Library - California State University, Chico. (2015). Evaluating Information – Applying the CRAAP Test. https://www.csuchico.edu/lins/handouts/eval_websites.pdf

Four Photos, Four Edits⤴


Capture One is my preferred choice of image editor, for its support programme as much as its functionality. This morning, I attended one of their many excellent editing seminars, to catch up with the current software and make sure I’m not missing any tricks, along with delegates from around the world.

The seminar was presented by David Grover, Phase One’s Global Manager of Training. He took four images, one at a time, and talked through his workflow.

David starts with composition and framing, cropping and adjusting keystone if required to fine-tune the lines within the image. He moves on to adjust exposure or brightness, depending on existing brighter areas within the image. Brightness works only on mid-tones, whereas exposure is a global adjustment. Moving on to the levels histogram reveals any flatness in the image, which can be adjusted by pulling in the range of tones. Again, shadows and blacks work on different ranges within the image, blacks being narrower than the shadows adjustment. These can be selectively lifted to improve contrast.

With one image, which had a brighter side than the other, David applied a graduated mask, adjusting the fall-off to limit the extent, and reducing exposure within the masked area. Further masks were used to create vignetting. Each mask is applied in its own separate, named layer. Named, so you can remember what each layer does!

David Grover demonstrates the use of layers

Style brushes, a new feature in Capture One Pro 21, were also demonstrated: I have found these to be a very quick and powerful way to make improvements in an image, especially in lifting localised highlights for impact. This is one of the tricks used by landscape photographers to make those stunning shots you wish you could take. A landscape image was edited following a similar process to the first, urban, image, with the addition of some colour toning. David brushed in a clarity (mid-range contrast) adjustment in the sea using a filled layer to first adjust the effect, then clearing the mask before brushing it back into the area of interest. Colour balance was adjusted on a new layer, a technique that allows comparison and stops you going too far with the adjustment. The settings for this adjustment were saved as a custom style brush, making it available for use on any other image.

The use of heal brushes to remove unwanted clutter on a beach was demonstrated and in response to Q&A, David led a nice discussion on the difference between luma (stable colour) and RGB (impacts colour) curves and their combination in the contrast slider in Capture One.

Noise reduction was demonstrated in an image of a flower, along with sharpening of the flower independently of the background using the colour editor (and preview) to make a new mask – “a bit of a hidden feature”, according to David. This is an extremely powerful tool and something I did not know about before. The refinement of the mask is also very smart.

The final image in the seminar was another landscape. A similar workflow was followed as for the other images, with more focus on the colours in the image, using style brushes to warm up the foreground and lift shadows in a more distant rock formation. Contrast was improved using the custom style brush created in the previous image edit. A final saturation lift completed the final edit.

Further tutorial resources are available (and worth following up) in the Capture One YouTube channel.


“Red Campion” image © Nick Hood 2021.
Original “India” image in the screenshot © Emily Teague.

Reading iCal in R⤴


Being organised is an important habit for anyone in these days of scope creep – the tendency for more and more to be done as part of the job. We’re all trying to maximise our capacity, so eliminating duplication of effort is one way to avoid wasting time doing unnecessary admin. Productivity tools like email and calendars have replaced the memo and diary of pre-Internet days, but there are many brands and infrastructures, often competing with each other. The result can be that we end up keeping several email accounts, and several calendars with the inevitable double booking and confusion.

My policy is wherever possible to keep one master data source: documents are configuration managed and stored safely, checked out and checked in when updated and with a visible, reversible change history. Calendars for each project are aggregated in a suitable viewer from master files in iCal format, allowing them to be easily shared and syndicated.

Planning for next academic year, I wanted to display a simple GANTT chart for students of the overall course structure. This, because I had previously been duplicating weekly details from the master calendar. I wanted a way to automatically generate mini-GANTTs for each week in the Virtual Learning Environment (VLE) from the course calendar. Here’s how I did it: the VLE is written and published in Bookdown.

Fortunately, there is already a package for reading and manipulating iCal files. You may need to install this first.

> install.packages("calendar")

So, firstly we want to grab data from the iCal feed. This should be the path or URL of an .ics file – the iCal data for the calendar itself: make sure it’s not just a link to a web interface for the calendar.

> mydat <- readLines("https://www.gov.uk/bank-holidays/england-and-wales.ics")

It’s worth checking that this has returned something useful: the head() function returns the first few lines and an iCal file should look something like this:

> head(mydat)
[1] "BEGIN:VCALENDAR"                     
[2] "VERSION:2.0"                         
[3] "METHOD:PUBLISH"                      
[4] "PRODID:-//uk.gov/GOVUK calendars//EN"
[5] "CALSCALE:GREGORIAN"                  
[6] "BEGIN:VEVENT"       

We can use ic_dataframe() to organise this flat file into something more structured, again peeking in at the first few column header names in the data frame:

> mydf <- ic_dataframe(mydat)
> head(names(mydf))
[4] "UID"                "SEQUENCE"           "DTSTAMP"                          
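ic_dataframe() is doing the work of turning the flat KEY:VALUE lines into one record per VEVENT. For comparison, the same step can be sketched in plain Python — a minimal illustration of the idea, not the calendar package’s implementation:

```python
def events_from_ical(lines):
    """Collect the KEY:VALUE pairs between BEGIN:VEVENT and END:VEVENT."""
    events, current = [], None
    for line in lines:
        if line == "BEGIN:VEVENT":
            current = {}                 # start a new event record
        elif line == "END:VEVENT":
            events.append(current)       # event complete
            current = None
        elif current is not None and ":" in line:
            key, value = line.split(":", 1)
            current[key] = value
    return events

sample = [
    "BEGIN:VCALENDAR",
    "VERSION:2.0",
    "BEGIN:VEVENT",
    "DTSTART;VALUE=DATE:20220603",
    "SUMMARY:Platinum Jubilee bank holiday",
    "END:VEVENT",
    "END:VCALENDAR",
]
print(events_from_ical(sample))
```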

Selecting the information you need from that is a matter of applying filters. A new data frame using just the date and summary columns:

> set1 <- data.frame(mydf["DTSTART;VALUE=DATE"],mydf["SUMMARY"])
> head(set1)
1         2016-01-01         New Years Day
2         2016-03-25            Good Friday
3         2016-03-28          Easter Monday
4         2016-05-02 Early May bank holiday
5         2016-05-30    Spring bank holiday
6         2016-08-29    Summer bank holiday

Selecting only Jubilee holidays using the logical form of grep:

> set2 <- subset(set1,grepl("Jubilee",set1$SUMMARY))
> head(set2)
   DTSTART.VALUE.DATE                       SUMMARY
54         2022-06-03 Platinum Jubilee bank holiday

Adding new headings using setnames from the data.table library, then removing row numbers and displaying as a table.

setnames(set1, c("Date","Event name"))
set2 <- subset(set1, grepl("2022", set1$Date))
knitr::kable(set2[order(set2$Date),], caption = '2022 Holiday Calendar', row.names = FALSE)

Which will yield a table in your book(down):

Table 1: 2022 Holiday Calendar

Date        Event name
2022-01-03  New Year’s Day
2022-04-15  Good Friday
2022-04-18  Easter Monday
2022-05-02  Early May bank holiday
2022-06-02  Spring bank holiday
2022-06-03  Platinum Jubilee bank holiday
2022-08-29  Summer bank holiday
2022-12-26  Boxing Day
2022-12-27  Christmas Day


I now have a way of automatically updating the calendar in the VLE for my course, only having to rebuild the site after a change in the master calendar. This is hugely useful within my workflow and reduces the risk of redundancy or error when there is more than one master. Next steps are to make this produce a GANTT chart.

Mental Health Training⤴


As a personal tutor with pastoral as well as academic responsibility for a group of students, I took advantage of an online CPD session on mental health run by colleagues from the University disability office and counselling service. This was principally to refresh my skills and understanding but, as always with these things, there were new things we were challenged to think about.

The mental health continuum

One of these was the mental health continuum, a helpful way of thinking about how people may sit anywhere on a range of mental health from good to bad, yet also be somewhere between ill and not ill. It is possible for people to be ill and yet seem to be coping well, perhaps by using strategies or medication to help them manage and function. Equally, people who are not ill may be suffering with stress or the consequences of life events. I find this little device (my quick sketch above) great for normalising the range of health people may have, and for keeping the idea to the fore when talking to tutees in future.

We were reminded of the range of support services available in the University for students. I have experience of these from several years of supporting students through the very intensive experience that PGDE can be, some of whom have brought with them their own stories and experiences and have needed continuing support. This has not always been optimal in my experience where specialist services are required, but I do know that, in the main, my colleagues in places like the Chaplaincy and the disability service have done, and continue to do, superb work on behalf of the students. Where the resources of the University are limited, we were advised that things like private counselling may be available on the students’ parents’ health insurance, for example.

Where the limits of support are reached, it can be easy to characterise the support offered as being part of “the illusion of inclusion” (cf. “ethics washing”, see this post for details). This may be cynical: the University is rich but it wouldn’t last long if every student drew too heavily on its resources. I am put in mind of the old caveat1:

“… in serving a friend or Brother in time of need, without detriment to ourselves or connections.”

As always, we do our best, whether those we serve realise it or not. As part of that, our skills need to be at least good enough not to make things worse: so, this course included reminders about using non-verbal skills of engagement and listening; of not reacting; of respecting boundaries. The advice included lessons from the past year: for example, when talking to students online, look at the camera, not at their eyes on your screen. I came to this awareness early last year because I often use two screens, and situate the conference window on the display that has the camera. It is easy to let the tech fool you into a misalignment with the other person’s reality.

The session overran, unfortunately, which meant that a number of us had to bail before it ended, but there was time to engage with a couple of (fictionalised) case studies, which was helpful. It was valuable, nonetheless, and will hopefully help me support my continuing students and the new cohort arriving in August.


  1. From the Working Tools of the Entered Apprentice, Emulation Lodge of Improvement. (2015). Emulation Ritual (13th ed.). Lewis Masonic. 

How to do a literature review⤴


Getting an effective literature review done is important for several reasons, not least of which is that it will immerse you in the topic or field you are looking at and give you a decent grip on what is known about it, and what isn’t. This is enough justification for doing a literature review – to become an expert on a subject – but there are other motives, such as being able to convince others that you know what you’re talking about; that your proposed research or study is worth investing in; that you are serious and committed to a proposed project; that there are new frontiers yet to explore.


For social sciences, literature reviews are often associated with new research to show how and where it fits in with what has been done before: they locate the research within a field of study, providing context, and identifying areas that need to be strengthened or filled.

How they work

To get started with your own literature review, find four to six literature reviews and study how they work. Deconstruct them: a literature review should identify a research question that provides the central topic of interest; it should provide the “hook” that explains why the research is relevant, interesting, timely or important; it should cite important people who are writing in the field; and, as the author of the review establishes authority, they note the gaps in the literature that the proposed new research addresses. When writing about what the literature says, it’s not necessary to write everything that is known – a good literature review will keep tightly focused on the research question (and within the word count).

Getting started

A sketch or visual representation can help organise thinking and develop understanding when writing a literature review. Steps to this are:

  • find the literature (e.g. via Google Scholar), noting how well cited the articles are
  • screen (scan) about 20 articles to check they are on topic
  • within those articles, look for themes – of agreement or disagreement, cultures, or contexts

Take a large piece of paper and draw what you have found – a hierarchy, spider diagram, or anything visual that helps you make connections, connect authors to themes – show gaps in a different shape. Now look for what’s missing in the chart – step back to do this, and use your common sense to see the overall shape, structure and connections in the literature as well as the gaps.

The written literature review is a description of the chart that leads into your research proposal. Beyond this initial review, going deeper will require finding and reading more papers, books and articles.

A good literature search is systematic and starts with background reading: from the question, title or broad theme, read around to get a good grasp of the theories and concepts in the topic. Use textbooks and encyclopaedias to get under way. From this, work up feasible draft titles and identify the search terms and their synonyms. For each key term, list alternative and related terms to help your search.

Now you are ready to try your resources for finding literature: many universities have unified search tools such as DiscoverEd but you will need to find more specific sources relevant to your field. Your library will have a databases A-Z or listed by topic which will help you make a list of resources to search for literature.

You will need to develop a good technique for asking the search engine in a way that yields helpful results. Logical combination of terms (like AND, OR) and using wild cards are required, and will be determined by the search tool you are using. Read the help pages and learn how to work the tool properly before spending hours with it.

Finally, as you find papers and articles to read, try to organise them logically: identify and prioritise the important papers (e.g. those that everyone else cites); group them by sub-topic or theme; push peripheral papers aside until you need to draw them into your research or thesis. A good reference manager like Mendeley is invaluable for keeping track of and organising what you find. It then makes the task of citing and referencing extremely straightforward.

Reading and writing

Keep notes of the key points of the papers you are reading, and keep them organised. Try in your notes to capture “thesisable prose” that you can use easily in your essay or thesis. Keep in mind the structure of your review: is it time-based, or thematic? Organise your notes in the same way, making connections. Remember your drawing if it helps you sustain a vision of the structure and relationships.

When writing, try to avoid being descriptive: your authorial voice ought to be heard in the discussion as you build it, with a clear view of the evidence that has given you your stance. Also, prune out anything irrelevant or superfluous to your research question.


Keep your research question clearly in view when approaching and completing the literature review. Make a visual representation of relevant themes and their structure. Use that to structure the review and develop it as you read. Prune, check for your stance, and the evidence for it.


As well as drawing on my own experience in initial teacher education, and information from the IAD at the University of Edinburgh, I have also used the following in writing this post.

White, C. (2018). How to Conduct an Effective Literature Review. In SAGE Research Methods Video: Practical Research and Academic Skills. https://doi.org/10.4135/9781526442734

Faculty Librarians. (2012, March). Doing a literature search: a step by step guide (pp. 1–21).

The Problem of Educational Theory⤴


This was an online seminar offered by my own Institute, presented by Stefan Siegel, a doctoral student from Augsburg. The seminar was introduced by Professor Gert Biesta from ETL and attended by delegates from Edinburgh and overseas.

Stefan began with a quick survey of delegates to gauge how we feel about educational theory – is it a distinct discipline in its own right, or is it interdisciplinary? – and challenged us to name an educational theory. I feel that educational theory is clearly distinct from others, and not a sub-field of (say) psychology. He was able to share the analysis of the audience responses immediately. He went on to discuss his research and thesis before setting out the agenda.

Theorising education

Development of educational theory has proceeded differently in the Anglo-American context than in the “Continental” one. In the former, it is viewed as interdisciplinary, whereas in the latter it is more a field in its own right, referred to as Pädagogik, established in the 1920s in Germany. With its roots in philosophy, it evolved to make use of more quantitative methods. The problems discussed focused initially on the certainty of definition in German educational theory. Stefan’s narrative went on to examine terms, including theory and education, and discussed the challenge of defining these things. He made frequent reference to Biesta (2013) – other works too, of course, but this one popped up quite a lot.

The talk progressed towards a definition of educational theory and considered the strengths and shortcomings of defining educational theories according to how narrow these definitions are.


A rich conversation developed in considering the questions that help to define a field, theory or discipline. The consideration of the terms used on the Continent was helpful for me, including terms that combine more than one concept, such as teaching and learning – in German, Lehren und Lernen. Perhaps it is just a semantic point and there’s no difference other than hearing one term when two are said.


The problem with language is that it doesn’t belong to anyone, so it can be used and abused by everyone with impunity. Within education, in my experience, this results in the hijacking of common terms for new purposes: the first time I came across this was in my probationary year, when I had to endure a talk on enterprise education which, I argued at the time, had absolutely nothing to do with enterprise as I understood it from over 20 years in commerce and industry. There are plenty of other examples: inclusion, for instance, and ultimately even education, teaching and learning. See Biesta (2005) for a discussion on the latter.

What is clearer to me now is the understanding that a theory is inextricably linked to the questions it attempts to answer. The terms of the question are where the focus and clarity are required in their definition in order to make sense of what the theory means.


Biesta, G. (2013) ‘Giving Teaching Back to Education: Responding to the Disappearance of the Teacher’, Phenomenology & Practice, vol. 6, no. 2, pp. 35–49.

Biesta, G. (2005) ‘Against learning. Reclaiming a language for education in an age of learning.’, Nordic Studies in Education = Nordisk Pedagogik, Nordic Educational Research Association (NERA), vol. 25, no. 1, pp. 54–66 [Online]. DOI: 10.1177/00187267030568002.

Data Ethics, AI and Responsible Innovation⤴


In November, I took part in 4 weeks of a 5-week MOOC offered by the University of Edinburgh via the edX platform, Data Ethics, AI and Responsible Innovation. I had various difficulties with the course itself, culminating in a barrier to my continuing.

You can read my notes on the course, including a personal reflection, at the blog I kept on the MOOC.

Climate Change Mitigating Technologies⤴


I had the chance to sit in on the cross-party group on science, in which there were two presentations on the topic, the first from Rebecca Bell, Scottish CCS1 on Carbon Capture and Storage. The second was given by Richard Gow, Drax2 on Bioenergy with Carbon Capture and Storage. The latter presentation called for policy help in rewarding negative carbon emissions, which are an odd omission from the accounting model used in climate change impact measurement.

Both provided a really useful overview and understanding of what carbon emission and capture is about and how it is working, with an emphasis on what is happening in Scotland within a very clear European context. I found the presentations, neither slick nor sales-focused, extremely engaging and helpful in thinking about CO₂ emissions.

There was a lively and wide-ranging Q & A session chaired by Craig Denham of the RSE. Questions were both technical and social: young people were well represented, asking, for example, about the skills required to find careers in CCS. My own question:

For teachers, are there any behaviours they can model for young people that will enable them to take a specific personal responsibility for action in tackling CO₂ accrual in the atmosphere?

I suspect this question was outside the scope of the presentations (focusing as it does on individual action) but it was picked up by Craig, for which I am thankful. Richard responded first, acknowledging the criticism of BECCS for being remote from personal action, but pushed back by linking to personal choices such as taking fewer flights. Rebecca added to that by pointing to transport choices like taking your bike, or wearing a jumper instead of turning up the heating, which are easily modelled and reinforced by educators. She also pointed to SCCS resources related to CfE, and the LfS Scotland resources. I particularly liked the GeoBus Education Resources site, which is designed to give teachers an introduction to CCS, providing experiments, activities, lessons and homework ideas as well as links to a number of other useful CCS education resources linked to the English Key Stage 3 and Scotland’s CfE: this pdf links the resource to the Experiences and Outcomes.

The resources available on the websites of both organisations are very accessible and immediately useful in schools, for example in projects within the interdisciplinary topic of sustainable energy production. It is particularly warming to see the interest in, and promotion of, positive problem solving through the cross-party group. I am thankful to them for opening up this session to interested parties and applaud the work being done by SCCS and Drax.


The header image is part of an infographic available at SCCS.

  1. Scottish CCS is “a partnership of the British Geological Survey, Heriot-Watt University, the University of Aberdeen, the University of Edinburgh and the University of Strathclyde working together with universities across Scotland.” 

  2. This is the group that operates Drax Power Station which is moving from coal-fired to biomass and leads on innovation and development in the technologies of Bioenergy with Carbon Capture and Storage (BECCS). 

IOP CLPL: Smartphone practicals⤴

from @cullaloe | Tech, tales and imagery

The IoP in Scotland is putting on a rich catalogue of online CLPL for the community of physics teachers, presented by colleagues from that same community. I accessed this one after the event, because it had been recorded and published by Drew Burrett on YouTube.

This session was hosted by Stuart Farmer and Jenny Hargreaves, and presented by Murat Gullen and Martyn Crawshaw, who gave us a practical introduction to using tools for data capture and analysis using the suite of sensors on most modern phones. They presented a brief rationale, acknowledging that although not all children have a smart phone, most have access to one. The scope (pun intended) for teachers to offer practical sessions using their own equipment was also underlined.

The first tool discussed was PhyPhox, which can not only access all of the phone’s sensors but also has a number of built-in activities and tools to make use of the data captured. These can be exported in several formats, or posted to Dropbox for later discussion. What I hadn’t realised is that there is a built-in web server, accessed through the “triple dots” at the top right of the phone screen, which enables display and control of the app from any nearby browser, provided that the computer is on the same network (it uses a local 192.168.x.x IP address).

Screenshots of Phyphox on iPhone and Browser (uses the language of the phone)

Martyn started by talking about Vernier video tracking and analysis software, similar to the Java-based Tracker program. He went on to demonstrate Pasco’s SparkVue app – the “twenty-first century version”, as Martyn called it, of the Data Studio software found in many schools. It allows connection to onboard sensors and also to Pasco equipment in the lab.

It was made clear that teachers should take care not to assume that their pupils are tech-savvy enough to know how to use all of the tools and interfaces without support and guidance.

“Don’t lose the learning!” – Martyn Crawshaw

Excellent stuff, as ever, with a lively Q & A at the end. I’m going to share this with my students.