Newspapers the world over have featured a new font developed at RMIT, the Australian university, which its makers claim aids learning. It’s got a great name, Sans Forgetica, but the designers’ claim sounded preposterous enough for me to check it wasn’t published on 1 April.
Designed to be slightly illegible, it slows readers down, and the claimed effect is that they learn more, using a principle known as ‘desirable difficulty’.
According to the RMIT press release: ‘Sans Forgetica has varying degrees of “distinctiveness” built in that subvert many of the design principles normally associated with conventional typography. These degrees of distinctiveness cause readers to dwell longer on each word, giving the brain more time to engage in deeper cognitive processing, to enhance information retention.’
But surely the brain can’t engage in deeper cognitive processing if it’s wasting capacity just trying to decipher the words.
Of course, I may be betraying my ignorance of cognitive theory here. Indeed, according to a report in the Guardian, ‘about 400 university students have been involved in a study that found a small increase in the amount participants remembered – 57% of text written in Sans Forgetica compared with 50% in a plain Arial’.
That’s impressive, and the newspaper articles and RMIT’s website stress the science behind Sans Forgetica – a combination of design principles and cognitive theory – so my first thought was to look a little more closely at the research.
But so far I haven’t managed to track down the science behind this. They haven’t rushed the launch, as there’s a series of well-produced videos, press releases and fonts to download, but there seems to be no link to a research report, and the publications listed in the relevant staff profiles on the RMIT website don’t include anything on this topic.
I’ve emailed to request it, and will review it when it arrives. But in the meantime here are some of my concerns.
I’ll be asking questions like: What exactly are the design principles? How statistically significant is the data? How did they test memory? How long were the texts the students had to remember? Were they told they would be tested? Did they ask them to reproduce the texts word for word? If not, how did they score their responses? Did the text require any processing or transformation of the content? How long was the gap between the stimulus and the test? Did they just compare font legibility, or did they also include other ways to slow readers down?
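On the statistical significance question, the press reports give only headline figures (57% vs 50%, about 400 students), so any calculation has to rest on guesswork. As a rough illustration only, here is a standard two-proportion z-test under some loud assumptions: that the 400 students were split evenly into two groups, and that ‘remembered’ was scored as a simple yes/no per item — neither of which we know to be true, which is rather the point.

```python
import math

# Headline figures from the Guardian report; everything else is an ASSUMPTION,
# since the study design has not been published.
p1, p2 = 0.57, 0.50   # reported recall: Sans Forgetica vs plain Arial
n1, n2 = 200, 200     # assumed: ~400 students split into two equal groups

# Pooled two-proportion z-test (treats recall as a binary outcome, which
# may not match how the study actually scored responses).
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value

print(f"z = {z:.2f}, two-sided p = {p_value:.2f}")
```

Under these (possibly quite wrong) assumptions the difference comes out around z ≈ 1.4, p ≈ 0.16 — short of conventional significance. A different design could easily change that, which is exactly why the raw data and methods matter.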
For this to be science, as the headlines claim, there needs to be a clear theoretical underpinning, peer review and enough information for others to be able to reproduce the study elsewhere.
Reading speed has been the classic measure of legibility for many years – so perhaps any sub-legible font would do the trick of slowing people down.
There are loads of semi-legible fonts out there. How about Pointifax, inspired by early dot matrix printers? Or just print it in 8pt grey type like some design books I own.
And research also shows that accuracy declines along with speed. I found this in (unpublished) legibility research I did while working at the Open University in the early 80s, and Pat Wright published a classic study of proof-reading on screen vs paper which showed the same thing. In 1983 computer screens were pretty illegible (not far off the Pointifax sample above), and Pat showed that not only did proof-readers slow down when reading on screens, but they failed to spot as many errors.*
I can understand that it can be desirable to slow the reader down, or at least for the reader to slow themselves down – that’s at the heart of higher order reading skills. Skilled readers change pace, re-read passages, make notes, stop and think, but they do it in a self-aware, deliberate way using metacognitive skills.
But it appears the RMIT researchers are deliberately diverting the reader’s working memory away from grappling with content towards grappling with the font. This seems odd and counterintuitive, so I would have liked to see some other conditions in the research – other strategies for slowing people down or encouraging metacognition and self-directed learning.
Classically these would include inserted test questions, activities or even just writing well… Or telling people to stop and think – the Open University in the 1970s used ‘student stoppers’ – bars across the page that signalled this was a good point to stop and reflect on what you’ve just read.
In fact, based admittedly only on its Wikipedia entry, ‘desirable difficulty’ as a teaching strategy appears to be much more akin to these techniques – test questions and flashcards are mentioned.
Encouragingly, in one report (on the website www.educationreview.com.au), the designer, Stephen Banham, points out that the font is not meant for long passages of text:
‘It’s only meant for very small [sections of] text. It could be a small quotation or a particular line of text that a student wants to remember. The more sparingly you use it the greater the power it will continue to have.’
But the report goes on to say, ‘Banham added that the impact of the font is tested but not proven, and said he’s interested to see how it’s used and what life it has after it’s released to students.’
Tested but not proven… an interesting perspective, and not one conveyed by the press coverage.
*Wright, P. & Lickorish, A. (1983). Proof-reading texts on screen and paper. Behaviour & Information Technology, 2:3, 227–235. DOI: 10.1080/01449298308914479