
More on illegible fonts and learning

5 October, 2018 at 5:17 pm

I’m delighted to have heard back from Stephen Banham at RMIT who says that this research is in its infancy and that research papers will follow over the next six months or so. 

Tweeting about Sans Forgetica brought some precedents to light. In particular, Dominique Joseph alerted me to previous discussions in the plain language community and cited this paper, which itself contains quite a number of earlier references.

Diemand-Yauman, C., Oppenheimer, D., and Vaughan, E. (2011). Fortune favors the Bold (and the Italicized): Effects of disfluency on educational outcomes. Cognition, 118 (1), 111-115.

As the typography in the title shows (well done, Cognition, for letting its hair down on this occasion, although you could have aligned the x-heights), this study also used less legible type to affect learning. Here’s a good summary and discussion of it:

And Gianni Ribeiro sent me a link to a critique of this team’s work by Meyer et al entitled ‘Disfluent Fonts Don’t Help People Solve Math Problems’.

My reading of the Sans Forgetica team’s press release is that they see the font change as (presumably) noticeable by the reader, but that the difficulty it causes affects them in a way that they are not actually aware of (that is, deeper cognitive processing takes place while they are struggling to read the text surface).

However, Diemand-Yauman and his colleagues seem to suggest that readers see the font change as an explicit signal that they need to interpret and potentially act upon:

‘Importantly, disfluency can function as a cue that one may not have mastery over material (for a review, see Alter and Oppenheimer (2009)). For example, studies have shown that fluency is highly related to people’s confidence in their ability to later remember new information (e.g. Castel, McCabe, & Roediger, 2007). To the extent that a person is less confident in how well they have learned the material, they are likely to engage in more effortful and elaborative processing styles (Alter et al., 2007).’

So, assuming the RMIT finding is valid, we seem to have a debate about how the illegible font is acting on the reader – whether in a deep unarticulated way, or at the surface level of deliberate reader choices.

Meyer et al characterise it like this:

‘Many distinguish intuitive thoughts, released merely by exposure to stimuli, from reflective thoughts, occurring after deliberate deployment of additional operations (Shweder, 1977).’

They do an extensive review of the Alter et al (2007) study, which they report has been cited numerous times, including in popular books by Malcolm Gladwell and Daniel Kahneman. Their paper is a detailed meta-review of a number of studies that try to replicate the effect, and a detailed critique of the methodologies employed. Their conclusion is clear from the paper’s abstract:

Prior research suggests that reducing font clarity can cause people to consider printed information more carefully. The most famous demonstration showed that participants were more likely to solve counterintuitive math problems when they were printed in hard-to-read font. However, after pooling data from that experiment with 16 attempts to replicate it, we find no effect on solution rates. We examine potential moderating variables, including cognitive ability, presentation format, and experimental setting, but we find no evidence of a disfluent font benefit under any conditions. More generally, though disfluent fonts slightly increase response times, we find little evidence that they activate analytic reasoning.
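The pooling approach the abstract describes can be sketched with a toy example. The numbers below are invented purely for illustration (Meyer et al’s actual data are in their paper); the point is only the method: sum the raw counts across replications before comparing overall solution rates, rather than eyeballing study-by-study differences.

```python
from math import sqrt

# Hypothetical replication data (illustrative only, NOT Meyer et al's numbers):
# (solved_fluent, n_fluent, solved_disfluent, n_disfluent) for each study
studies = [
    (24, 100, 26, 100),
    (30, 120, 29, 120),
    (18, 80, 17, 80),
]

# Pool raw counts across all studies, as the abstract describes
sf = sum(s[0] for s in studies)  # total solved, fluent font
nf = sum(s[1] for s in studies)  # total participants, fluent font
sd = sum(s[2] for s in studies)  # total solved, disfluent font
nd = sum(s[3] for s in studies)  # total participants, disfluent font

p_flu, p_dis = sf / nf, sd / nd
diff = p_dis - p_flu
se = sqrt(p_flu * (1 - p_flu) / nf + p_dis * (1 - p_dis) / nd)
print(f"disfluent - fluent = {diff:+.3f} \u00b1 {1.96 * se:.3f} (95% CI)")
```

With these made-up counts the pooled difference is zero, with a confidence interval of roughly ±0.07 — the shape of result Meyer et al report: individual studies bounce around, but the pooled estimate sits on no effect.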

In my view this is the level of review and critical thinking required before research results are released into the wild.

Ignoring for now this rather devastating demolition job, and going back to the Sans Forgetica and Diemand-Yauman findings… if disfluency is a signal to process the content differently, then it seems no different from other kinds of cues which do not involve illegibility or disfluency – for example, highlighting something in colour, or a teacher saying: ‘make sure you get this, because it’ll come up in the test’.

In fact, I can’t see how the font variation can go un-hypothesised by the reader, who is bound to ask themselves ‘why are these words in bold/italic/a crazy font?’

And, by the way, I italicised the word ‘can’ in the last sentence to emphasise it, not so you would remember just that word. Font changes already exist in our writing system, and have generally agreed functions.

In the world of instructional research, typography has occasional moments in the sun. This one reminds me of 1980s work on ‘typographic cueing’ (for example, Glynn 1978) which also used highlighting to signal important concepts. A problem with many such studies is that text is already visual as well as verbal. Typography and layout already exist and are used by readers to navigate documents and ideas, and to read actively and strategically.

And underlying all of this is an often unacknowledged debate about whether learners are passive sponges, soaking in knowledge (squeezed out later in a test, as proof of learning) or whether they are active participants in education.

Instructional designers have been debating this for many years. When I first worked alongside them at the Open University in the 1970s, educational psychology was in transition from behaviourism to cognitive theories. Behaviourists essentially saw humans as a sophisticated form of rat, whose responses to stimuli (rewards and punishments) could be studied and manipulated. They looked at observable behaviours, rather than speculating too much about invisible cognitive processes. In the educational context, this led to ‘programmed learning’, in which content was learned in tiny steps, with success rewarded along the way. With the exception of certain types of industrial training, this led nowhere.

An influential researcher at the time was Ernst Rothkopf,* whose theory of ‘mathemagenic behaviours’ suggested that there are behaviours that give rise to learning, which can be induced or encouraged, even though not observed. Mathemagenic behaviours could be encouraged through, among other things, frequent inserted questions in text, which appeared to influence learning not only of the topics thus highlighted, but of other parts of the text also. Many, many studies were published on the topic.

I only mention it in order to introduce a famous review of the theory by Ronald Carver (1972) – a very readable and quite excoriating critique that’s a great introduction to instructional research at that time. I like his conclusion that ‘It appears it would be a questionable use of the practical decision-maker’s time for him to wade through this recent research since it is mainly irrelevant to most applied situations.’ Still applies…

It’s relevant to this current discussion for three reasons.

Firstly, Carver is firmly convinced that the speed at which students read, and the time they spend on text, affect learning – so this supports the Sans Forgetica team’s view that slowing people down is a good thing. He cites a very early paper by Green (1931), who first pointed this out, and criticises numerous researchers for failing to control for this and report on it.

Secondly, he brings the learner’s own strategy into the foreground. He talks of ‘self-directed reading as a problem-solving process’ and of the reader’s ‘plan’ or ‘program’ being of primary interest, rather than speculation based on observed behaviours.

Lastly, he points out that a statistical difference in a lab experiment is not in itself of value unless it relates to a theory (and I would extend this to: unless it survives in a practical environment). Without the theory, which enables generalisation, the result doesn’t matter.


Alter, A. L., & Oppenheimer, D. M. (2009). Uniting the tribes of fluency to form a metacognitive nation. Personality and Social Psychology Review, 13: 219–235.

Alter, A. L., Oppenheimer, D. M., Epley, N., & Eyre, R. (2007). Overcoming intuition: Metacognitive difficulty activates analytic reasoning. Journal of Experimental Psychology: General, 136: 569–576.

Carver, R. P. (1972). A critical review of mathemagenic behaviors and effect of questions upon the retention of prose materials. Journal of Reading Behavior, 4: 93–119.

Castel, A. D., McCabe, D. P., & Roediger, H. L. III (2007). Illusions of competence and overestimation of associative memory for identical items: Evidence from judgments of learning. Psychonomic Bulletin and Review, 14: 107–111.

Diemand-Yauman, C., Oppenheimer, D. M., & Vaughan, E. (2011). Fortune favors the Bold (and the Italicized): Effects of disfluency on educational outcomes. Cognition, 118(1): 111–115.

Glynn, S. M. (1978). Capturing readers’ attention by means of typographical cuing strategies. Educational Technology, 18(11): 7–12.

Green, E. B. (1931). Effectiveness of various rates of silent reading of college students. Journal of Applied Psychology, 15: 214–227.

Meyer, A., Frederick, S., Burnham, T. C., Guevara Pinto, J. D., Boyer, T. W., Ball, L. J., Pennycook, G., Ackerman, R., Thompson, V. A., & Schuldt, J. P. (2015). Disfluent fonts don’t help people solve math problems. Journal of Experimental Psychology: General, 144(2): e16–e30.

Rothkopf, E. Z. (1970). The concept of mathemagenic activities. Review of Educational Research, 40: 325–336.

Shweder, R. A. (1977). Likeness and likelihood in everyday thought: Magical thinking in judgments about personality. Current Anthropology, 18: 637–658.

*In case I’m misunderstood, my memories of Ernst (who I met at several conferences) were of an exceptionally warm, generous and approachable man who thought deeply about teaching and learning. I often quote one of his observations in support of the importance of layout – that people often remember things from the position on the page where they read them.

Rothkopf, E. Z. (1971). Incidental memory for the location of information in text. Journal of Verbal Learning & Verbal Behavior, 10: 608–613.

Sans Forgetica… hmmm

4 October, 2018 at 3:19 pm

Newspapers the world over have featured a new font developed by RMIT, the Australian university, which claims to aid learning. It’s got a great name, Sans Forgetica, but its designer’s claim sounded preposterous enough for me to check it wasn’t published on 1 April.

Designed to be slightly illegible, it slows readers down, and the claimed effect is that they learn more, using a principle known as ‘desirable difficulty’.

According to the RMIT press release ‘Sans Forgetica has varying degrees of ‘distinctiveness’ built in that subvert many of the design principles normally associated with conventional typography. These degrees of distinctiveness cause readers to dwell longer on each word, giving the brain more time to engage in deeper cognitive processing, to enhance information retention.’

But surely the brain can’t engage in deeper cognitive processing if it’s wasting capacity simply trying to decipher the words.

Of course, I may be betraying my ignorance of cognitive theory here. Indeed, according to a report in the Guardian, ‘about 400 university students have been involved in a study that found a small increase in the amount participants remembered – 57% of text written in Sans Forgetica compared with 50% in a plain Arial’.

That’s impressive, and the newspaper articles and RMIT’s website stress the science behind Sans Forgetica – a combination of design principles and cognitive theory – so my first thought was to look a little more closely at the research.

But so far I haven’t managed to track down the science behind this. They haven’t rushed the launch, as there’s a series of well-produced videos, press releases and fonts to download, but there seems to be no link to a research report, and the publications listed in the relevant staff profiles on the RMIT website don’t include anything on this topic.

I’ve emailed to request it, and will review it when it arrives. But in the meantime here are some of my concerns.

I’ll be asking questions like: What exactly are the design principles? How statistically significant is the data? How did they test memory? How long were the texts the students had to remember? Were they told they would be tested? Did they ask them to reproduce the texts word for word? If not, how did they score their responses? Did the text require any processing or transformation of the content? How long was the gap between the stimulus and the test? Did they just compare font legibility, or did they also include other ways to slow readers down?
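One of those questions – statistical significance – can at least be sanity-checked from the figures in the Guardian report. A minimal sketch, assuming (and this is only an assumption, since the design isn’t published) that the roughly 400 students were split evenly into two groups of 200 and that the percentages are simple recall rates:

```python
from math import erf, sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test: is a difference in rates bigger than chance?"""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Guardian figures: 57% recall (Sans Forgetica) vs 50% (Arial);
# the 200/200 split is my assumption, not something RMIT has reported
z, p = two_proportion_z(0.57, 200, 0.50, 200)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Under these assumed numbers, z comes out around 1.4 and the p-value around 0.16 – short of the conventional 0.05 threshold. The real design may well differ, but that is exactly why the missing detail matters.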

For this to be science, as the headlines claim, there needs to be a clear theoretical underpinning, peer review and enough information for others to be able to reproduce the study elsewhere.

Reading speed has been the classic measure of legibility for many years – so perhaps any sub-legible font would do the trick of slowing people down. 

There are loads of semi-legible fonts out there. How about Pointifax, inspired by early dot matrix printers? Or just print it in 8pt grey type like some design books I own.

And research also shows that accuracy declines along with speed. I found this in (unpublished) legibility research I did while working at the Open University in the early 80s, and Pat Wright published a classic study of proof-reading on screen vs paper which showed the same thing. In 1983 computer screens were pretty illegible (not far off the Pointifax sample above), and Pat showed that not only did proof-readers slow down when reading on screens, but they failed to spot as many errors.*

I can understand that it can be desirable to slow the reader down, or at least for the reader to slow themselves down – that’s at the heart of higher order reading skills. Skilled readers change pace, re-read passages, make notes, stop and think, but they do it in a self-aware, deliberate way using metacognitive skills.

But it appears the RMIT researchers are deliberately diverting the reader’s working memory away from grappling with content towards grappling with the font. This seems odd and counterintuitive, so I would have liked to see some other conditions in the research – other strategies for slowing people down or encouraging metacognition and self-directed learning.

Classically these would include inserted test questions, activities or even just writing well… Or telling people to stop and think – the Open University in the 1970s used ‘student stoppers’ – bars across the page that signalled this was a good point to stop and reflect on what you’ve just read.

In fact, based admittedly only on its Wikipedia entry, ‘desirable difficulty’ as a teaching strategy appears to be much more akin to these techniques – test questions and flashcards are mentioned.

Encouragingly, in one report the designer, Stephen Banham, points out that the font is not meant for long passages of text:

‘It’s only meant for very small [sections of] text. It could be a small quotation or a particular line of text that a student wants to remember. The more sparingly you use it the greater the power it will continue to have.’

But the report goes on to say, ‘Banham added that the impact of the font is tested but not proven, and said he’s interested to see how it’s used and what life it has after it’s released to students.’

Tested but not proven… an interesting perspective, and not one conveyed by the press coverage.


*Wright, P., & Lickorish, A. (1983). Proof-reading texts on screen and paper. Behaviour & Information Technology, 2(3): 227–235. DOI: 10.1080/01449298308914479

Information Design Solutions and Problems: one day symposium

17 June, 2018 at 4:10 pm

On 9 September we’re holding a one day get together of information designers, just before the Summer School starts, in Bath. 

The Executive Board of the IIID will be in town for a meeting, and the summer school participants will be arriving. So we thought: why don’t we put them together in a room to make short presentations about the issues they face, and how information design can help.

One exciting thing about the summer school for me is that many participants are not trained as information designers – we’ve had lawyers, medics, technical writers, civil servants, statisticians and more. They’ve all spotted that information design can make a contribution to their work, and so this day will help reveal new problems which information designers can help to address. 

Anyone can come if we have room – just write us an email at

That consultation on small print: latest news (none)

30 March, 2018 at 5:18 pm

Today The Times reports:

“Tens of millions of pounds of taxpayers’ money has been squandered by the government on public consultations that have come to nothing, The Times can reveal.

Ministers have commissioned more than 1,600 since the Conservative Party’s victory at the 2015 election, an average of more than two every working day. More than 500, almost a third, have not been completed, with officials saying that they are still “analysing feedback”. These include at least 202 consultations started more than two years ago.”

I’m not surprised – nearly two years ago we took part in a government consultation on small print. See this post. In July last year I prodded the relevant department, who replied in the manner of those phone-queue messages that reassure you that ‘your call is important to us…’.

I can actually report a little more now, because I put in a Freedom of Information Act request, as follows: “It is now an exceptionally long time since responses to the call for evidence on Improving Terms and Conditions were collected in April 2016. I am therefore making a Freedom of Information request for: 

  • a list of the organisations and/or individuals who responded to the consultation 
  • a copy of their responses 
  • your timetable for developing and publishing the Government’s response 
  • a draft of your response, if it exists yet.” 


I did get the first item, a list of responders (see below) but that’s about it. Fans of Yes Minister might appreciate the reason why I can’t see the timetable for the government’s response: “The department holds no information in scope of this part of your request.” Translation: we have no timetable.

I didn’t get the draft government response because of ‘the need to protect the policy development on which the policy formulation process is not complete’. Translation: well, you get the idea.

And I didn’t get a copy of the responses on the grounds that, for data protection reasons, they would have to redact any personal data such as names and contact information of responders: 

“The Department considers that in accordance with Section 14(1) of the FOIA, it is not required to respond to this part of your request due to the disproportionate level of disruption to the Department’s mainstream activities that would be caused in seeking to comply. In considering the Section 14(1) exemption we have sought to weigh the purpose and value of the request to make this information available against the time and resources that would need to be diverted from other work to meet such a request. 

In relation to the burden on the Department in meeting the request, you should be aware that the Department received in excess of 500 responses to the Call for Evidence. All of these responses will contain the personal data of responders and some may also contain information which is confidential and commercially sensitive, the disclosure of which could prejudice the commercial interests of those who took part. If we were to release this information, every individual response would need to be considered against the exemptions in Section 40 (personal information), Section 41 (Information provided in confidence) and Section 43 (Commercial Interests) of the Act. Any responses released would then require redaction of any exempt information. We have therefore concluded that the burden in identifying and redacting exempted material across this volume of documentation would be disproportionate.” 

But I can report that the following organisations responded:

  • Association of Accounting Technicians 
  • O2 
  • University of Hertfordshire 
  • Sport and Recreation Alliance 
  • British Gas 
  • British Telecom 
  • Vodafone 
  • Lloyds 
  • UPS 
  • SCS Sofa Carpet Specialists 
  • Sky Scanner 
  • The Simplification Centre 
  • Sky UK 
  • EE 
  • Citizens Advice 
  • Money Saving Expert 
  • Trading Standards – Glasgow 
  • Association of Illustrators 
  • Citizens Advice Scotland 
  • Chartered Trading Standards Institute 
  • Institute of Consumer Affairs 
  • Which? 
  • Barclays 
  • Hargreaves Lansdown 
  • British Vehicle Retail & Leasing Association 
  • Residential Landlords Association 
  • Home Retail Group 
  • JFR 
  • Association of British Introduction Agencies 
  • Bongo A Go Go Campervan Hire 
  • CSM Sport & Entertainment 
  • Sheffield Window Centre 
  • Banquet Chocolates Ltd 
  • Money Advice Service 
  • News Media Association 
  • BCS The Chartered Institute for IT 
  • Rosemary Bookkeeping 
  • AXA UK 
  • British Holiday & Home Parks Association 
  • EDF Energy 
  • ABTA 
  • Association for UK Interactive Entertainment 
  • Finance & Leasing Association 
  • Association of Accounting Technicians 
  • Telefonica UK Ltd 
  • OFCOM 
  • Tech UK 
  • Consumer Credit Association 
  • Council of Mortgage Lenders 
  • British Parking Association 
  • University of Hertfordshire Annex A 
  • Information Commissioner’s Office 
  • The Bar Council 
  • Scottish Power 
  • RWE npower 
  • Microsoft Ireland Operations Ltd 
  • British Banking Association 
  • England and Wales Cricket Board 
  • Association of British Credit Union Ltd 
  • Competition Markets Authority 
  • Financial Conduct Authority 
  • Radio Centre 
  • Mydex Data Services 
  • Forum of Private Business 
  • Safetosign 
  • Direct Line Group 
  • MBNA Ltd 
  • Civil Aviation Authority 
  • Rugby Football Union 
  • Luton Borough Council Trading Standards 
  • Building Societies Association 
  • Association of British Insurers 
  • Ombudsman Services 
  • Communisis Group Solutions 
  • Birmingham City Council 
  • Solicitors Regulation Authority 
  • British Horseracing Authority 
  • Hargreaves Lansdown 
  • Institute of Chartered Accountants in England and Wales 

2018 Summer School announced

24 February, 2018 at 11:00 pm

We’re very excited to announce the next summer school, which will be 10-14 September 2018. This year it’s at the University of Bath in the architecture department. This is the UK’s top-rated architecture school and they have a great new teaching space with numerous break-out spaces for small groups to work together. 

Another innovation this year is the advanced stream, for people who’ve been before. Quite a few have requested it (please come, now we’ve organised it!). The programme is still flexible and we’ll listen to any thoughts you have. There will be seminars where we can discuss case studies, including your own work and ideas, and we’d like the project to be something significant we can publish or present at a conference.