Comprehension testing


How do you successfully usability test content?

Usability testing content is just as important as usability testing designs. We do it to ensure that our content is meaningful, makes sense to our users, and gets them where they need to go. When we decided to usability test content, we realized we didn't quite know where to start; it isn't done often. The methods we looked into were variations of comprehension testing, which included:

We decided to go with a combination of these. Our goal was to validate the idea that federal job announcements written in plain language would perform better than ones that were not (obvious, I know, but trust me: it needed to be quantitatively validated).

Here is the test script we used.

Here is what we showed our users. First, we let them read the original version of the text, before it was edited into plain language. Then we ran the Cloze test: we blanked out every fifth or sixth word and counted how many words users could fill in correctly:

[Screenshot: the original announcement text with every fifth or sixth word blanked for the Cloze test]
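The blanking step described above is mechanical enough to sketch in code. This is not our actual tooling, just a minimal illustration of generating a Cloze passage by replacing every nth word with a blank and keeping an answer key:

```python
def make_cloze(text, n=5):
    """Replace every nth word with a blank; return the passage and an answer key
    mapping word positions to the original words."""
    words = text.split()
    answers = {}
    for i in range(n - 1, len(words), n):
        answers[i] = words[i]
        words[i] = "_____"
    return " ".join(words), answers

# Hypothetical announcement sentence, for illustration only.
passage, key = make_cloze(
    "This position is open to current federal employees who meet the minimum qualifications",
    n=5,
)
# passage: "This position is open _____ current federal employees who _____ the minimum qualifications"
```

Participants' fill-ins are then compared against the answer key to get a comprehension score.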

Then we asked them multiple-choice questions about the text they had just read, to see how well they understood it.

We then ran the same process on the second version of the text, which we had rewritten in plain language:

[Screenshot: the plain-language version of the announcement text]

... and we asked them multiple-choice questions about it too. Again, our goal was to show quantitatively that plain language is more user-friendly than government legal jargon.
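Comparing the two versions comes down to scoring each participant's fill-ins against the answer key. A minimal sketch of that scoring, where a response counts as correct only if it exactly matches the original word (the responses below are invented for illustration, not our study data):

```python
def cloze_score(answer_key, responses):
    """Percentage of blanks filled with the exact original word (case-insensitive)."""
    correct = sum(
        1 for i, word in answer_key.items()
        if responses.get(i, "").strip().lower() == word.lower()
    )
    return 100 * correct / len(answer_key)

# Hypothetical answer key and responses for a four-blank passage.
key = {4: "open", 9: "employees", 14: "apply", 19: "deadline"}
jargon_version = cloze_score(key, {4: "open", 9: "people", 14: "apply", 19: "date"})
plain_version = cloze_score(key, {4: "open", 9: "employees", 14: "apply", 19: "deadline"})
# jargon_version: 50.0, plain_version: 100.0
```

Averaging these scores across participants for each version gives a number the two versions can be compared on; the multiple-choice answers can be tallied the same way.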

Here's what we found:

[Charts: comprehension test results comparing the two versions of the announcement]

Want to talk through this some more? Contact me here!