How To Test And Measure Content In UX?
The goal of content design is to reduce confusion and improve clarity. Yet it's often difficult to pinpoint a problem, as user feedback tends not to be specific enough. But we can use a few simple techniques to assess how users understand and perceive content. Let's take a look.
Quick housekeeping note: we're about to launch a new UX workshops tour with stops in Austria, Germany, the UK, the US, Switzerland, the Czech Republic, and Poland (and also online, of course), with practical workshops on accessibility, designing AI products, and design patterns.
The next stop is a new online cohort of Smart Interface Design Patterns 🍣 next Friday, March 7, with all-around UX deep dives into complex interfaces, accessibility, forms, tables, charts, and dashboards. Get the last tickets (15% off with coupon LINKEDIN).
And if you'd like to host a workshop at your venue, or have your team join in, please get in touch; my email is vitaly[@]smashingconf.com.
As designers, we need to address doubts and concerns before they even arise. Perhaps even without an FAQ.
Content testing is a simple way to test the clarity and understanding of the content on a page, be it a paragraph of text, a user flow, a dashboard, or anything in between. Our goal is to understand how well users actually perceive the content that we present to them.
It's not only about whether users get confused or fail to find the right answer on a page, but also about whether our content clearly and precisely articulates what we actually want to communicate.
🍌 Banana Testing
A great way to test how well your design matches users' mental models is banana testing. We replace all key actions with the word "Banana", then ask users to suggest what each action could prompt.
Not only does it tell you whether key actions are understood immediately and whether they are in the right place, but also whether your icons are helpful, and whether interactive elements such as links and buttons are perceived as such.
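If your prototype runs in the browser, you can automate the swap. Below is a minimal sketch in TypeScript; the selector list is an assumption, so tune it to however your prototype renders its actions:

```typescript
// Banana test helper: swap the visible labels of key actions for "Banana",
// so participants must infer each action from context, placement, and icons.
// Assumption: actions are rendered as <button>, <a>, or elements with role="button".
const ACTION_SELECTOR = 'button, a, [role="button"]';

function bananify(root: ParentNode = document): Map<Element, string> {
  const originals = new Map<Element, string>();
  root.querySelectorAll(ACTION_SELECTOR).forEach((el) => {
    const label = el.textContent?.trim();
    if (!label) return; // leave icon-only controls alone: they test icon comprehension
    originals.set(el, label);
    el.textContent = "Banana";
  });
  return originals; // keep the original labels so they can be restored
}

function restore(originals: Map<Element, string>): void {
  originals.forEach((label, el) => {
    el.textContent = label;
  });
}
```

Run bananify() in the browser console before the session, note what participants guess for each "Banana", and call restore() afterwards.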
Content Heatmapping
One reliable technique for assessing content is content heatmapping. We give participants a task, then ask them to highlight things that are clear or confusing. We can define other dimensions or lenses as well, e.g. phrases that increase or decrease their confidence.
Then we map all highlights into a heatmap to identify patterns and trends. You can run it with print-outs in person, but it also works remotely in FigJam or Miro, as long as the tool has a highlighter feature.
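If you capture highlights digitally, aggregating them is straightforward: count how many participants marked each character of the copy, then surface the hottest spans. A minimal sketch in TypeScript, assuming highlights are exported as character ranges per participant (the data shape is an assumption, not a standard export format):

```typescript
// One participant's highlights, as character ranges over the copy (end exclusive).
type Highlight = { start: number; end: number };

// Count how many participants highlighted each character of the copy.
function heatmap(copy: string, sessions: Highlight[][]): number[] {
  const counts = new Array<number>(copy.length).fill(0);
  for (const highlights of sessions) {
    for (const { start, end } of highlights) {
      for (let i = start; i < Math.min(end, copy.length); i++) counts[i]++;
    }
  }
  return counts;
}

// Report contiguous spans that at least `threshold` participants marked.
function hotSpans(copy: string, counts: number[], threshold: number): string[] {
  const spans: string[] = [];
  let start = -1;
  for (let i = 0; i <= copy.length; i++) {
    const hot = i < copy.length && counts[i] >= threshold;
    if (hot && start === -1) start = i;
    if (!hot && start !== -1) {
      spans.push(copy.slice(start, i).trim());
      start = -1;
    }
  }
  return spans.filter(Boolean); // drop whitespace-only spans
}
```

Run one heatmap per lens ("confusing", "confidence-boosting", and so on), so that clear and unclear passages don't cancel each other out.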
Run Moderated Testing Sessions
The little techniques above help you discover content issues, but they don't tell you what is missing from the content, or what doubts, concerns, and issues users have with it. For that, we need to dig deeper into user needs.
Too often, users will say that a page is "clear and well-organized", but when you ask specific questions, you notice that their understanding is vastly different from what you were trying to put into the spotlight.
Such insights rarely surface in unmoderated sessions; it's much more effective to observe behavior and ask questions on the spot, whether in person or remotely.
Test Concepts, Not Words
Remove doubts before they arise by front-loading key details.
Before testing, we need to know what we want to learn. First, write up a plan with goals, customers, questions, and a script. Then test broader concepts rather than tweaking words alone. In the session, avoid think-aloud protocols, as speaking while reading is usually not how people consume content. Instead, ask questions and wait silently.
After the task is completed, ask users to explain the product, flow, and concepts back to you. But don't ask them what they like, prefer, feel, or think. And whenever possible, avoid the word "content" in testing: people often perceive it differently.
Choosing The Right Way To Test
There are plenty of different tests that you could use:
🍌 Banana test — Replace key actions with “bananas”, ask to explain.
🕳️ Cloze test — Remove words from your copy, ask users to fill in the blanks (see the sketch after this list).
🤔 Reaction cards — Write up emotions on 25 cards, ask users to choose the ones that match their experience.
🃏 Card sorting — Ask users to group topics into meaningful categories.
🖍️ Highlighting — Ask users to highlight helpful or confusing words.
🥊 Competitive testing — Ask users to explain competitors’ pages.
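Some of these are easy to generate automatically. A Cloze test, for example, blanks out every nth word of the copy and checks how many blanks readers can restore; a common rule of thumb (which you may want to calibrate for your own content) is that restoring around 60% of blanks indicates reasonable comprehension. A minimal sketch in TypeScript:

```typescript
// Generate a Cloze test by blanking out every nth word of the copy.
// The every-5th-word interval is a common convention, not a fixed rule.
function clozeTest(copy: string, nth = 5): { text: string; answers: string[] } {
  const answers: string[] = [];
  let wordIndex = 0;
  const text = copy.replace(/\p{L}+/gu, (word) => {
    wordIndex++;
    if (wordIndex % nth === 0) {
      answers.push(word); // keep the removed word for scoring
      return "_____";
    }
    return word;
  });
  return { text, answers };
}

const { text, answers } = clozeTest(
  "Content testing is a simple way to test clarity and understanding of the content on a page."
);
console.log(text);    // every 5th word replaced with "_____"
console.log(answers); // the removed words, for scoring participants' answers
```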
When choosing the right way to test, consider the following guidelines:
💭 Do users understand? Interviews, highlighting, Cloze test
Do we match the mental model? Banana testing, Cloze test
What word works best? Card sorting, A/B testing, tree testing
Why doesn’t it work? Interviews, highlighting, walkthroughs
Do we know user needs? Competitive testing, process mapping
Wrapping Up
In many tasks, there is rarely anything more impactful than the careful selection of words on a page. However, it's not only about the words alone; it's also about the voice and tone that you choose when communicating with customers.
Use the techniques above to test and measure how well people perceive content, but also check how they perceive the end-to-end experience on the site.
Quite often, the right words used in the wrong way on a key page can convey the wrong message or create a suboptimal experience. Even though the rest of the product might perform remarkably well, if a user is blocked on a critical page, they will be gone before you even blink.
Useful Resources
Practical Guide To Content Testing, by Intuit
How To Test Content With Users, by Kate Moran
Five Fun Ways To Test Words, by John Saito
A Simple Technique For Evaluating Content, by Pete Gale
Last Early Birds: Live Measure UX Training
I’ve been spending quite a bit of time reviewing and drafting new sections for the video courses on UX: