Testing assumptions

Museums understand the value of studying their programs and exhibitions. The idea of examining audience behavior and the museum’s impact is familiar. (If you work at a museum, you may be thinking, “Duh, Mr. Obvious.” But in many environments, the idea of studying audience behavior to facilitate better outcomes is entirely foreign and faces real resistance.)

People in the museum world understand that expertise should be subjected to testing because even experts can’t know what they don’t know. Evaluation uncovers assumptions and knowledge gaps.

Evaluation involves studying how people interact with exhibition content. Evaluators seek to gauge comprehension and uncover opportunities to improve exhibition design and materials.

Evaluators and user experience (UX) designers share similar goals. They want to improve a product — whether IRL or digital — to facilitate better outcomes for the audience and the organization or business.

One of the more valuable tools for uncovering opportunities and surprising new perspectives is user testing.

What is user testing?

User testing lets you observe real people who have no experience with (or vested interest in) your organization as they interact with your products.

(If you aren’t already thinking about your digital content and design systems as products, I hope you’ll try it out. Thinking of content as a product can introduce a new level of accountability. It can clarify things — you start asking, “Why are we doing this?” and “Can we measure the value of this?” a whole lot more.)

User testing can be moderated or unmoderated.

Unmoderated user testing is pretty inexpensive and less prone to moderator bias. In an unmoderated test, a user works through a series of predetermined tasks, sharing their thoughts and feelings as they go. The session is recorded for later review.

A moderated test is just what it sounds like: a facilitator guides the session in real time. The advantage is that the moderator can jump in and ask follow-up questions (“tell me more about that” or “where have you encountered that before?”), so the session can take on the character of an interview.

BRAFYA

When I ask museum folks if they’re studying how people are using their digital products, one of the common answers I hear goes something like:

“We did some testing when we did our Big Redesign A Few Years Ago, so I think we worked out the kinks then.”

I heard this so often, I began thinking of it as BRAFYA.

(I know. I need to work on my acronym game.)

The frequency of the response suggested that, while museums may understand the value of regularly evaluating programs and exhibitions in terms of audience needs, they hadn’t extended that practice to their digital products.

Welcome to the jungle

Think of the museum as an ecosystem. The environment itself remains much the same — a forest remains a forest; the museum’s building maintains its core structure. But the species in that environment are evolving.

New species invade; others go extinct. Exhibitions come and go.

The same goes for a museum’s digital content. The website domain and core structure may remain the same for years at a time, but the content and systems that support interactions evolve.

It’s hard to imagine a museum closing itself off to audience insights for years at a time — no summative evaluations, no exhibition surveys, no focus groups.

So why does there seem to be an assumption that a museum’s ever-changing digital content, which supports revenue-generating tasks online, doesn’t need similar attention? Why is it okay to (maybe) run some user tests during a major website redesign and never again?

That would be a bit like only conducting surveys or evaluations around whatever exhibitions happened to be going on during a major building renovation.

Now, you might be thinking:

“We’re always talking with visitor services and our designers … We don’t hear many problems about our website. This doesn’t really apply to us.”

You might be surprised.

Much like analytics, user testing uncovers problems and opportunities that will never show up in a survey or at your help desk.

I say problems and opportunities because testing isn’t just about uncovering technical issues. I’ve seen exciting new ideas and marketing opportunities come out of listening to visitors verbalize their thoughts as they interact with content.

What you find through user testing is often actionable and has a direct influence on earned income. It’s common to find quick fixes related to interaction design or comprehension. In other cases, you’ll catch glimpses of entirely new value propositions that can shape future marketing efforts.

On Friday, I’ll share a glimpse of what testing can uncover.

Thanks for reading,

Kyle

Kyle Bowen