User research vs audience research

Last Wednesday, I asked why so few museums seem to be regularly testing their digital products. List member Randi Korn — founding director of RK&A and author of Intentional Practice for Museums: A Guide for Maximizing Impact — replied to that email (shared with permission):

There is an underlying assumption in what you wrote that all museums evaluate their exhibits and programs. This isn’t the case. Some do, most don’t. Some museums evaluate their work because they are told they must—from the foundation or agency that gave them the money. Others do so because they are truly interested in the effect of their work on the public and want to learn from their work. 

Randi’s comments made me realize that I was using the word “evaluation” to describe a wide range of activities — from exhibition evaluation to the surveys and focus groups that many museums conduct to gauge audience satisfaction. I was conflating many motivations and intentions — from measuring exhibition impact on learning to benchmarking audience satisfaction — and calling that “evaluation.”

Randi continues:

Testing digital experiences, of course, should be part of a digital project; like exhibition evaluation, some museums test them but most do not. Museums need to set aside resources to evaluate digital products (between 10% and 20% of a budget), but the truth is, most would rather put that money towards the product; the same is true with funders—they want evaluation but do not want to siphon off programming money. Things may change when these two actions happen simultaneously: 1) a funder says, “demonstrate through evaluation that you are achieving what you said you would achieve,” and 2) a funder provides the resources to do so. 

I’ve been thinking a lot about this over the past several days and wanted to run some thoughts by you.

What is the relationship between the sort of evaluation a user experience (UX) researcher does and the exhibition evaluations that museums are familiar with?

The goal of any evaluation in user research — that is, studying how products or services are understood and used by an audience — is to generate more revenue and/or decrease costs for the organization. You could argue that improving the user experience results in a happier customer, which has some value in itself, but ultimately the goal isn’t just a better experience. Improving user experience is a way to achieve economic benefits for the organization. 

As Randi points out, museums often conduct evaluations to gauge audience impact and learning and/or to meet funding requirements. Audience research can improve the exhibition, or it can help get more funding.

Maybe that’s the big difference between the two.

Museum evaluations aim to improve audience outcomes (education/impact) and facilitate economic support from funders.

User research aims to improve user experience and facilitate economic support from patrons.

When a museum evaluates an exhibition, it may find opportunities to improve an exhibit to produce a better experience for visitors, but I imagine those improvements will have little impact on earned revenue for the museum — certainly nothing all that measurable. The benefits of user research, on the other hand, are measured in dollars and cents.

That’s why, when I hear that museums would prefer to spend as little as possible on studying their digital products, I’m both disappointed and delighted.

I’m disappointed because neglecting digital seems short-sighted; I’m delighted because this confirms my hunch that there is so much room for improvement in this space. It seems that the sort of user research that’s brought tremendous financial gains to other industries — like banking, ecommerce, and tech — has not reached many visitation-based membership organizations. 

On Wednesday, I’ll share the two main benefits I think user research can bring to these types of organizations.

Thanks for reading,


PS. If you think I’m right on — or way off — hit reply and let me know. Comments like those above from Randi have been incredibly valuable to me. I’d love to hear from you, too.

Kyle Bowen