Design research night at Crabtree's restaurant

Let’s talk about the survey I shared with you yesterday. You’re probably distributing surveys to your audience, right? So, maybe we can compare notes on surveys in general by looking more closely at this one I’ve made in particular.

Why survey?

Why would I be surveying people who work at museums when the point of this research project is to understand the motivations of museums’ constituents?

The purpose of asking organizational leaders about audience motivations isn’t to find out why people visit, become a member, or donate — it’s to find out what leaders believe leads people to visit, become a member, or donate.

My feeling is that there is a knowledge gap between what organizational leaders believe motivates people to visit and what actually motivates visitors, donors, and members. That hunch is informed by conversations I’ve had with museum decision-makers and my experience working with and in other nonprofits.

The first thing I need to do is to test my beliefs by gathering information from folks at these organizations. The purpose of the debiasing survey is to find out what people believe about audience motivations and to get a glimpse of what might be informing those beliefs.

In short, the survey helps me question my own beliefs and biases.

What to ask and how

Deciding what to ask, and how to ask it, has been more difficult than one might expect.

I can’t just ask questions that speak directly to what I really want to learn. If I did, I’d have to ask things like:

  • “What Jobs do members hire your museum for?”
  • “How does your organization help people become better versions of themselves?”
  • “So, you all been doing much design research lately?”

Trust me — you don’t want to flat out ask people about design research. You can hardly even talk to them about it.

Don’t believe me?

Last Christmas, I tried my design research standup act at Crabtree’s — a restaurant here in the Village of Huntington — during one of their open mic nights.

[Image: open mic night flyer]

None of the jokes landed. Zero.

“What do you get when a bunch of people who have a cold take your survey? Statistical sniffificance.”

“Why did Rudolph’s nose shine so bright? Heat maps suggested that lots of people were interacting with it, but without any user testing no one really nose why.”

A few people got up from their tables and started walking toward the bathrooms. I tugged on the Santa beard I had rented for the evening.

“You know, a lot of you aren’t getting any gifts from Santa this year — just FYI.”

The restaurant manager was approaching. I didn’t have much time.

“How could Santa know what gifts you want? He can’t track your activity since you cranked up the privacy settings in your browser and you’re not leaving him any cookies.”

Christie nearly left me. Jasper demanded a paternity test.

The point is, you don’t want to assume that people will relate to, or will even understand, the vocabulary and ideas you take for granted. Your survey questions have to be super simple.

At the same time, the problem with super simple survey questions is that they can lead to super simple, superficial answers. Few people are going to slow down and think deeply in a survey like this, and simple questions may encourage that fast, shallow thinking.

The most I can likely hope for is that the survey will capture people’s thoughts in their own words and let me suss out patterns from that input. It may also uncover how beliefs vary with visitation numbers, the type of museum, and the individual’s role.
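
For the curious, here’s a rough sketch, in Python, of the kind of pattern-hunting I have in mind. The column names and answers are made up for illustration; this isn’t my actual analysis script.

    import pandas as pd

    # Hypothetical survey export: one row per respondent, with an
    # open-ended answer about why they believe people visit.
    responses = pd.DataFrame([
        {"role": "Director", "museum_type": "Art", "visit_motivation": "our special exhibitions"},
        {"role": "CXO", "museum_type": "History", "visit_motivation": "school field trips"},
        {"role": "Director", "museum_type": "Art", "visit_motivation": "seeing the exhibitions"},
    ])

    # Split the open-ended answers into words and tally them per museum
    # type, so recurring themes float to the top.
    word_counts = (
        responses.assign(word=responses["visit_motivation"].str.lower().str.split())
        .explode("word")
        .groupby(["museum_type", "word"])
        .size()
        .sort_values(ascending=False)
    )
    print(word_counts.head(10))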

I’ll need to do some follow-up interviews with people who took the survey to get a better understanding of their context. Without that added depth, it would be misleading, and even unfair, to compare survey data on organizational leaders’ beliefs to the more in-depth picture I’ll get from interviews with members and donors.

Distribution

I’m primarily using LinkedIn to distribute the survey. LinkedIn’s Sales Navigator lets me find people based on criteria like:

  • Geography: United States
  • Industry: Museums & Institutions
  • Company headcount: Not self-employed
  • Seniority Level: CXO, Director, Partner, VP
  • Function: Administrative, Marketing, Research …
  • Title: Here I enter a bunch of exclusions to help ensure the people are relevant
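
If it helps to see it all in one place, here are the same criteria jotted down as a Python dict. This is purely bookkeeping on my part, since Sales Navigator is configured in its UI, and the title exclusions below are a placeholder rather than my real list:

    # A plain-data record of the Sales Navigator search above.
    search_criteria = {
        "geography": "United States",
        "industry": "Museums & Institutions",
        "company_headcount": "Not self-employed",
        "seniority_level": ["CXO", "Director", "Partner", "VP"],
        "function": ["Administrative", "Marketing", "Research"],  # plus others elided above
        "title_exclusions": ["<exclusion list>"],  # placeholder, not the real list
    }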

That search turns up thousands of results. I fed the results into LinkedProspect, which is supposed to be a sales prospecting tool, but I’m using it to distribute this survey.

LinkedProspect will invite the people in my search to take the survey through a connection request. The nice thing about this is that people don’t have to accept the connection request to take the survey, since the survey link rides along in the invitation message itself.

Ultimately, if some people aren’t qualified to take the survey, I expect the questions themselves will make that clear to them. And if they take it anyway, their answers will likely reveal that I should exclude them from the sample.

I’ve queued up 2,505 people to take the survey using this method; the survey will be sent to 50 people each day. So, assuming I let it run to completion, everyone from this batch will have received the survey by early September.
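
(For the math-inclined: 2,505 people at 50 per day works out to 2,505 ÷ 50 ≈ 51 days of sending, a little over seven weeks, which is where the early-September estimate comes from.)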

Survey design

Many of the decisions about the survey’s design are informed by my experience making forms for clients.

I know enough about creating forms to know that there are a lot of unanswered questions as to how I might improve this survey’s design.

Take the order of the questions, for example. Initially, I put the questions about investment in audience research first. My thinking was that opening with a yes/no question would help people get a foot in the door and build some momentum, making it more likely they would complete the survey.

But asking about past research first could color people’s answers about audience motivations with reflections on their research efforts. Asking about motivations first should give a truer view of how any research might be informing their beliefs. Of course, participants could answer why they believe people visit, join, and donate, then go back and change those answers after responding to the question about research they’ve invested in, but I don’t think that’s likely.

The nice thing about creating things like booking forms is that you can keep redesigning based on user behavior. I can’t really do that with this survey without compromising results, which makes me uneasy.

So far, the completion rate on the survey has been 68%, which is pretty good. In the days ahead, I may share a preview of results with you …

But this email is getting long.

I can feel the restaurant manager approaching.

Thanks for reading,

Kyle