User testing summary & videos
One thing that came up in each test we ran: people may not know what an arboretum is, and The Morton’s website doesn’t provide a clear answer up front.
Testing suggested that some users didn’t realize The Morton is an arboretum, and in some cases didn’t know what an arboretum is at all.
This screenshot of the home page on a mobile device highlights that there may be a knowledge gap that isn’t being addressed by the content.
Appeals for support are followed by a list of events — but for truly novice visitors, there’s no simple introduction.
As with all recommendations, testing is best. Would an overarching value proposition and a simple introduction to the role of an arboretum lead to better outcomes? Testing suggests this may be an approach worth exploring.
Eliminate dead ends. Some users visit a particular event page and then decide they want to attend that event. Try including a call to action at the end of each event page for purchasing tickets to the event, or at least general admission tickets. Currently, event pages end with a reference to the event’s cost but no link to take action, so users must return to the top-level navigation to find ticketing information. This friction is especially pronounced on mobile, where testing suggests a considerable portion of people may avoid using the navigation entirely.
Some users rely on images when the text alone doesn’t make clear what an event involves. Use images of real people participating in events wherever possible to help visitors picture the character of each activity.
In testing other membership organizations’ websites, we’ve found that some users don’t think to look for membership options under labels like “Support Us”. Users may think of membership as a benefit for themselves rather than for the organization, and may grow frustrated looking under a category or menu item like “Visit”.
I’d recommend tree testing the navigation; short of that, consider breaking membership out as its own top-level navigation link in the menu.
On the membership page, the member levels are listed from most expensive to least, which can be a good idea: the most expensive option provides an anchor and makes the less expensive options more appealing.
However, in testing, no one clicked to view benefit details. Run heat maps on the membership page to see if this important information is being overlooked at scale.
The Morton might also consider simplifying the list of benefits for each membership level. Describe only what is different between the options.
To reduce cognitive burden, consider emphasizing differences, rather than repeating similarities in benefit descriptions.
Software companies have this nailed down. They present comparison tables that say, in effect, “Option B includes everything in Option A, plus X, Y, and Z.”
This shortens each benefits list and lets people make a choice more quickly.
Users may simultaneously understand that The Morton uses third-party companies to handle checkout and still fault The Morton for those companies’ design choices. One user said, “I don’t know MercuryPay, I don’t know that company …” and went on to question The Morton’s design decisions on a checkout form that was actually provided by MercuryPay.
In other words, the usability and appearance of checkout pages designed by third parties may negatively impact users’ trust in The Morton, even when users are aware that The Morton is using a third-party tool.
Feel free to watch and download the full videos below. (To download, you can click the share button while playing a video and choose “download”.) The videos will be available here for at least 30 days from the date of our scheduled evaluation.
A FEW THINGS TO KEEP IN MIND ABOUT LIVE EVALUATION USER TESTS
The participants in these tests are not segmented or screened. The advantage is that we can be sure these users are first-time visitors, which makes them more likely to catch outstanding usability issues that would surface for anyone who might use your website.
The people taking these tests have at least some experience testing websites. They are not professional designers, but they may have more experience using the web than some of the people who will be on your site.
Because I have limited access to your payment gateway and website, we weren’t able to fully test transactions and email lifecycle content.
Occasionally, user tests reflect the personal opinions of the people taking the test. (Things like: “I like the color” or “I hate this font”.) Take this with a grain of salt. I review these videos carefully and if I find that the personal taste of an individual reflects a real design or credibility issue that might hamper someone’s interactions with the content, I’ll let you know.
Future testing with actual users — like your members or patrons — will help uncover more opportunities to improve the website’s content and user experience.