
User testing from home

Experience + design

Moderated user testing is traditionally done in person. Reading body language, recognising the right moment for a follow-up question, and probing a little further into something the user paused over are all much easier when you’re sitting in the same room.

That being said, different physical locations, timeframes and budgets often limit the ability to conduct testing in person - and recently, with the Covid-19 Alert Levels in New Zealand, remote testing was our only option.

In early April we ran a round of remote, moderated user testing for a project we’re working on. Below is the process we went through, along with the differences we noticed and what we learned from the experience.

Setting up for user testing

Funnily enough, for this particular project we had planned for remote user testing anyway, as physical locations were a barrier and the budget for travel was limited. However, Covid-19 still had an impact:

  • On one hand, at the beginning of the “lockdown” in NZ, people were feeling heightened levels of stress, and many were on a steep technical learning curve as they began working from home and having meetings over video - meaning we had to work hard to create a safe and easy experience for them.
  • On the other hand, arranging times for tests was far easier, and people were willing to stay on for longer because of the extra time they had at home. (It’s worth noting that this was very early on in the New Zealand Alert Levels system, so people were likely feeling much less “Zoom exhaustion”, and were less settled into working from home, than if we were to run the tests now, as we head into Level 2.)

We worked with our client to arrange the user testing sessions, as we would in normal circumstances. Because of the steep learning curve that came with the transition to working from home, the invitation email included very clear instructions that there was no “homework” or preparation to do, and that participants didn’t need any special technology or apps to take part. We also let them know before the test that they would be asked to share their screen during the call and that we would record it, and gave them the option to opt out of this.

Identifying test participants

This was almost exactly the same as in usual testing - our participants were selected based on the website user personas, to ensure we had realistic representation of the users. Our other consideration was, again, the technical capability of the users and their familiarity with Zoom - however, we didn’t include or exclude participants on that basis; it just shaped how we prepped for and interacted with them.

Prototype for testing

This particular test was conducted with wireframes for a new website build, to get early user feedback on the information architecture and key content of the website.

I set up the prototype in Invision, with the key wireframes loaded and linked between key screens. Having trialled other prototyping tools, I keep coming back to Invision for its simplicity and the fact that it fits easily into my existing workflows.

[Gif: a wireframe of a website home page, with the user clicking the menu and navigating to another page.]

As with anything, a couple of tiny tweaks were needed during the testing - in this case, adding some buttons outside the wireframes themselves so participants could more easily jump to certain parts of the prototype, since we weren’t able to drive it for them.

Testing

We used Zoom for all of our tests, as it seems to have emerged as the go-to video calling service through lockdown. Each participant joined the call, was given a brief introduction (the classic user testing intro - “This is a test of the wireframes, not of you”, etc.) and was asked the foundation questions. At the start of the call I sent them the Invision prototype link, and once the intros were out of the way we asked them to open the prototype and share their screen - the closest we could get to sitting next to them while they browse.

The good thing about screen sharing in Zoom was that I could keep Speaker View enabled and see the participant while they shared their screen. While this wasn’t the same as reading body language in person, it did at least let me see them as they navigated through the prototype, and pick up on any confusion or pauses.

As with a typical user test, there were two of us on the call - I facilitated the test while our client observed and took notes. We also recorded the calls for future reference.

Biggest challenges

  • The obvious technical considerations - reliance on a stable internet connection, sound delays, etc.
  • Not being able to read body cues as easily over a call
  • Planning the path the user will take through the prototype, as you have less ability to guide them when moving from one task to another

My key learnings are to be as prepared as possible and to have backups: if video doesn’t work, can the test run with audio only? If the user cannot share their screen, could I share mine and let them direct me? And then, based on these options, how to treat the findings - for example, feedback from a user who drives the prototype themselves versus a user who comments while the tester shares their screen.

Would we do it again?

Absolutely! While some aspects of in-person user testing are hard to replicate, there is certainly a place for continuing moderated user testing remotely. User testing is such an important part of the process, and the ability to widen the pool of users you can test with is a major benefit.

