Remote User Testing in the Age of COVID-19

RW Patel
3 min read · Jul 26, 2020

Product engineering doesn’t slow during a pandemic. In fact, connected systems face increased usage and security demands.

Remote user testing has stepped up as a decision tool for product teams that continuously monitor production systems. Pairing user testing with continuous monitoring helps teams understand how changes affect both the performance of online systems and the intended user experience.

To run remote user testing effectively, the testing framework is something product teams need to get right. While statistical analysis alone can support inferences about a product, a user testing framework helps teams understand the impact of a given variation on user behavior. Remote user testing aims to reconcile real-time statistics with the data generated by continuous monitoring.

Case in Point: User Tests in Online Learning

Company: A leading American engineering university provides lecture- and lab-based education in a traditional campus setting. Its UX research team studied online lectures against a control group receiving traditional classroom instruction, alongside traffic data from the learning management system (LMS).

Solution: Freed from the constraints of semester-based study, the user tests collected feedback on student engagement with the LMS's interactive features in the online classes. Qualitative surveys were conducted periodically with both student groups to understand why students did or didn't engage with those features and reach the intended learning outcomes. Results from both offerings were then compared to account for the differing needs of the two student populations.

Business Benefits: Through remote testing, the university was able to prioritize LMS features so that online learning outcomes matched those of in-person classes. Student needs were defined for each environment, and ongoing studies were planned to ensure outcomes remain consistent.

Summary: Remote user testing makes it possible to analyze user outcomes, explore data with users pre-release, and train accurate machine learning models for production systems.

How remote user testing has changed during COVID

In traditional scenarios, test data is collected at a single point in time. A teacher evaluates the difference in student grades after a class, and a researcher might determine whether the learning platform was effective only once the full course of instruction has been administered. In that case, the statistical significance of changes to the LMS or curriculum isn't realized until the course ends.

Yet production platforms don't operate on a fixed horizon, and fixed-horizon significance testing is hard to apply to a single product release. Instead, results from remote user tests are calculated and updated live as data is collected, causing statistical significance to change frequently and vary widely as more visitors participate in the studies.
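This fluctuation is the classic "peeking" problem. As an illustrative sketch (a hypothetical simulation, assuming a simple two-proportion z-test rather than any particular testing tool), repeatedly checking a test for significance while it runs inflates the false-positive rate well beyond the nominal 5%, even when both variants are identical:

```python
import math
import random

def z_stat(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic with pooled variance."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return 0.0 if se == 0 else (p_a - p_b) / se

def simulate(n_sims=400, n_per_arm=1000, look_every=100, p=0.10, z_crit=1.96):
    """Run A/A tests (no real difference) and compare two error rates:
    checking significance only at the end vs. at every interim look."""
    random.seed(42)
    false_final = false_peek = 0
    for _ in range(n_sims):
        conv_a = conv_b = 0
        peeked = False
        for i in range(1, n_per_arm + 1):
            conv_a += random.random() < p  # visitor converts in arm A
            conv_b += random.random() < p  # visitor converts in arm B
            if i % look_every == 0 and abs(z_stat(conv_a, i, conv_b, i)) > z_crit:
                peeked = True  # an interim look would have declared a "winner"
        false_peek += peeked
        false_final += abs(z_stat(conv_a, n_per_arm, conv_b, n_per_arm)) > z_crit
    return false_final / n_sims, false_peek / n_sims

final_rate, peek_rate = simulate()
print(f"final-look false positive rate: {final_rate:.1%}")
print(f"any-look (peeking) false positive rate: {peek_rate:.1%}")
```

Running this shows the end-of-test error rate staying near the nominal level, while the "declare a winner at any interim look" rate is noticeably higher. That gap is why dedicated continuous-monitoring platforms use sequential statistics instead of repeated fixed-horizon tests.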

Quantifying remote user tests

If you run user tests with an in-house solution or a tool that uses traditional fixed-horizon statistics to calculate results, it's necessary to run tests over longer periods of time to avoid capturing false results. This ensures the product has adequate data to validate benchmarks for the release.

To come up with a sample size for testing, you need your product benchmarks defined. You also need the minimum effect you expect to see from your tests. These two numbers help you estimate how much user testing is needed to meet the goals you're trying to achieve with the product.
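As a sketch of that calculation (assuming a conversion-style benchmark and the common two-sided 5% significance / 80% power settings; the function name and example numbers are illustrative, not from the original), the standard two-proportion normal approximation turns those two numbers into a required sample size per variant:

```python
import math

# Critical z-values for two-sided alpha = 0.05 and power = 0.80.
Z_ALPHA = 1.96   # inverse normal CDF at 1 - 0.05/2
Z_BETA = 0.8416  # inverse normal CDF at 0.80

def sample_size_per_variant(baseline, mde, z_alpha=Z_ALPHA, z_beta=Z_BETA):
    """Approximate participants needed per variant to detect an absolute
    lift of `mde` over a `baseline` rate (two-proportion normal approximation)."""
    p1, p2 = baseline, baseline + mde
    p_bar = (p1 + p2) / 2
    n = 2 * (z_alpha + z_beta) ** 2 * p_bar * (1 - p_bar) / mde ** 2
    return math.ceil(n)

# Example: a 10% benchmark task-completion rate, and we want to
# reliably detect a 2-percentage-point improvement.
print(sample_size_per_variant(0.10, 0.02))
```

Note how the required sample grows quickly as the minimum detectable effect shrinks, which is exactly why vague goals make remote tests run far longer than expected.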

In summary, it's unknown what effect a variation or change in production might have on user behavior, so committing in advance to a hypothetical outcome for a feature isn't realistic, no matter what your continuous monitoring systems suggest.

Choosing a remote user testing cadence that fits the product release and sticking to it, however imperfect, is the best way to ensure that your results remain statistically valid over time and that you avoid the statistical errors that come with continuous monitoring platforms.


RW Patel

Helping teams find digital relevancy through user experience and data-driven insights.