How to Evaluate the Litigant Experience as Courts Turn to Online Dispute Resolution

An expert discusses ways in which leaders can test the usability of their systems


State courts are quickly adopting online dispute resolution (ODR) as a platform that can help parties resolve everything from traffic disputes to small claims lawsuits—without ever setting foot in a courtroom. But in the still-early stages of adapting these tools to management of legal issues, court leaders have much to learn from evaluating people’s experiences with the systems.

Stacy Butler directs the Innovation for Justice Program at the University of Arizona, and her research focuses on how to design technologies that work well for people, including those seeking to take advantage of ODR. Butler has two decades of experience in community advocacy and efforts to expand the reach of civil legal services. Earlier in her career, she worked for the U.S. District Court for the District of Arizona and as an adjunct professor at the University of Arizona’s James E. Rogers College of Law.  

Butler recently discussed the role of user feedback in court ODR implementation. The interview has been edited for clarity and length.

Q: What are user experience and usability testing, and why are they important for implementing ODR in the courts?

A: UX, or user experience, is the catch-all term for every aspect of an individual’s interactions with an organization, service, or product. The burgeoning field of UX design aims to improve the interface between people and products or services by putting those people’s needs at the center of the development process, a concept referred to as human-centered design. One way to do this is usability testing, a method of evaluating a product or service by observing typical users as they attempt to complete real tasks. Doing so often uncovers problems and opportunities for improvement. In the context of ODR, usability testing helps to assess whether users can navigate an ODR platform as intended. It also helps to identify tasks or processes where users drop off or cannot complete a necessary step, and to recommend changes that improve ease of use.

Q: Who conducts usability studies, and how are they done? 

A: Anyone willing to learn can conduct a usability study. There are plenty of resources available. A good place to start is usability.gov, a federal website that provides basic information and resources on the topic. Common techniques for existing platforms include observation-based usability testing, which can be done online or in person, in a lab, or in the field. Regardless of the setup, representative users attempt a series of realistic tasks they might actually perform, while testers observe where they succeed and where they struggle.

Q: What can courts learn from testing the usability of their ODR platforms, and how can this information be used to make improvements?

A: Usability testing gives courts the opportunity to “experience” their platforms from the perspective of the intended user. Stakeholders often gain unexpected operational knowledge, such as whether users can navigate important webpages, locate instructions when needed, complete the registration process successfully and quickly, upload documents, and review and approve filings. Courts can also learn whether platform design affects a user’s legal decision-making—for example, whether users are more likely to assert defenses or advocate for themselves—based on how information and instructions are provided.

What courts do with usability testing results depends on what they test for and what they learn. One might discover that the users of an ODR platform can’t find the help button on the landing page or that the registration process takes too long, leading users to drop off before completing the process. UX results can also help courts identify the “pain points” in the platform and provide targeted insights into where the ODR process isn’t working as intended. Those lessons can, in turn, form the basis of new designs and improved functioning. 

Q: What steps can courts take to test the usability of their ODR platforms, who should be involved, and how much should it cost?

A: Ideally, courts should be thinking about usability during the initial platform design and invite a wide range of users to weigh in from the beginning. Usability testing can be conducted by court staff willing to learn and apply UX methods, or it can be outsourced to a research institution. The court also could retain a private consulting company. Costs will vary depending on what the court wants to know, how many people the court wants to participate in the testing, and the methods applied.

Q: You and your team at the Innovation for Justice lab recently published a report laying out the results of a usability study of Utah’s ODR platform. Can you share the most important takeaways?

A: We are grateful for the opportunity to work with the Utah state courts on the first UX study of an ODR platform in the U.S. One great takeaway was that the public is eager for this court technology: Nearly every participant said they would prefer to use a website over physically going to a courthouse to resolve a dispute if they had the option. The challenge—and opportunity—is to integrate human-centered design into the expansion of ODR so the technologies employed are useful and intuitive for all users.

Our UX testing revealed opportunities for improvement in the Utah ODR platform. Participants experienced significant difficulty making the transition from paper affidavits and summons to the ODR website. That’s an important discovery, because right now 64% of Utah small claims defendants never log in to the ODR platform. In addition, users experienced errors during registration, struggled to upload and view documents, and expressed concerns about the lack of access to legal information within the platform. We know that Utah is committed to continuous improvement of its platform, and the goal of our evaluation is to provide actionable recommendations for that work.

Q: Without a full-scale usability study, what strategies can courts employ to better understand how people experience their ODR platforms?

A: Anytime a court can solicit feedback from its users about newly adopted technology, there will be much to learn from the responses. First, courts need to know the basic demographics of their intended ODR audience. Even locating six to eight people who represent the ideal user and watching them try to use the platform can be instructive. In fact, studies have shown that usability tests with just five participants can uncover 85% of usability issues. Courts can also embed surveys in their ODR platforms to capture direct, personalized feedback from users about their experience.