
Your New Web App Looks Great, But Is It Usable? (Opinion)

Conducting user research on online experiences can help government build public trust and increase perceptions of transparency.

State and local governments rely on technology to educate and empower constituents. Online experiences (web- and app-based) that are quick to access and easy to learn increase public trust and perceived transparency. Government agencies that shape their online experiences to the needs and expectations of their users stand to gain the support and confidence of their constituents. User research provides insight into where these experiences need improvement.

Members of the public utilize government websites and apps to find information on services, as well as to complete tasks such as searching and applying for jobs, paying taxes, and requesting permits for various activities. Government agencies that do not address the usability of their online experiences stand to frustrate community members and cause more work for staff in the form of customer service calls and visits. 

Methodology

All agencies in charge of designing or implementing online experiences should have regular access to a researcher or team of researchers. The toolbox of user research methods is vast, and each method is suited to answering a specific kind of question. It is not enough for staff to rely on intuition when designing online experiences or conducting research. Trained researchers need to design and carry out studies, as well as analyze the data and interpret what the results mean.
 
Two common methods employed by user researchers are contextual inquiry and usability testing.
 
Contextual inquiry involves observing and interviewing users in the setting where they would typically access an online experience. It is useful for answering questions about how, when and why users interact with an online experience, as well as what a day in their lives looks like. Entire books exist on conducting contextual inquiry; it takes a trained researcher to design and carry out a useful study.
 
Let’s say your agency runs an online job placement or benefits application website. You know that many users utilize libraries, classrooms and other public spaces to access the Internet on a computer. You conduct a study with users at a local library. Your key findings highlight how uncomfortable it can be to enter personal data in a public setting, and that many users do not remember to log out of an application after they complete it.
 
Your researcher analyzes this data and notes the lack of security measures in your current online application. Fortunately, this is something you can address directly in your design. You can display messaging that reminds users to log out after completing their application. You can design the system to log a user out after five minutes of inactivity. You can also remove fields asking for personal information such as Social Security or driver's license numbers, since that information can be collected at another time.
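As a rough illustration, here is a minimal browser-side sketch of the five-minute idle logout in TypeScript. The endSession() helper and the "/logout" path are hypothetical placeholders; a real implementation would also need to invalidate the session on the server.

```typescript
// Minimal sketch of a browser-side idle timeout (assumed five-minute window).
const IDLE_LIMIT_MS = 5 * 60 * 1000;
let idleTimer: number | undefined;

// Hypothetical cleanup: clear local state and send the user to a sign-out page.
// A real system must also end the session on the server.
function endSession(): void {
  sessionStorage.clear();
  window.location.assign("/logout");
}

// Restart the countdown whenever the user does something.
function resetIdleTimer(): void {
  if (idleTimer !== undefined) {
    window.clearTimeout(idleTimer);
  }
  idleTimer = window.setTimeout(endSession, IDLE_LIMIT_MS);
}

// Treat clicks, key presses, mouse movement and touches as activity.
["click", "keydown", "mousemove", "touchstart"].forEach((eventName) =>
  document.addEventListener(eventName, resetIdleTimer, { passive: true })
);

resetIdleTimer();
```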
 
Each of these adjustments comes in direct response to real-life user observations.
 
Usability testing involves asking representative users to complete specific tasks. It can reveal areas of difficulty within a task, how long users can be expected to take to complete it, unclear language in instructions, and how difficult a new service is for users to learn overall.
 
Usability testing can begin as soon as designers have created wireframes, which researchers use to test designs before color is added or branding is finalized. This lets users focus on the details of completing the task, not on whether they think the design is pretty.
 
Let’s say your city is designing a new workflow for making property tax payments online. Usability testing reveals that a number of users have difficulty understanding some of the field labels and what information they need to enter. Additionally, some of the users tested say they want to return to the account overview page once they make a payment. The current workflow returns them to the homepage of your city’s website.
 
Your researcher analyzes the data and recommends the following: 
• provide discreet tool tips next to field labels explaining what information needs to be entered
• re-label some fields to be consistent with user expectations
• offer multiple paths directly from the last screen of the payment workflow (e.g., buttons to return to “account overview,” “City homepage,” etc.) rather than forcing users back to a specific screen; a rough sketch of this follows the list
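As an illustration of that last recommendation, the TypeScript sketch below renders a payment-confirmation screen offering several exit paths instead of a single forced redirect. The labels, URLs and renderExitPaths() function are hypothetical placeholders, not an actual city site structure.

```typescript
// Sketch of a confirmation screen that offers several exit paths instead of
// forcing users back to one fixed page. Labels and URLs are placeholders.
interface ExitPath {
  label: string;
  url: string;
}

const exitPaths: ExitPath[] = [
  { label: "Return to account overview", url: "/account" },
  { label: "Go to City homepage", url: "/" },
  { label: "View payment history", url: "/account/payments" },
];

// Render one link per exit path inside the given container element.
function renderExitPaths(container: HTMLElement): void {
  for (const { label, url } of exitPaths) {
    const link = document.createElement("a");
    link.textContent = label;
    link.href = url;
    link.className = "confirmation-link";
    container.appendChild(link);
  }
}

// Example usage, assuming the confirmation page has this container:
// renderExitPaths(document.getElementById("confirmation-links")!);
```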
 
These examples highlight the potential value of engaging in user research. Many additional methods can inform design, reveal usability issues and validate your assumptions. 

Look Within

Government budgets are always tight, and leaders should ask whether user research is worth the cost. There are substantial upfront costs. For example, if an agency typically designs products without input from the public, a shift in strategic thinking is required. This starts with top-level executives and management staff establishing policy to support a climate where user research thrives.
 
There are also the financial resources spent either hiring researchers on staff or outsourcing research needs by project, and budgets will need to factor in these costs. I argue these costs are easily recouped. According to Usability.gov, 15 percent of IT projects are abandoned, and programmers spend 50 percent of their time correcting mistakes. Conducting user research reduces these abandonments and mistakes.
 
User research just makes sense. The return on investment includes both financial and political capital.
 
Victor Yocco is a researcher and strategist at Intuitive Company, a research, design and development firm in Philadelphia. He received his Ph.D. from The Ohio State University, and writes frequently about digital strategy, leadership and the application of theory to digital design.