Successful Government Leaders Adopt Design-Focused, User-Centric Way of Working

The simple practice of observing how a user actually works during an average day can sometimes yield more useful information than hours of interviews.

Editor's note: The following is an excerpt from William Eggers' new book, Delivering on Digital: The Innovators and Technologies That Are Transforming Government, released June 7, 2016.

In business and increasingly in government, there’s a growing realization of the importance of design in everything from customer experiences to societal problem solving.

“Design is an approach to problem solving,” says Hillary Hartley, deputy executive director at 18F. “It’s how you think about something. It’s not the typefaces; it’s not the pixels. Design is what makes a product successful. It’s the thing that makes it useful, that makes it understandable.”

Since its inception, 18F has emphasized and adopted a design-focused and user-centric way of working. They’ve incorporated a technique called protosketching: Designers and developers build a rough prototype in three hours or less by sketching in code as well as on paper. Even if the protosketch is imperfect or outright unusable, it gives teams and clients something concrete to examine and elevates the discussion to issues of data, design and functionality.

The United Kingdom’s Government Digital Service mirrors 18F’s approach, articulating its vision through 10 concise design principles based on actual user needs:

  •   Start with needs (user needs, not government needs).
  •   Do less.
  •   Design with data.
  •   Do the hard work to make it simple.
  •   Iterate. Then iterate again.
  •   This is for everyone.
  •   Understand context.
  •   Build digital services, not websites.
  •   Be consistent, not uniform.
  •   Make things open. It makes things better.

Today, this is how the United Kingdom undertakes all of its digital projects. But that wasn’t always the case. Kathy Settle recalls that during the Gov.uk transition, as they were moving 312 websites onto the Gov.uk domain, they initially relied on department personnel to explain users’ needs.

Unfortunately, “quite often the organizations weren’t in direct contact with their users, and their articulation of user needs reflected that,” Settle explains. “Things were built on the basis of what people thought user needs were. Some of it turned out to be wrong.”

Interaction with end users adds a depth and authenticity that’s essential to good design. The simple practice of observing how a user actually works during an average day can sometimes yield more useful information than hours of interviews. The idea is to walk a mile in the user’s shoes — or get as close to their experience as possible.

Walk In Their Shoes

So how do you understand train travel? You ride the train.

When Amtrak decided to redesign its customer experience — then including three online portals, each with a distinct audience — it made sure the redesign focused entirely on the user. This involved a brisk and geographically expansive research project.

In the course of a few weeks, teams of “user experience” researchers rode Amtrak trains across the United States, interviewing passengers and staff, and visiting stations along the way, covering the Northeast Corridor, the entire Pacific coast, and the South from New Orleans to Houston. Two weeks and more than 100 interviews later, the team had uncovered valuable insights, witnessing problems firsthand.

“You hear that certain areas are broken. … [W]hen they say something is working well, it’s because they have workarounds that are completely counterintuitive to someone who’s actually designed a digital platform,” says Mark Waks, one of the project researchers. “But you need to see that to know what it is that’s actually broken.”

For example, customers contacting Amtrak call centers could get a quick cost estimate for travel on any route. The researchers learned, however, that call center workers were jotting down customer requests on paper and generating estimates with a calculator. Behind-the-scenes information such as this can direct you to areas in dire need of transformation.

In short, user research reveals what structured interviews may not: that a seemingly one-dimensional problem can have hidden layers.

User validation is also critical in designing a better experience, which is why agile design is so helpful. Testing prototypes with actual users at every step helps gauge how well a problem has been addressed. “In products where we employ digital, we bring in users every step of the way as much as we possibly can,” Waks says. “Otherwise, you’re basically just judging it on your own personal perceptions, not the users’.”

The design stage is in many ways a blank canvas on which organizations can define their ambitions based on their resources and goals. Some will take a more expansive approach, completely rethinking how services are delivered, while others will pursue smaller projects, such as allowing citizens to upload forms and documents via mobile apps. In either case, the key is to understand users by studying their behavior, and to design approaches to reduce or eliminate the pain points you’ve observed.

Eliminating the Pain Points

Let’s start with a relatively simple example. Determining eligibility and applying for government benefits can be time-consuming and frustrating. It typically involves finding and scanning payroll forms and birth certificates and carrying or faxing them to multiple offices. If you’ve ever applied for a home mortgage, you have a sense of what this can entail.

In 2011, the Texas Health and Human Services Commission (HHSC) simplified this process by installing a statewide integrated system that aggregates eligibility for various federal and state programs by using an integrated rules engine. This allows a single mother, for example, to apply for multiple benefits with one application. The system rules assess the programs for which she may qualify based on her income, household size and other factors.
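The core idea of an integrated rules engine, running one application against every program's eligibility criteria at once, can be sketched in a few lines. Note that the program names, thresholds, and fields below are invented for illustration; they are not HHSC's actual programs or rules.

```python
# Minimal sketch of an integrated eligibility rules engine.
# Program names and income thresholds are hypothetical, not HHSC's real criteria.
from dataclasses import dataclass


@dataclass
class Application:
    """A single application capturing the factors the rules assess."""
    monthly_income: float
    household_size: int


# Each entry maps a program name to a predicate over the application,
# so adding a program means adding one rule, not a new intake process.
RULES = {
    "food_assistance": lambda a: a.monthly_income < 800 * a.household_size,
    "child_healthcare": lambda a: a.household_size > 1 and a.monthly_income < 3000,
    "utility_relief": lambda a: a.monthly_income < 1200,
}


def eligible_programs(app: Application) -> list[str]:
    """Run one application through every program's rule."""
    return [name for name, rule in RULES.items() if rule(app)]


# A single mother with two children files one application and learns
# every program she may qualify for.
app = Application(monthly_income=1000, household_size=3)
print(eligible_programs(app))
# → ['food_assistance', 'child_healthcare', 'utility_relief']
```

In practice such engines carry many more factors and rules maintained by policy staff, but the design choice is the same: one shared application record evaluated against all programs, rather than a separate application per program.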

With more people from all income levels becoming comfortable with mobile transactions, HHSC began exploring how it might bring its integrated eligibility functions to the mobile arena. Older models of IT development might have entailed a multiyear, multimillion-dollar effort to build a mobile-friendly version of the service.

Instead, HHSC focused on the user perspective, asking what would eliminate a pain point for users but also benefit the agency. Unsurprisingly, one problem applicants often cited was the need to submit verification documents. While they could do so by mail, fax or the Web, many applicants didn’t have easy access to scanners or personal computers. Perhaps surprisingly, however, many do have access to smartphones with cameras.

Since banks have long allowed their customers to deposit checks by taking a picture of the paper check with their phone and hitting the upload button on an app, why couldn’t applicants do the same thing? HHSC’s team knew it was technically possible. But would applicants actually use such an app? And what kind of experience would attract the most participation?

These aren’t the kinds of questions we can answer in advance. They require time in the field, talking with real users. In HHSC’s case, it meant spending time in the service centers that many applicants visit when applying for benefits.

There, HHSC’s designers learned a lot about the people they hoped would use the app. Most benefit applicants did indeed have smartphones, but their devices were often a generation or two behind and thus lacked advanced capabilities. This turned out to be a critical piece of information in designing the app. Most applicants were intimately familiar with their phones’ capabilities because the phone was their primary means of connecting to the Web. “Many users were used to conducting their business on mobile devices instead of personal computers, making them sophisticated users,” explains Stephanie Muth, the deputy executive commissioner at HHSC who spearheaded the project.

The team introduced its first set of wireframes to users just two weeks after the project began. Demand was high; applicants quickly snapped up the software. The team gathered valuable information from early users that helped it refine the user experience and design features to be included in later builds.

In a few short months, the agency released the app in the iTunes and Google Play stores. Almost immediately, the mobile versions took off; within a month, mobile document uploads surpassed those from desktops. A few months later, the app had been downloaded 300,000 times. And HHSC released five versions in the first year alone, each with additional functions and a better user experience.

Several lessons emerge from this experience. First, don’t make assumptions about how those who are less fortunate use — or don’t use — technology. Instead, test your assumptions with real users and continuously ask them for feedback. Second, start small and get a minimum viable product to users as quickly as possible.

If Texas’ HHSC project occupies the easy end of the complexity spectrum, Amtrak lies at the other. HHSC wasn’t imagining entirely new benefits or creating a program from scratch; it simply wanted a more efficient way to enable self-service.

Amtrak’s attempt to reinvent its services with an entirely new brand and customer experience is far more ambitious. What if customers could access Amtrak with desktop, tablet and mobile devices? What if they could use an app to request an Uber ride, reserve a table or seat, order and pay for meals, purchase amenities, and even receive alerts for sights to see along the journey? Amtrak found answers and added them to its redesign.

“We’re selling a journey, an experience and not just a ticket on a train,” says Deborah Stone-Wulf, VP of sales distribution and customer service at Amtrak. “It has taken us a bit of time to understand that, but we do understand it, and that’s where we’re headed.”

In this sense, Amtrak’s transformation relative to HHSC’s is akin to comparing telescopes and microscopes: The components and processes may be the same, but differences in scale and objectives change everything.