Transforming Training: Austin Uses VR for EMS Responders

Austin-Travis County Emergency Medical Services in Texas has incorporated augmented and virtual reality into its training process, allowing first responders to prepare more effectively for mass-casualty events.

A paramedic inside an emergency vehicle speaking to a patient who is lying down.
Training for Austin-Travis County Emergency Medical Services (ATCEMS) responders in Texas looked very different prior to the addition of augmented and virtual reality technology in the department.

According to Commander Keith Noble of ATCEMS, traditional training involved a PowerPoint presentation and a walkthrough of the department’s ambulance bus (Ambus). Plans for a large mass-casualty training exercise had been in the works for several years, but carrying it out would have meant bringing in 600 employees on overtime pay for several hours.

Under Noble’s leadership, ATCEMS identified a better way of doing things through AR and VR, which would lower costs and allow for more versatile training, as new scenarios can be added. Moreover, AR and VR training can be repeated frequently, as opposed to the costly traditional method, which could be done once and then not again for years — if ever.

The AR/VR project was set in motion following a $5,000 grant from Austin’s Innovation Office about four years ago, said Noble. The grant led to an initial demo, a partnership with researchers at Texas State University and, ultimately, the formation of Augmented Training Systems (ATS). However, it was the US Ignite grant announced in January 2020 that allowed ATCEMS to arrive at the type of advanced training that is offered now.

ATS President Scott Smith cited a significant improvement in training speed and accuracy when using AR and VR technology. Trainees now learn the material 40 percent faster with a 35 percent reduction in errors compared to traditional training. And the option of repeating training has already proven popular: on average, trainees have used the new training technology more than 15 times per week. Smith also noted how commanders have benefited from the new dashboard, which displays when cadets access training, how long they spend on it, the number of errors and even time spent on a specific task.

“Then that data is aggregated and compared across thousands of people,” explained Smith. “So you have some sort of standardization of training, but also a firm picture of what optimal performance might look like in a peer group.”

The technology allows responders to train in several scenarios. According to Ted Lehr, IT data architect for Austin’s Communications and Technology Management Department (CTM), the augmented reality component is very beneficial for the Ambus training, as it allows individuals to familiarize themselves with the spatial layout and location of supplies in the Ambus.

The virtual reality component, on the other hand, is particularly helpful with training that requires less movement, like treating patients. Smith said ATS training features 30 avatars and 150 randomized symptoms, which creates more immersion than working with a replica with written labels describing symptoms. He also noted that VR images will only become more realistic because the technology is quickly evolving.

One unique component of this technology, as described by Smith, is the ability to introduce stress into the trainings in order to reflect actual dynamic situations. For example, noises and other stimuli in the background may increase a responder’s cognitive load — that is, the demand placed on their working memory. Adding similar stressors in training helps commanders and responders determine their limits in cognitive load.

The emotional impact of experiencing these situations, even in a VR or AR setting, can help prepare responders for difficult situations they might face in a mass-casualty situation. A significant example described by both Smith and Lehr was the best practice of resisting one’s natural instincts and temporarily ignoring a child during a mass shooter scenario in order to go find the shooter, an approach that could potentially save more children.

Smith has spent a lot of time determining how these simulations impact individuals physiologically and psychologically. He acknowledged new research on how active shooter drills can be traumatic. Lehr echoed this sentiment, stating that while the training is in many ways not a replacement for the real thing, for a very dangerous training exercise, it can serve as one.

The combination of traditional training and AR/VR training may initially be more costly, noted Lehr, but it will improve the quality of the training and, ultimately, the quality of service delivered by ATCEMS.

Implementing the new technology has come with a few major challenges. One has been getting people to buy into the benefits of using the tech, though Noble acknowledged that younger cadets have shown less resistance to it. The biggest hurdle, according to Noble, was that the people programming the system lacked medical backgrounds. Implementation required close collaboration between Noble and the team developing virtual scenarios to ensure that everything was correct.

The partnerships in the project formed naturally through ATCEMS’s existing relationships with Texas State University researchers, who brought their VR expertise to the equation. These researchers, led by Smith, formed ATS. ATS did the programming to create the mass casualty training for ATCEMS to demo, and Noble helped with providing subject matter expertise. CTM, meanwhile, provided technical services and support.

Smith described the potential for expanding the training, as ATS is developing new sessions for other partnerships and scenarios, including more active shooter trainings for various organizations. Smith also noted the possibility of utilizing the technology for use-of-force training to help individuals learn how to engage with people experiencing mental health issues. In addition, ATS plans to create hazmat and infectious disease scenarios and, potentially, wildfire response events — all situations that are hard to simulate safely without AR and VR technology.
Julia Edinger is a staff writer for Government Technology. She has a bachelor's degree in English from the University of Toledo and has since worked in publishing and media. She's currently located in Southern California.