In Case of Emergency: Humans Overtrust Robots, See Them as Authority Figure

For all the fear of robots taking over, all it takes for humans to follow a robot's authority is a hint of danger.

by Colin Wood / March 9, 2016

Technologists like Elon Musk and Bill Gates worry that robots may one day threaten to extinguish the human race, but there is another, more subtle and immediate threat.

According to a recent study, people are a bit naïve when it comes to trusting robots. Research presented March 9 by the Georgia Institute of Technology showed that during an emergency, test subjects were prone to following a robot’s instructions — even after the robot had proven itself unreliable. The research was released at the 2016 ACM/IEEE International Conference on Human-Robot Interaction in Christchurch, New Zealand.

The experiment tested the reactions of 42 volunteers, mostly college students, who were not told the nature of the research project in which they were participating. Groups of volunteers were asked to follow a shop-vac-sized robot emblazoned with the words “EMERGENCY GUIDE ROBOT.” A hidden researcher remotely controlled the robot, sometimes leading the subjects in a circle twice before reaching the conference room, sometimes leading subjects to the wrong room, or sometimes turning the robot off before they reached their destination, at which point the subjects were informed that the robot had broken down.

Once the test subjects were led to the correct conference room, they were asked to complete a survey about robots and to read a magazine article about indoor navigation technologies, which they were told they would later be quizzed on. Then, the hallway was filled with artificial smoke, which set off a smoke detector. The robot’s red LEDs lit up and its illuminated white arms pointed the test subjects toward an exit at the rear of the building or sometimes toward a darkened room blocked by furniture. The researchers were surprised by their test subjects’ consistent reactions.

“We expected that if the robot had proven itself untrustworthy in guiding them to the conference room, that people wouldn’t follow it during the simulated emergency,” said Paul Robinette, a Georgia Tech Research Institute (GTRI) research engineer. “Instead, all of the volunteers followed the robot’s instructions, no matter how well it had performed previously. We absolutely didn’t expect this.”

The only way the researchers found to stop test subjects from following the robot was to have it make errors during the evacuation itself. Even then, between 33 and 80 percent of participants continued following the robot anyway.

The popularization of robots, drones and other automated technologies paired with this finding of human “overtrust” presents new challenges to both engineers and emergency managers, the researchers wrote.

“It is reasonable to assume that a new technology is imperfect, so new life-saving (and therefore life-risking) technology should be treated with great caution,” the study reads. “ … Robots interacting with humans in dangerous situations must either work perfectly at all times and in all situations or clearly indicate when they are malfunctioning. Both options seem daunting.”

This research, funded by the Air Force Office of Scientific Research and the Linda J. and Mark C. Smith Chair in Bioengineering, is one part of a long-term study to understand how humans interact with robots and how emergency managers can overcome the technological and social barriers to maintaining public safety.

Robots have a future in emergency egress, but the concerns for emergency managers need to be considered today, Georgia Tech Fire Marshal Larry Labbe wrote in an email to Government Technology.

“I envision a robot that can help a conference attendee find the lecture room, provide direction to a coffee shop and in the case of an emergency provide directions and information,” he wrote. “ … Robotic interface with emergency evacuation and emergency response needs to be considered now. The technology exists and the application to emergency situations is only limited by our imaginations. I'm keeping my eye on it now.”

Just as fire alarm systems have been improved to overcome the hurdles of the human condition, he said, so too can technologists and emergency managers overcome the challenges presented by emergency egress robots.

“If a person has been subjected to numerous nuisance alarms, they are less likely to have confidence in the next alarm," Labbe said via email. “Whereas if an actual incident occurs and the fire alarm warns occupants and saves lives, the confidence level is high. This same technology struggle has been overcome by constantly improving the technology of fire alarm systems. The future of robot-based emergency systems will be subject to the same human response.”

The good news, said Steve Detwiler, whole community recovery planner for the Miami-Dade Office of Emergency Management, is that for every use of technology in emergency management, there’s always some human component.

“Whether we’re using our [emergency notification] systems or sending out emails, there’s always somebody on the other end that is a person that is giving out that information,” Detwiler said. “For emergency management, a lot of times technology is simply a tool that we use. There’s always that personal connection. You read about the robots taking over, especially in the industrial sector where things are expected to look completely different in 20 years. But in our profession, you really can’t automate us, because not only do we send out information during a disaster, but there’s a lot of stuff that goes on behind the scenes.”

Drones are increasingly used for aerial surveillance during disaster assessment, reaching areas inaccessible to humans and relaying information that can help keep the public safe, Detwiler said.

“For us, all these technologies really just augment what we already do and gives us more capabilities of being able to reach out to the public, because we’re becoming a more and more interconnected society,” he said. “Having those kinds of resources and using them is advantageous for us.”

Colin Wood, former staff writer

Colin wrote for Government Technology from 2010 through most of 2016.
