The project, called DECADE, was described by its developers as “a dataset of ego-centric videos from a dog’s perspective.” The dog in this case was an Alaskan Malamute by the name of Kelp M. Redmon, who was equipped with a GoPro on her head and an array of motion trackers on her body. The system was designed to take as many readings as possible without hampering the dog’s abilities or preventing her from behaving normally.

The setup captured 20 readings per second, which were then time-synchronized through a computer system. The resulting dataset was used to successfully train off-the-shelf deep learning algorithms to predict a dog's reaction in different scenarios.
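The article does not say how the synchronization was done, but one common approach is to align each sensor stream to a shared master clock by nearest-timestamp matching. A minimal sketch of that idea, with entirely hypothetical names and a made-up sensor stream, might look like this:

```python
from bisect import bisect_left

def align_to_clock(clock_ts, sensor_ts, sensor_vals):
    """For each master-clock timestamp, pick the sensor reading whose
    timestamp is nearest (simple nearest-neighbor synchronization).
    Assumes sensor_ts is sorted in ascending order."""
    aligned = []
    for t in clock_ts:
        i = bisect_left(sensor_ts, t)
        # Consider the readings just before and just after t,
        # then keep whichever is closer in time.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(sensor_ts)]
        best = min(candidates, key=lambda j: abs(sensor_ts[j] - t))
        aligned.append(sensor_vals[best])
    return aligned

# A 20 Hz master clock covers one second in 50 ms steps.
clock = [k * 0.05 for k in range(20)]

# A hypothetical motion-tracker stream sampled slightly off-clock.
imu_ts = [k * 0.048 for k in range(21)]
imu_vals = list(range(21))

aligned = align_to_clock(clock, imu_ts, imu_vals)
```

Running each stream through a function like this yields one reading per sensor for every tick of the 20 Hz clock, which is the form a training pipeline would typically expect.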

“The near-term application would be to model the behavior of the dog and try to make an actual robot dog using this data,” said Kiana Ehsani, a computer science Ph.D. student at the University of Washington, Seattle. The study was conducted by Ehsani and others at the University of Washington along with the Allen Institute for AI. A paper detailing the research, which received funds from the Office of Naval Research and the U.S. National Science Foundation, is available online.