Many smartphone users take photos with their devices, upload the images to the Web and share memories with friends and family in moments. It’s fast, easy and virtually instantaneous. But beyond sharing with loved ones, what else can phone cameras do? Can they help individuals relay public health information, and can that information in turn benefit everyone?

While it may sound a bit idealistic, researchers at the University of Southern California are hoping to take advantage of the growing number of everyday shutterbugs and tap into the power of crowdsourcing, while attaining crucial air quality data for one of the most polluted cities in the country. On Monday, Sept. 20, the USC Viterbi School of Engineering launched Visibility, an Android app that allows users to help combat pollution.

“Our goal is to develop an air visibility sensing system that uses off-the-shelf sensors and can be easily deployed to be used by a large number of people,” the app's three researchers wrote in a paper on the topic. “This will enable large-scale sensing of visibility and augment existing instrumentation that is precise, but expensive and sparse.”

For the smartphone user, the process is fairly simple: Open the app (which will soon be available for the iPhone) and take a picture of “all or mostly sky,” which is automatically relayed to a central computer at USC and analyzed for particulate matter, according to a USC press release. The picture – automatically tagged with location, orientation and time data – can be compared to established models of sky luminance to estimate visibility, which is directly related to the concentration of harmful “haze aerosols” that turn a sunlit, blue sky into drab grayness.
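The luminance comparison described above can be sketched roughly as follows. This is an illustrative stand-in, not the researchers' published algorithm: the clear-sky brightness constant, the 0–255 grayscale assumption and the linear haze score are all hypothetical simplifications (a real system would predict clear-sky luminance from sun position, time and camera orientation):

```python
# Illustrative sketch (not the USC team's actual method): compare the
# measured brightness of a user-selected sky region against an idealized
# clear-sky value to derive a rough 0-to-1 haze score.

def haze_index(sky_pixels, clear_sky_luminance=200.0):
    """Return a haze score in [0, 1].

    sky_pixels: grayscale values (0-255) sampled from the sky region of
    the photo. clear_sky_luminance is a placeholder for what a sky
    luminance model would predict for this time, place and viewing
    direction on a haze-free day.
    """
    if not sky_pixels:
        raise ValueError("need at least one sky pixel")
    mean = sum(sky_pixels) / len(sky_pixels)
    # Haze aerosols scatter extra light toward the camera, raising the
    # measured luminance above the clear-sky prediction; clamp to [0, 1].
    deviation = (mean - clear_sky_luminance) / (255.0 - clear_sky_luminance)
    return max(0.0, min(1.0, deviation))
```

In this toy model, a sky that matches the clear-sky prediction scores 0, while a fully washed-out gray-white sky scores 1; the score would then map to an estimated visibility and aerosol concentration.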

Because of smartphones’ capabilities – they have rich camera sensors, GPS systems, compasses and accelerometers, along with varying communication tools – correct app use takes little effort. But users can’t just snap away and expect clean results, the researchers warned: they may be asked to select the part of the image that is sky for clarification purposes.

And the app’s results appear to be up to snuff with those of government agencies. Air quality results already gathered in Phoenix and the Los Angeles basin compare favorably to data published by the U.S. Environmental Protection Agency, the researchers noted. While this is promising, there is room for improvement – but before tweaks can be made, the app must be tested for one of its main purposes: crowdsourcing.

Gaurav Sukhatme, USC Viterbi computer science professor, noted this will take some time, as Visibility was only launched Monday. “It was launched earlier this week so it’s hard to say if it’s popular (yet),” Sukhatme wrote in an e-mail.

USC Viterbi postdoctoral associate Sameera Poduri stated in an e-mail that there have already been more than 250 app downloads in the first three days of its release, more than she expected. Poduri, along with Sukhatme, came up with the idea of crowdsourcing air quality measurements at a meeting last year; the app has been in development for more than a year.

Their group – the Robotic Embedded Systems Laboratory at USC – has for years worked on large-scale environmental sensing, studying ocean algal blooms, forest canopies and other natural events by deploying large networks of static sensors and mobile robots, Poduri wrote.

“To solve similar problems in urban settings, mobile phones seem like the ideal platform because of their sensing and computation abilities and the fact that they are already deployed in very large numbers," Poduri said. “Being in L.A., air quality is something we are always worried about, so we decided to take that on first.”

The system has the potential to fill in many air pollution mapping blanks in California and provide another layer of information atop the conventional air pollution monitors, Sukhatme noted in the release.

“While monitoring air visibility is important for our health as well as the environment, current monitoring stations are very sparsely deployed,” the paper states. The researchers went on to note that visibility is typically measured using human observers, optical instruments or chemical sensors, and while the human method invariably suffers due to subjectivity, the more precise optical and chemical measurements are expensive and require routine maintenance.

While this is a new system and challenges are a certainty, Poduri said the research team expects them, and solutions are already being hashed out. One such challenge is the amount of data USC will receive from “a wide variety of phones, each with its different camera hardware,” which will make image processing laborious.

Also, the team wants to ensure the app’s design is user-friendly and “simple and engaging enough that users feel interested to gather data,” Poduri stated. “This is something we hope to understand from the response to our current app.”

And as with any crowdsourcing app, privacy is a huge concern for users. While the app’s functionality depends on critical factors like location and time, the researchers found a way to keep identifying information from leaving the phone while the readings USC needs still come through, Poduri stated.

Karen Wilkinson  |  Staff Writer