The two-year pilot project, if successful, could lead to products that will allow scientists to better understand data emanating from oceanic sensors so they can explore such issues as data discrepancies.
(TNS) -- Out at sea, thousands of sensors are constantly collecting data and reporting the information back to scientists, environmental managers and industry professionals on land.
A project in the works aims to improve the sensors’ performance — and perhaps the data they gather, leading to advances in things like weather forecasting.
The sensors measure temperature, the water’s turbidity, salinity, air pressure, and other atmospheric and oceanic data that then fuel water and boating forecasts, aiding in ship navigation and helping research topics like climate change. The hitch, according to Felimon Gayanilo, a researcher at the Gulf of Mexico Coastal Ocean Observing System in Texas, is that scientists often know little about the sensors generating the data.
“We have very little or no idea about that sensor,” Gayanilo said. “Who manufactured it? When was it last validated? How was it validated?”
These questions introduce a margin of error into the research, as they create uncertainty about the accuracy of the generated data. To solve this, Gayanilo, in partnership with several other scientists from around the country, was recently awarded a $677,919 grant from the National Science Foundation (NSF) to create a database of information about the sensors.
“This pilot project, if successful, could lead to products that will allow scientists to better understand data emanating from these sensors so they can explore issues like data discrepancies or how current observations can be used in conjunction with historical records to conclude a statistical trend,” he said. “It should also help scientists, among others, figure out what could be causing differences in reports coming from neighboring sensors.”
In other words, he said, it will allow them to compare apples to apples.
During the two-year project, the team of scientists will create a way for people to access information about the sensors, known as metadata. The new system would be easy to search and designed to merge with existing systems used by both the public and private sectors.
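To make the idea concrete: a searchable sensor-metadata record of the kind Gayanilo describes — capturing who made a sensor, when and how it was last validated, and what it measures — could be sketched as follows. This is purely an illustration with hypothetical field names, not the project’s actual schema:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical metadata record for an oceanic sensor; the field names
# are illustrative, not the project's actual schema.
@dataclass
class SensorMetadata:
    sensor_id: str
    manufacturer: str
    variables: list          # e.g. ["temperature", "salinity"]
    last_validated: date
    validation_method: str

# A tiny in-memory "database" of two example records.
records = [
    SensorMetadata("buoy-001", "AcmeOcean", ["temperature", "salinity"],
                   date(2015, 3, 1), "lab calibration"),
    SensorMetadata("buoy-002", "SeaSense", ["turbidity", "air_pressure"],
                   date(2014, 11, 15), "field comparison"),
]

def find_by_variable(db, variable):
    """Return all sensor records that report the given variable."""
    return [r for r in db if variable in r.variables]

matches = find_by_variable(records, "salinity")
print([r.sensor_id for r in matches])  # ['buoy-001']
```

With records like these attached to each data stream, a researcher comparing readings from two neighboring sensors could check at a glance whether the instruments were validated the same way — the "apples to apples" comparison described above.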
Additionally, Gayanilo and the team are hoping to work with private-sector companies that manufacture sensors to develop a better product.
“We want to come up with a way to improve the sensors ... to harvest some of the things we really need for science,” Gayanilo said.
If all goes well, he said, the protocols from the pilot project could set a worldwide standard, yielding better-quality sensors and better metadata about them. The research is focused on oceanic sensors, but the protocols eventually could be applied on land as well, Gayanilo said.
Scientists at the Woods Hole Oceanographic Institution; the University of California, Santa Barbara; the Monterey Bay Aquarium Research Institute; and Botts Innovative Research Inc. are all working on the research.
The pilot is one part of a larger initiative by the NSF called EarthCube, a push to create better digital support for geoscientists so that data can be assessed and shared more easily. The initiative is expected to last through 2022.
©2015 The News Herald (Panama City, Fla.) Distributed by Tribune Content Agency, LLC.