Researchers: Data Privacy in Health Care Needs Boost

Health-care professionals and lawmakers should be more concerned about patients’ data outside the clinical setting, said Jessica Golbus, a medicine fellow at the Michigan Medicine Frankel Cardiovascular Center.

(TNS) — A user may not think twice about sharing exercise stats from an app with their trainer, but several University of Michigan researchers warn that health privacy is increasingly at risk as technology advances.

Health-care professionals and lawmakers should be more concerned than ever about the uses of patients’ data outside the clinical setting, said Jessica Golbus, a cardiovascular medicine fellow at the Michigan Medicine Frankel Cardiovascular Center.

Golbus and two colleagues authored a perspective, “Privacy Gaps for Digital Cardiology Data: Big Problems With Big Data,” which will publish Tuesday in the medical journal “Circulation.”

In it, Golbus and her colleagues argue that physicians have a greater responsibility to educate patients about privacy when recommending technology, and that legislation has a greater role to play in protecting how data is used.

The article grew out of conversations about digital interventions in health care. As more research began to document commercial interest in health-generated data, the team wondered about the implications. A 2019 study cited in the perspective found that 19 of 24 prominent, medically related apps shared user data with 55 unique entities.

Golbus said part of the issue lies in the narrow scope of the Health Insurance Portability and Accountability Act, or HIPAA. The U.S. health-data privacy law regulates data shared between health-care professionals and patients, but not de-identified data.

Data without a name can still create a picture of someone’s health, Golbus said.

“For example, somebody that shared a story on a website about depression -- it may be clear that person has depression,” she said. “But you could also gain information from people searching for light therapy lamps, and then if that person likes posts on a website from other patients with depression, these online sources may be able to combine or triangulate this information to make inferences about health, in this case perhaps depression.”

Even voluntarily shared data can face similar privacy concerns. If a patient uses a Bluetooth-connected blood pressure cuff and an app to share the data, such as their heart rate or blood pressure, with their physician, that information is protected. However, if that patient shares the same data with their fitness trainer or nutritionist, it is unprotected, and it can be combined with other analytics to piece together more information about a person.

Even actions outside of health apps can be tied back to health data. Processes like geofencing, or using smartphone-based location to trigger a digital response, could indicate how often a user visits a fast-food restaurant. That data can then be used to make predictions.
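As a rough illustration of how simple that kind of location-based inference can be, the sketch below shows a minimal geofence check in Python. The coordinates, radius and location log are hypothetical, not drawn from the article or any real app; it only counts how many phone location samples fall within a set distance of a given point.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# Hypothetical geofence: a fast-food restaurant and a 75-meter radius.
FENCE_LAT, FENCE_LON, FENCE_RADIUS_M = 42.2808, -83.7430, 75

# Hypothetical location log: (latitude, longitude) samples from a phone.
location_log = [
    (42.2807, -83.7431),  # inside the fence
    (42.2900, -83.7500),  # elsewhere in town
    (42.2809, -83.7429),  # inside the fence again
]

# Count samples that fall inside the geofence.
visits = sum(
    1 for lat, lon in location_log
    if haversine_m(lat, lon, FENCE_LAT, FENCE_LON) <= FENCE_RADIUS_M
)
print(f"Location samples inside the geofence: {visits}")
```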

Third parties may then use that information for sales or loan eligibility, or to create a health score that informs decisions about life insurance or employment, she said. Not all such health correlations are covered by non-discrimination laws.

But that doesn’t mean doctors should shy away from mobile apps and other data-centered patient products, Golbus said.

“We’re really excited about the types of things that we can do with this data to help our patients and deliver targeted interventions,” she said. “We really are of the belief that all of this data we’re generating is ultimately for good and to help our patients. We also want to make sure we’re thoughtful while we’re doing this and that our patients are educated.”

The team argues that once clinicians educate themselves on the topic, they will be able to have frank conversations with patients about data privacy. It’s too big of a burden to expect users to read lengthy, often technical privacy policies, Golbus said.

The authors also argued that broader privacy laws would serve patients better in the long run. They pointed to the European Union’s General Data Protection Regulation, adopted in 2016, as an example. That law requires data processors and controllers to provide users with their own data, clearly disclose data collection, set high-privacy defaults and more.

“Although we do not regard the General Data Protection Regulation as a panacea, the United States needs law that governs data, not just a limited number of covered entities that hold it, and embraces a broad and inclusive definition of health-related data, without the artificial distinctions inherent in HIPAA,” the article states.

©2020 MLive.com, Walker, Mich. Distributed by Tribune Content Agency, LLC.