Study: Big Data Saves Lives, And Patient Safeguards Are Needed

The use of big data to address the opioid epidemic in Massachusetts raises ethical concerns that could undermine its benefits unless clear governance guidelines protect and respect patients and society, a University of Massachusetts Amherst study concludes.

In research published in the open-access journal BMC Medical Ethics, Elizabeth Evans, associate professor in the School of Public Health and Health Sciences, sought to identify concerns and develop recommendations for the ethical handling of opioid use disorder (OUD) information stored in the Public Health Data Warehouse (PHD).

“Efforts informed by big data are saving lives, yielding significant benefits,” the paper states. “Uses of big data may also undermine public trust in government and cause other unintended harms.”

Maintained by the Massachusetts Department of Public Health, the PHD was established in 2015 as an unprecedented public health monitoring and research tool that links state government data sets to provide timely information for addressing health priorities, analyzing trends and informing public policies. Its initial focus was the devastating opioid crisis.

“It’s an amazing resource for research and public health planning,” Evans says, “but with a lot of information being linked on about 98% of the population of Massachusetts, I realized that it could cause some ethical issues that have not really been considered.”
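The article does not describe the PHD's actual record-linkage mechanics, but as a purely illustrative sketch, one common way to link records across agencies while withholding direct identifiers is to join on keyed (salted) hashes of identifying fields. Everything below (the field names, the secret key, and the mock data sets) is hypothetical and not drawn from the study.

```python
# Illustrative sketch only: NOT the PHD's actual linkage method.
# Direct identifiers are replaced with keyed hashes before joining,
# so records can be linked for analysis without exposing names.

import hashlib
import hmac

# Secret key held by the data custodian (hypothetical).
LINKAGE_KEY = b"example-secret-key"

def pseudonym(name: str, dob: str) -> str:
    """Derive a stable pseudonymous ID from identifying fields."""
    message = f"{name.lower()}|{dob}".encode("utf-8")
    return hmac.new(LINKAGE_KEY, message, hashlib.sha256).hexdigest()

# Two mock "agency" data sets with an overlapping individual.
treatment_records = [
    {"name": "Alice Smith", "dob": "1980-01-02", "treatment_episodes": 3},
    {"name": "Bob Jones", "dob": "1975-06-15", "treatment_episodes": 1},
]
prescription_records = [
    {"name": "Alice Smith", "dob": "1980-01-02", "opioid_scripts": 5},
]

# Replace identifiers with pseudonyms, then link on the pseudonym.
linked = {}
for rec in treatment_records:
    pid = pseudonym(rec["name"], rec["dob"])
    linked[pid] = {"treatment_episodes": rec["treatment_episodes"]}
for rec in prescription_records:
    pid = pseudonym(rec["name"], rec["dob"])
    linked.setdefault(pid, {})["opioid_scripts"] = rec["opioid_scripts"]

print(linked)  # de-identified yet linkable records
```

In a scheme along these lines, only the custodian holding the key can generate the pseudonyms, which is roughly the kind of safeguard the study's participants weighed against the residual risks they described.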

In 2019, Evans and a team of her students and staff conducted interviews and focus groups with 39 big data stakeholders, including gatekeepers, researchers and patient advocates who were familiar with or interested in the PHD. They discussed the potential misuses of big data on opioids and how to create safeguards to ensure its ethical use.

“While most participants understood that big data were anonymized and bound by other safeguards designed to preclude individual-level harms, some nevertheless worried that these data could be used to deny health insurance claims or use of social welfare programs, jeopardize employment, threaten parental rights, or increase criminal justice surveillance, prosecution, and incarceration,” the study states. 

One significant shortcoming of the data is the limited measurement of opioid and other substance use itself. “This blind spot and other ones like it are baked into big data, which can contribute to biased results, unjustified conclusions and policy implications, and not enough attention paid to the upstream or contextual contributors to OUD,” says Evans, whose research focuses on how health care systems and public policies can better promote health and wellness among vulnerable and underserved populations. “We know that people have addiction for many years before they come to the attention of public institutions.”

A goal of the PHD is to improve health equity; however, “given data limitations, we do not examine or address conditions that enable the [opioid] epidemic, a problem that ultimately contributes to continued health disparities,” one focus group participant comments.

The study participants helped develop recommendations for ethical big data governance that would prioritize health equity, designate off-limits topics and methods, and recognize the data's blind spots.

Shared data governance might include establishing community advisory boards, cultivating public trust by instituting safeguards and practicing transparency, and conducting engagement projects and media campaigns that communicate how the PHD serves the greater good.

Special consideration should be given to people with opioid use disorder, the study emphasizes. “When considering big data policies and procedures, it may be useful to view individuals with OUD as a population whose status warrants added protections to guard against potential harms,” the paper concludes. “It is also important to ensure that big data research mitigates vulnerabilities rather than creates or exacerbates them.”
