Is Data Really Anonymous?

By Kate Anastas
Thursday, May 14, 2020

The same technology that unlocks your cellphone could put patient privacy in jeopardy.

Participation in clinical research is critical to the advancement of medicine. When a patient consents to lend medical data to research, researchers must take certain precautions to protect the volunteer’s privacy, such as removing anything that could disclose the patient’s identity. However, a recent Mayo Clinic study found that a person could be identified from a brain MRI image through the use of facial recognition software.

Eighty-four volunteers who had previously undergone a brain MRI were selected for the study. Their brain MRI images and ordinary photographs were submitted to the software for comparison. Researchers found that the software correctly matched the MRI images and personal photos of 83% of participants. That high success rate is concerning for healthcare institutions because it raises the possibility of exposing protected health information.
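
To illustrate how such a comparison might work, here is a minimal sketch using the open-source face_recognition library with placeholder file names. It is not the software used in the study, which compared photographs against face surfaces reconstructed from the MRI scans.

```python
# Illustrative sketch only: compare a face render derived from an MRI scan
# against an ordinary photograph. The study's actual software is not shown
# here; this uses the open-source face_recognition library, and the file
# names are placeholders.
import face_recognition

# Load a face render exported from the MRI (as a 2D image) and a photo.
mri_render = face_recognition.load_image_file("mri_face_render.png")
photo = face_recognition.load_image_file("participant_photo.jpg")

# Encode each detected face as a 128-dimensional embedding.
mri_encodings = face_recognition.face_encodings(mri_render)
photo_encodings = face_recognition.face_encodings(photo)

if mri_encodings and photo_encodings:
    # A smaller distance means the two faces are more likely the same person.
    distance = face_recognition.face_distance([mri_encodings[0]], photo_encodings[0])[0]
    print(f"Face distance: {distance:.3f}")
    print("Likely match" if distance < 0.6 else "Likely different people")
else:
    print("No face detected in one of the images.")
```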

The Solution

The team at Mayo Clinic plans to follow the study with an additional manuscript detailing improvements to privacy protections. As the use of artificial intelligence (AI) challenges the healthcare system’s ability to keep information private, researchers must take additional steps to keep patient data confidential. Researchers typically use de-identification techniques and follow strict data-sharing agreements, but in the digital age this may not be enough. Although there is not yet a complete solution that protects patients from advancing AI, institutions should recognize the importance of continually revising their clinical research privacy plans.
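
One common imaging de-identification step is “defacing,” which removes the facial voxels from a scan before it is shared. The sketch below assumes a hypothetical precomputed face mask and uses the nibabel library; it illustrates the general idea only and is not the Mayo Clinic team’s forthcoming approach.

```python
# Illustrative sketch only: "defacing" a brain MRI by zeroing out the voxels
# that cover the face before the scan is shared. The mask file is a
# hypothetical precomputed binary mask aligned to the scan.
import nibabel as nib

scan = nib.load("subject_t1.nii.gz")        # original T1-weighted scan
face_mask = nib.load("face_mask.nii.gz")    # 1 where the face is, 0 elsewhere

data = scan.get_fdata()
mask = face_mask.get_fdata() > 0

# Remove facial voxels while leaving brain tissue untouched.
data[mask] = 0

defaced = nib.Nifti1Image(data, scan.affine, scan.header)
nib.save(defaced, "subject_t1_defaced.nii.gz")
```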

“This is only applicable if people can get access to the MRI scans in publicly available research databases,” says Christopher Schwarz, Ph.D., a Mayo Clinic researcher and computer scientist at the Center for Advanced Imaging Research, and the study’s lead author, in a news release. “It is not related to medical care, where data is secured.”