Let’s reform how we communicate scientific findings

Author: Jessica Sieff


Walter Scheirer is an associate professor in the Department of Computer Science and Engineering and an expert in image forensics, computer vision and biometrics. Scheirer has worked to develop an early warning system that uses artificial intelligence to root out deepfake videos, manipulated images and political memes used to spread inaccurate information that threatens the integrity of democratic elections and incites violence.

The pandemic made the need for vigilance regarding scientific integrity glaringly urgent. Now Scheirer is working to build a similar system to look for evidence of image manipulation or inappropriate use of text in scientific papers.
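To give a sense of the kind of building block such a tool might rely on, here is a minimal sketch of perceptual "average hashing," a standard image-forensics technique for flagging near-duplicate figures. This is not Scheirer's actual system; the file names and the distance threshold below are purely illustrative.

```python
# Minimal sketch: flagging near-duplicate figures with a perceptual
# "average hash" (aHash). Illustrative only -- not Scheirer's system.
# Requires Pillow (pip install Pillow).
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    brighter than the mean. Visually similar images produce hashes
    with a small Hamming distance, even after resizing or compression."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits where two hashes differ."""
    return bin(a ^ b).count("1")

if __name__ == "__main__":
    # Hypothetical figure files; the threshold of 5 bits is not tuned.
    h1 = average_hash("figure_1a.png")
    h2 = average_hash("figure_3c.png")
    if hamming(h1, h2) <= 5:
        print("Possible duplicated figure -- flag for human review")
```

A perceptual hash is attractive for this job because, unlike a cryptographic checksum, it tolerates the resizing and recompression that reused figures typically undergo; in practice a system would compare hashes across a large corpus and route matches to a human reviewer rather than making automated judgments.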


Thinking back over the past year, what would you say is your biggest takeaway?

The sheer magnitude of material appearing on the pandemic is basically unprecedented for one specific topic, and this deluge of scientific material out there is really dangerous for a number of reasons. There's the disinformation angle: things that are obviously fake and meant to mislead the public, papers related to the pandemic that are being retracted, in some cases for straight-up scientific misconduct, fabricated images and things of that nature.

There's also the potential for political disinformation. Think about where this information comes from. Is it trying to mislead the public about science on political grounds? It's extremely dangerous to do this in a public health context because there are real ramifications. There are papers out there with an anti-vaccine flavor, trying to convince people vaccines are harmful. And clearly there's a political angle to material of that nature, which is a serious issue.


What kind of impact has the pandemic had on your field? 

I think for the scientific community in general, the pandemic has heightened a need for literacy and awareness around the process of scientific research. There is research published in scientific journals that goes through a fairly rigorous and extensive peer review, and then there is research posted to preprint servers before being properly vetted. The issue is not necessarily that the preprint servers exist, or that research is uploaded to them. It’s that the general public, and even the media, can at times assume that this material is now established science, and that’s where you can run into a lot of trouble.

Part of this is related to academia and the perverse incentives of scientific publishing. Junior scientists feel that they need to maximize their publication numbers, so they use preprint servers to upload material as soon as possible, which shows them to be productive scholars, while simultaneously working to get their research through peer review so it can actually be published and gain credibility. But again, it basically boils down to this idea within academia that you need to maximize your number of publications, and that can hurt science.

I see a lot of papers, especially with respect to the pandemic, coming out of computer science research projects to develop some sort of artificial intelligence algorithm that will identify salient markers of disease progression. There is no need for this. Nobody within the medical establishment is using such algorithms; they simply aren't needed given the variety of more conventional means at the disposal of medical professionals. This is technology no doctor asked for, and resources get diverted to those kinds of projects. We run the risk of directing resources at a problem that is not a problem.


What would you say is most critical to think about for the future? 

I think scientists need to start reforming how we work. Let's reform how we communicate scientific findings and how we conduct our research. Incentives need to change. Those who control funding need to set stricter guidelines for how that money is used and how findings are reported. It's a general academic problem.