Notre Dame Stories: Social Media, Misinformation, and You

Author: Liz Harter

[Photo: Social media apps on an iPhone]

It’s one of the biggest news stories of the year: social media and the spread of misinformation. While Facebook has garnered much of the attention in recent weeks, the problem of misinformation long predates those headlines and is broader than many people realize.

Tim Weninger, the Frank M. Freimann Associate Professor of Engineering at Notre Dame, has been studying the spread of misinformation on social media for more than a decade.

He said he’s been “studying misinformation since before it was cool,” beginning with research into the rise of the Islamic State group to understand how it was able to recruit so effectively. The answer? Coordinated efforts.

“There were entire buildings of people who weren’t fighters, they were social media information warriors,” he said of the Syria-based teams who created well-produced, compelling content inviting others to “become part of a brotherhood of fighters for this cause.”

The advertisements weren’t enough, though. What was really interesting, Weninger said, was that the group also had large teams coordinating to like and share the content being produced, spreading the message further.

“If you have a handful of people working together, you can really drive a message if you do it in the right way.”

The “right way” to get a message out is not always the right thing to do for society, though.

Weninger said the misinformation problem is both better and worse now. It’s better because researchers are studying the phenomenon, and the social media platforms are aware of it and looking into it, too.

“One of the most important things that social media companies are starting to realize is that they have an important role to play in civilized discourse, and they didn’t think they had that before,” he said. “Now they know they do have that responsibility.”

It’s worse, though, because the barrier to entry for bad actors online is very low, and many more players are getting into the game, spreading more misinformation much farther.

He shared an example of an article about Brexit written by a fake professor at a university that doesn’t exist. The authors paid for a couple hundred likes on the false article, and it made the front page of a social media site, spurred discussion and trended for 12 to 14 hours.

“It’s fascinating to see that it doesn’t take a whole lot of effort — $200 to drive this message and start a conversation based upon something completely made up,” Weninger said.

In hindsight, episodes like that can seem comical, but their implications are not. And Weninger said these bots and paid likes aren’t the biggest problem. It’s real people.

“You just need a bot to get it into the conversation, and then real people do the rest of the work,” he said. “We, the well-meaning individuals, do most of the bad work. And the reason is because we don’t read before we share.”

He said to watch out for “pink slime journalism”: digital “news” sites, often set up in smaller towns adjacent to larger populations, that masquerade as legitimate news sources by using similar logos and URLs. These fake news sites primarily share fictional stories of vigilante justice, politics or responses to the COVID-19 pandemic.
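
To make the lookalike-URL trick concrete, here is a minimal sketch of how one might flag a domain that closely imitates a legitimate outlet’s. This is not a tool from Weninger’s research, and the outlet list and domain names are invented for illustration:

    # Toy illustration of spotting lookalike "news" domains with string similarity.
    # The outlet list and domains below are hypothetical; real detection is harder.
    import difflib

    LEGITIMATE_OUTLETS = ["southbendtribune.com", "chicagotribune.com"]

    def lookalike_score(domain: str, reference: str) -> float:
        """Similarity between two domain names, from 0.0 to 1.0."""
        return difflib.SequenceMatcher(None, domain.lower(), reference.lower()).ratio()

    def flag_lookalikes(domain: str, threshold: float = 0.8) -> list[tuple[str, float]]:
        """List the legitimate outlets this domain closely resembles without matching."""
        return [
            (ref, score)
            for ref in LEGITIMATE_OUTLETS
            if domain.lower() != ref
            and (score := round(lookalike_score(domain, ref), 2)) >= threshold
        ]

    # A hyphen and a different top-level domain still score as highly similar.
    print(flag_lookalikes("southbend-tribune.news"))  # [('southbendtribune.com', 0.81)]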

“Pick your topic of the day and you can make fake headlines out of this, and then because people don’t actually read, they just see the headline, these stories spread on social media like wildfire,” Weninger said. “And [the stories are] fake and made up, and by sources that look legit but don’t actually exist.”

Similar things are happening with video, too. Weninger mentioned both deepfakes and shallow fakes, citing the example of a video he saw of a sporting event where fans in the stands supposedly began chanting expletives about President Joe Biden.

“I was at that game, I was in the stands. That didn’t happen. There was no chant,” he said. “But what ends up happening is that people with political agendas can very easily take a chant that might have happened somewhere else and superimpose the audio onto a video of the camera panning across the crowd. … That is ridiculously simple to do. That’s a shallow thing, that’s just someone with five minutes of time.”

Sometimes, as with a chant at a sporting event, a shallow fake is difficult to debunk because it’s hard to find a primary source or corroborating video. Other times, content such as memes or clearly edited images makes spotting a fake much easier. The problem, once again, comes when coordinated efforts use these fakes to drive potentially nefarious conversations.

In his latest research, Weninger and others at Notre Dame are using media forensics to combat coordinated efforts to spread misinformation.

Media forensics uses advanced artificial intelligence to wade through the deluge of online images, audio and video, looking for altered content and coordinated campaigns.
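
As a simplified illustration of one building block such systems can draw on, the sketch below uses perceptual hashing to pair up images that are near-duplicates of each other. This is not Notre Dame’s pipeline; the file names are hypothetical, and it assumes the Pillow and ImageHash packages are installed:

    # Toy sketch: find near-duplicate images via perceptual hashing.
    # Many "different" posts built on one underlying image can hint at a
    # coordinated campaign rather than organic sharing.
    from itertools import combinations

    import imagehash                  # pip install ImageHash
    from PIL import Image             # pip install Pillow

    def near_duplicates(paths, max_distance=8):
        """Return (path_a, path_b, distance) for visually similar image pairs.

        A crop, re-compression or small edit barely changes a perceptual
        hash, so a small Hamming distance means "same underlying picture."
        """
        hashes = {p: imagehash.phash(Image.open(p)) for p in paths}
        return [
            (a, b, hashes[a] - hashes[b])   # ImageHash subtraction = Hamming distance
            for a, b in combinations(paths, 2)
            if hashes[a] - hashes[b] <= max_distance
        ]

    # Hypothetical files scraped from different accounts posting "different" memes.
    print(near_duplicates(["post1.jpg", "post2.jpg", "post3.jpg"]))

In practice, forensic pipelines combine many signals like this with manipulation detectors before drawing any conclusion about coordination.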

“We can take a look at a picture or a video or an audio stream and we can determine not only has this been faked? Has this been altered or slowed down or spliced or cropped or whatever?” he said. “But we can also say who did it. We can say that this was done by people using what software, and usually we can say from what region of the world.”

Weninger wanted to be clear, though, that media forensics is not intended to silence the free speech of individuals, but rather to examine coordinated efforts by organizations or other countries. That, he said, could constitute “modern warfare.”

“This information operations, influence operations, is how the hearts and minds will be won in the next several decades,” he said. “And right now democracy and democratic countries are having a hard time fighting that, because we’re vulnerable to those types of things because we’re so free. And our freedom provides vulnerabilities. And so we’re trying to create these tools to level the playing field to say we’re able to call out when these countries are behaving badly.”

So how can a person make sure they contribute to the solution instead of the problem? Weninger recommended slowing down and being intentional. That means actually reading a post before liking, commenting or sharing. 

“It’s important to realize that we, collectively, and our neighbors, and our friends, and our family are all the editors of our friends’ news feeds,” Weninger said. “That’s a responsibility that we didn’t know that we had. And it’s a responsibility that we need to take seriously.”

Read more about Weninger's work here.


Notre Dame Stories highlights the work and knowledge of the University’s faculty and students. The podcast features interviews with Notre Dame faculty members who can lend insight into some of the major national and international stories of the day, as well as pieces that show the breadth of life and research at the University.

Listen to more episodes here.