Social distancing can't slow the spread of misinformation

University of Washington sociologist Emma Spiro reminds us that bad information is also contagious during COVID-19.

Emma Spiro is an assistant professor and sociologist at the University of Washington and co-founder of the Center for an Informed Public. (Emma Spiro)

As the new coronavirus spreads from person to person across the United States, another contagious threat is also escalating: online misinformation. False stories range from speculation that COVID-19 is caused by 5G cellular networks, to claims that it’s a bioweapon, to the promotion of fake cures such as silver, zinc and teas.

Such inaccuracies are of particular interest to Emma Spiro, a sociologist at the University of Washington and co-founder of the new Center for an Informed Public, a research group focused on how misinformation proliferates. Falsehoods during a crisis are dangerous because they can mislead and confuse the public, something Spiro saw while researching the rumors that arose after the 2013 Boston Marathon bombing. And today’s digital connections appear to be supercharging the rumor mill — for example, multiple versions of the same story can be circulating online at the same time, making it difficult for people to distinguish fact from fiction.

Now, with her new center up and running and a small staff dedicated to this research, Spiro is closely monitoring the spread of misinformation during the current coronavirus pandemic and analyzing what factors influence its evolution. High Country News spoke with Spiro by phone in late March, as she stood outside her home on a once-busy street in Seattle.

This interview has been edited for length and clarity. 

 

How did you first become interested in studying misinformation?

I was pursuing my Ph.D. in sociology, and I was fascinated by the ways in which people — individuals and groups and communities — were starting to use online social networks and changing the way we participate in public discourse, and the ways we build social relationships with people.

At the time, people were just starting to use things like Twitter. So I started thinking about how these new technologies were being used, specifically in the crisis-response space. After a crisis event, we know that there are heightened levels of anxiety, extreme levels of uncertainty, as people are trying to figure out what is going on and what to do about it. And they often turn to their social networks, and they turn to their neighbors, and they turn to their family members. And now they were doing that online, so we could see how that was happening.
 

How has this shift to online platforms changed the spread of false information?

There’s a long history of studying crisis events and disasters in the social sciences, and often what we would see after events would be information voids, where there wasn’t a lot of information from official sources. As people collectively try to make sense of what's going on, that’s how rumors would arise — as a very natural byproduct of people talking and exchanging information. And we still see that today.

But I would say, [today] we often don’t have information voids. Now we have information overload. Everybody is on social media, trying to post things, trying to consume things. So there’s sometimes some confusion about what’s official and what’s not. The World Health Organization recently called this [the coronavirus] not only a pandemic, but an “infodemic” — when there’s an overabundance of information, some accurate, some not.

One of the things that we’ve seen just anecdotally … is the ways in which social media can sometimes allow nonexperts — scientists with adjacent expertise, for example, rather than epidemiologists — to become central players in some of these online conversations and influence a lot of people, in terms of not only their beliefs but also the actions they take. In this particular case, those actions can make a significant difference in the trajectory of how this [coronavirus epidemic] plays out.
 

What factors influence the circulation of these rumors?

Often that information really does try to elicit or arouse our emotions. And so, you know, it gets us worked up in particular ways, gets us very passionate about a particular topic. And that often leads to situations where people are making decisions that are not rationally thought out or intellectually thought out, but that are emotionally driven. Also, when we have multimedia content, we process that information in different ways. And people really respond to visual and video content.

Everybody is vulnerable. Information requires us to spread it, right? It doesn’t just read itself. This is a very participatory kind of phenomenon that we’re trying to study.


What research is your team doing right now to better understand the spread of coronavirus misinformation?

This past December, we launched a new research center at the University of Washington called the Center for an Informed Public. I don’t think any of us anticipated [that] this is where we would be today, at the epicenter of a global pandemic, but we are.

Our mission is to be able to bring together the resources that we have at a world-class public university and address this challenge of … misinformation online, and how that diminishes our trust in democratic institutions — things like science and our government.


As a Seattle resident, what’s your personal experience in dealing with misinformation in a COVID-19 epicenter?

Everyone on our team always shares stories about interactions with our family and friends and people coming to us and saying, you know, “I heard so-and-so was going to bring the National Guard in and everything, and now it's going to be locked down.” And “I heard the hand sanitizer doesn’t actually work because it’s antibacterial.”


How do you respond when people tell you things that seem like obvious falsehoods?

I often will probe a little bit more. I’ll ask, “OK, well, where did that information come from? And how was it produced?” And try to trace a little bit of the provenance of that content — mostly out of curiosity, because I’m interested in where it came from and how people perceive it. But I think that also gets whomever I’m interacting with to ask those same kinds of questions.

This story was originally published at High Country News on April 7, 2020.
