How Reddit’s coronavirus community became a destination
Emerson Boggs, 25, a Ph.D. student and virologist at the University of Pittsburgh, signed up on Jan. 23 to moderate what was then a small Reddit community with about 1,000 members dedicated to a relatively obscure topic: a new coronavirus that had been discovered in Wuhan, China.
Less than two months later, the Reddit message board /r/coronavirus has grown to more than 1.2 million members — almost a million of whom signed up in the last two weeks. Boggs is now one of a team of 60 volunteer content moderators — including infectious disease researchers, virologists, computer scientists, doctors and nurses — who spend hours policing the community's more than 50,000 daily comments for misinformation, trolls and off-topic political discussion.
Their expertise and unpaid labor have helped create one of the most authoritative, up-to-date and civil forums for information and discussion about the pandemic.
“No matter how much it makes my blood pressure rise, it helps me sleep at night knowing I at least tried to help,” said Boggs, who noted that the University of Pittsburgh is not affiliated with the subreddit.
The coronavirus community is now the third-most active subreddit, according to Redditlist, a website that tracks Reddit, and one of the fastest growing subreddits ever.
Every day, Boggs and the other moderators work through a queue of thousands of comments and posts that have been flagged for review. They coordinate via the messaging platform Discord to ensure they aren’t duplicating work or to settle any disagreements. Some spend time developing tools to automate or improve their workflow, inviting high-profile scientists and doctors to participate in “Ask Me Anything” Q&A sessions and recruiting more moderators.
The moderators play to their strengths. In Boggs' case, that's making sure posts submitted by users are scientifically accurate. This involves checking sources, deleting posts that rely on flimsy or poorly interpreted evidence, and labeling posts that link to scientific papers that haven't been peer reviewed.
Until Monday, when she was sent home, Boggs squeezed this work, which she considers a civic duty, into breaks from analyzing HIV with a laser scanning microscope in the Infectious Diseases and Microbiology Department at her university's Graduate School of Public Health.
“The pace of the outbreak has really shown the deficiencies of traditional outbreak reporting,” she said. “Even working in virology, this subreddit is the most up-to-date source of information I am aware of.”
Computer scientist Rick Barber, 38, said he often spends 10 hours a day reviewing content and building custom tools for the subreddit, including one that notifies moderators on Discord when the moderation queue on Reddit grows particularly long.
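Barber's tool isn't public, but the core idea — poll the moderation queue's length and, past some cutoff, post an alert to a Discord webhook — can be sketched in a few lines. This is an illustrative guess, not his implementation: the threshold, function names and message format are all assumptions; only the Discord webhook's JSON `content` field is real API surface.

```python
import json
import urllib.request

# Hypothetical cutoff; the real tool's threshold is not public.
QUEUE_ALERT_THRESHOLD = 500

def build_alert(queue_len, threshold=QUEUE_ALERT_THRESHOLD):
    """Return a Discord webhook payload if the queue is long, else None."""
    if queue_len < threshold:
        return None
    return {
        "content": (
            f"Mod queue alert: {queue_len} items awaiting review "
            f"(threshold {threshold})."
        )
    }

def post_to_discord(webhook_url, payload):
    """POST the payload to a Discord webhook URL as JSON."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

In practice, a scheduled job would fetch the queue length from Reddit's moderation API (e.g. via a library like PRAW), call `build_alert`, and only hit the webhook when a payload comes back.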
“The news gets worse every day. This is probably having a big impact on my overall stress level and ability to pay attention to other work,” said Barber, who has a lung condition that makes him more susceptible to COVID-19, the disease caused by the coronavirus. “But at the same time, I feel like it’s worth it. I don’t know that we are saving the world, but I do know people are coming here and finding things they can send in a group message to someone who isn’t convinced they need to stay at home. Or finding some good news to show a friend who is in a state of despair about it.”
Barber, who previously worked in Silicon Valley as a data scientist for a health care analytics company and co-founded a venture capital firm, is now a Ph.D. student at the University of Illinois Urbana-Champaign researching conflicts between people and the online platforms they use.
“It’s like a startup without having to leave the house,” Barber said. “We’re recruiting people, interacting with a lot of users. We are all spending a lot of time doing this. But we don’t have to raise money. And I have way more intrinsic motivation to do this than any of the startups I was involved in.”
As the community has grown, the moderators have battled to curb the spread of misinformation and panic about the pandemic. They’ve done it by adhering to a list of principles, the highest priority of which is to be a reliable information source, Barber said.
One way they have done that is by banning political posts, said Patrick Doherty, 29, a moderator who has a master’s degree in biology from the University of Notre Dame and studied infectious diseases, including tuberculosis.
“When you don’t allow political posts, it’s harder for misinformation to spread,” said Doherty, who moderates the forum at night and during his lunch break. “I’d rather have policy discussion and talk about new cases and containment measures.”
That hasn’t stopped people from trying to turn the community into a “Twitter fight,” Barber said, referring to “folks who are habituated to using the internet to be hyper-tribal on the left and right.”
“Both sides are equally capable of being indignant,” he said.
Moderators also banned text-only posts in which people ask questions or make observations rather than share links to credible sources, even when posted with good intentions.
“People are scared, and if they think they have knowledge of something, it gives them a sense of control with something they don’t have control over,” Boggs said.
To curb racist and xenophobic posts that have emerged as the disease has spread across countries, the moderators have deployed automated tools that notify them when certain slurs or dog whistles are used. "There's a lot of anti-Chinese racism," Boggs said. That includes references to "bat soup," which stem from the false claim that the virus originated with a Chinese influencer eating bat soup in a widely circulated video.
“I was expecting a very large influx of active disinformation and toxicity,” said Kat Lo, an online moderation researcher at the nonprofit Meedan, which builds tools to improve online information. “But it looks like the moderators have made people not feel empowered to be extremely racist or spread disinformation.”
Cristina López, a senior researcher at the nonprofit research institute Data & Society who specializes in online disinformation, said: "On other platforms, like Facebook or Twitter, the attention economy is usually ruled by whatever is new or catchy, which is why information that appears polarizing or information that is attention-grabbing usually gets more engagement."
In contrast, the culture of the coronavirus subreddit provides incentives for users to share quality information, because they are rewarded with upvotes only if they follow the rules.
“So that in itself is a huge, huge help to getting the best information up top,” she said.