In response to the increasing spread of misinformation about the COVID-19 pandemic, UC Berkeley hosted a virtual panel of experts who discussed how misinformation shapes people’s responses to the pandemic and potential ways to combat it.
The event was held Dec. 8 and included various campus professors and a former campus research fellow. According to Deirdre Mulligan, a professor at the UC Berkeley School of Information, as the amount of information available about COVID-19 increases, there are bound to be inaccuracies that further polarize the United States.
“The social media companies had created the ingredients for the COVID misinformation and conspiratorial landscape we’re dealing with today,” said Hany Farid, a UC Berkeley School of Information and electrical engineering and computer sciences professor, during the event. “Our online information landscape is just a mess, and we need to start to get a handle on it.”
According to Farid, this happens for two reasons: social media platforms have no system for fact-checking, and their algorithms automatically push false content to users’ front pages because it attracts more engagement than other content.
Nick Adams, founder of Goodly Labs, launched Public Editor, a tool that has had some success in reducing the spread of misinformation. Goodly Labs is a nonprofit organization that develops collaborative tools for engaging with public data.
“It’s a community of citizen scientists who are using new, collaborative data science technology we’ve developed to evaluate the information within news stories,” Adams said during the panel. “People can begin now working together to identify and correct misinformation at the scale of the problem.”
The persistent problem, according to Mulligan and Farid, is that social media platforms are unlikely to rectify their mass output of misinformation.
Social media platforms typically abstain from anti-misinformation measures because such measures can reduce profits, Farid added during the event. He noted that around the time of the Nov. 3 election, Facebook attempted some of these measures, favoring information from verified news sources, which made marginal content less prominent.
It is also important to consider the well-being of those who access information and to safeguard them against the private companies that control the information they seek, according to Mulligan.
“It’s really important for us to have some transparency around what is being removed so that we can understand and improve the technical tools that we rely on to improve some of the natural language models we use,” Mulligan said during the panel.