Social media has a velocity problem, according to Timothy Tangherlini, a professor in the Department of Scandinavian who works on culture analytics. Many platforms work to feed people the content that will yield the most engagement in order to keep them scrolling. But the speed and precision with which people are targeted have amplified polarization and outrage.
“(There are) ways of profiling user groups through metadata and trails left on the internet,” Tangherlini said. “The speed at which this can happen is astonishing. We can get a message out quickly, we can amplify it and direct it. Those things are very different from low level face-to-face interactions that tended to characterize cultural interactions prior to the rise of social media.”
For example, when Disneyland emerged as “ground zero” for a measles outbreak in 2014, as noted in a Centers for Disease Control blog post — a result of parents not vaccinating their children — Tangherlini noted that public health officials were shocked.
But as a computational folklorist immersed in the intersection between computer algorithms and storytelling, he wasn’t surprised. For the past nine years, he’d been tracking “mommy blogs” where parents expressed vaccine hesitancy.
“You might reject it out of hand, but next time you hear about vaccines, you’ve been primed to recognize that some people are worried about vaccines,” Tangherlini said. “The more exposure you get to this type of storytelling, even though you might be the most pro-science person around, you now have incorporated (the idea) into your belief landscape.”
He said belief landscapes are “easy” to fall into when it seems like a community of people holds the same beliefs and values.
It’s often how people fall into conspiracy theories or disinformation campaigns: the individual narrative units of Pizzagate — whether they involve Democratic politics, casual dining, the Podesta brothers or cannibalistic satanism — don’t make sense on their own and fall apart when prodded, according to Tangherlini. But taken together, with an “inventive reading,” they can morph back into a story.
Some people tell stories about themselves, their communities or the world. Some discuss politics. On Twitter, some say Tupac is still alive and living in Cuba, or that there are zombies in China, or that Trump was chased on foot by New York police, according to a number of campus students.
And while they may be easily dismissed off the bat — junior Kabitsane Mphenyeke knows that Tupac is definitely not alive, freshman Brooke Day knows that there aren’t zombies in China and sophomore Nikita Biswas did their research and realized the Trump photo was AI generated — once content is seen, it can never be unseen.
Even interacting with a post for a little longer can drive the content people see — sophomore Marisol Morales noted that after she watched a conservative YouTube video, more and more similar videos appeared on her recommended page until she intentionally watched liberal-leaning videos to balance out her feed.
“Wow, is this what Andrew Tate fans feel like?” Morales mused. “Like no disrespect but I saw videos of him and I was like, that’s too much of him. Is this how he creates his little groups, like his families? It just seemed like videos over and over, and if you only see it from one point of view, it makes so much sense.”
Subramaniam Vincent, the director of journalism and media ethics at the Markkula Center for Applied Ethics at Santa Clara University, also noted that people who feel isolated or excluded may be more inclined to join online communities, and vulnerable to conspiracies.
On social media, where conspiracy theories are easily peddled by online echo chambers, deceptive actors and bots, it’s hard to know who’s telling stories, and if they’re even true, said Tangherlini.
The public has a collective “brain trust” of shared knowledge, and knowing that an opinion is widely shared can galvanize people into action, according to Jeremy Rue, an adjunct associate professor in the UC Berkeley Graduate School of Journalism. But it’s hard to know if something is a common sentiment, or just a few people with widespread influence.
Vincent noted that the design of social media platforms — which allows for quick, low-investment responses and gauges interaction in a “blank emotional space” — plays a key role in the amplification of content. Facebook allows users to react with just an emoji, and a simple repost is common across nearly all platforms.
In addition, these platforms tailor their content by tracking things as slight as how long someone stays on a video, Morales noted.
“Usually, a combination of thoughts and feelings lead to authenticity in your approach to reacting to something, but the design of expressing feelings quickly and using that for algorithmic signals for what’s engaging has led to more rapid thinking online,” Vincent said.
When people are able to rapidly react with their feelings, their ability to self-regulate lessens, Vincent noted. Thus, more inflammatory content garners more reactions, leading to more clicks.
Tangherlini pointed to the Jan. 6 riots at the U.S. Capitol as an example of social media’s real-world effects — though, he noted, many people there realized they didn’t actually agree with those around them, which may have mitigated some potential violence.
“We are what we believe,” Tangherlini said. “We tell stories about ourselves, community and things we think are happening in the world. As we internalize and share them, my only hope is you aren’t only just sharing these online, but you’re sharing them with friends in low level face-to-face interactions where there’s an opportunity for social breaks and correction.”