What is “fake news”? Where does it come from, and who spreads it? And what impact does it have on democracy?
Those were the questions researchers at CU-Boulder tried to answer in a recent study of how “fake news” spreads.
It’s a timely study: The president is in a spat with Twitter over “censorship” after it attached a fact-check to one of his tweets. Facebook, for the last four years, has faced calls to do more to curb the spread of misinformation after the 2016 election interference debacle.
The cultural climate that fostered the rise of “fake news” — the researchers prefer the term countermedia to refer to misleading content, often promoted as news — is the result of several factors, says lead researcher Toby Hopp: mistrust in “mainstream media,” mistrust in those who hold power, and the decline of information’s historic gatekeepers, newspapers.
“There are fewer news organizations now than there have been for quite a while. The ones that do exist are understaffed, and we feel these effects on the local level,” Hopp says. “You’ve also seen the nationalization of politics. News organizations increasingly focus on large political issues in a polarized country, which allows for certain types of actors to pursue political gain. … When you factor in social media, it is perhaps historically ripe for various forms of low-quality news-like info, ‘fake news,’ misinformation, disinformation and hyper-partisanship [to flourish].”
Facebook is to blame, too. Hopp suggests that because Facebook is not transparent about how it places advertising, users don’t know whether they’re being pushed toward certain content. And because Facebook has avoided taking responsibility for these ads, it lacks a set of standards that users (and regulators) can use to make sense of all the content it hosts.
“Facebook doesn’t have an ideology,” Hopp says. “They do whatever they think the federal government wants them to do at any given moment in time. If I’m going to blame one actor for perpetuating the spread of ‘fake news’ — and of course there are many actors and players that share some of the blame — if I’m going to place the most blame, it’s going to be Facebook.”
Facebook is the primary driver of countermedia content, Hopp says. Of the people the study analyzed, 29% shared countermedia on Facebook, compared to only 5% on Twitter. Most of that misinformation came from the politically extreme — both conservative and liberal. And older people were more likely to share countermedia than younger folks.
Regardless, the content users of all stripes and ages share isn’t always outright false, Hopp says. Rather, “it omits detail. It uses hyperbolic language. It tries to elicit negative emotional reactions. It uses euphemisms. It presents a polarized view of reality.”
That ambiguity confuses the user, and is one reason countermedia is so effective. But it also raises the question of our responsibility as individuals to spot misinformation and curb its spread.
Truth is, we’ve been spreading half-truths and politically convenient arguments for a long time. Now that social media has accelerated how quickly we do that, unique problems have emerged.
“People have always lied and told various versions of the truth in order to suit their ends. This is not a huge [change in] human behavior,” Hopp says. “But when you start to look at the scale of social media and all the other challenges American democracy is facing right now… if we’re talking about salacious ‘fake news,’ we’re not talking public policy, we’re not talking governance and structural reform. We’re talking about this made-up story that was made up in a way to provoke emotionality.”
So, here’s the million-dollar question: How do we get out of this? We’re more aware now that foreign actors will use social media to shape views and sow discord — though Hopp says it’s “hard to pin down” if that was the reason Trump won in 2016, and there’s little evidence the current administration has any interest in effectively quelling such interference this time around. But the countermedia that’s being circulated as we speak is largely a domestic product, and so figuring out a solution that doesn’t trample First Amendment rights to free political speech is tricky.
“Like any of the problems we face in a democracy, the answer is complicated,” Hopp says. “Reputable news organizations cannot cede the digital spaces to these countermedia or ‘fake news’ websites, as frustrating as it is for journalists and news organizations to be on Facebook. I don’t think we can give up trying to form an objective version of the truth.”
Hopp also advocates “social pressure” to slow the flow of countermedia by encouraging others to fact-check media, and even for organizations to stage “interventions” when misinformation is abundant. Beyond that, regulation — perhaps borrowed from countries that are doing it responsibly — “might be helpful in stemming some of this ‘fake news,’” Hopp says.