In the days after the election, apoplectic progressive journalists spent their time writing boiling hot takes, trying to find the one CNN chyron or Nate Silver tweet responsible for handing democracy over to a Putin-loving creamsicle. And while no one could ever agree (or admit that they agree) on the real enemy, nearly everyone pointed a finger at the new guy in town: fake news.
Almost instantly, Chrome extensions appeared that made it easier for users to identify fake news. Last week, Facebook even rolled out some far-too-cautious tools to help stop the onslaught. Yet for all of their efficacy, none of these tools will be able to fully curtail the plague of propaganda.
But every tech solution rolled out so far lacks the crucial ingredient necessary to make them work: human contact.
Ugh.
It's impossible to overstate the role fake news -- or propaganda, as seems increasingly likely -- played in this election. A Buzzfeed post-election analysis found that fake news stories significantly outperformed real news stories in the final three months before the election: the top 20 best-performing fake stories generated 8.7 million shares, likes and reactions, compared with just 7.3 million for the top stories from reputable news sources.
And while both liberals and conservatives shared fake news, Trump supporters, it seems, were particularly susceptible to it: 38% of the fake news shared came from conservative sites, compared to just 20% from liberal sites.
Zuckerberg initially responded to criticism with outright denial, calling the idea that fake news on Facebook influenced the election "pretty crazy." So concerned journalists and software developers stepped in where the platforms didn't. Fake News Alert and B.S. Detector, developed shortly after the election, are both Google Chrome extensions that alert users when the site they're visiting is highly biased, simply clickbait or pure propaganda. Slate also released a Google Chrome extension, simply called This is Fake, which helps users identify and report fake news on Facebook.
Facebook itself released its own set of tools last week that make it easier for users to flag fake news, hopefully making it harder for these stories to spread. Content determined to be false by fact-checkers in its partner Poynter's network will come with a warning label as well as an explanation. Facebook has also said it will prevent these stories from being advertised. Let's hope.
As powerful as each of these tools may be, none of them will likely go far enough to stop the torrent of fake news -- though they may temper it -- because they all ignore "the human element," and consequently rest on two false premises:
1. The idea that people of different political persuasions are still talking to each other on social media, and therefore capable of spotting and reporting fake news.
2. The belief that people -- the same people who renamed CNN "The Clinton News Network" and screamed "Lügenpresse," an old Nazi term, at the press -- will not direct the same hostility at Facebook, soften their criticism of The Washington Post or download a Chrome extension from the ultra-progressive Slate. (Breitbart is already railing against Facebook's anti-fake-news initiatives.)
"This Facebook trending story is 100% made up. Nothing in it is true. This post of it alone has 10k shares in the last six hours. pic.twitter.com/UpgNtMo3xZ"
— Ben Collins (@oneunderscore__) November 14, 2016
In order for Facebook's new tools to properly work, for example, users must first identify suspicious-looking content. But all readers, conservative and progressive alike, are inherently biased towards content that reflects their pre-existing political beliefs and values.
Facebook's algorithms fill your News Feed with familiar faces, who are more likely to share stories you like, limiting information diversity and creating echo chambers. In the months surrounding the election, Facebook users unfollowed and sometimes purged users from their feed whose political views didn't align with their own.
So it's hard to imagine how most Facebook users would even be confronted with content they don't like (and that doesn't appeal to their political values) in the first place. If the story fits, people tend to wear it. It remains to be seen whether liberals or conservatives -- who both exist in social media echo chambers -- will be able to identify fake news stories that nonetheless appeal to their political values, and still report them.
Why would users report stories that look potentially dubious (a skill, researchers found, many readers just don't have) if those stories neatly correspond with their ways of seeing the world?
There's also the inherent danger -- though perhaps an unavoidable one -- that the Infowars, AddictingInfo and Breitbart readers of this world will soon come to distrust Facebook's fact-checking services. Why would the people who regularly read Alex Jones -- who claimed that Hillary was in league with the devil -- or who believe that Comet Ping Pong pizzeria was home to a child sex ring led by John Podesta suddenly trust Facebook's judgement on The Washington Post? Facebook has been accused of liberal bias before, so voters already paranoid about the mainstream media may simply become even more suspicious of the platform.
Obviously, there are more tools than Facebook's measures that people can use to call out fake news. But even those mechanisms are partisan, and the people who use them, self-selected. Breitbart fans aren't exactly going to go to the progressive Slate and download its hottest fact-checking tool. People who spent the months preceding the election actively sharing "Denzel Washington Backs Trump In the Most Epic Way Possible" probably aren't going to read that Mashable article listing the smartest new Chrome extensions for spotting fake news. Those who live and die by The Daily Caller aren't suddenly going to find room in their hearts for little ol' nonpartisan Snopes.
In our (almost) post-fact world, we've come dangerously close to post-fact-checking. And that -- perhaps more than anything else that happened this past month -- should scare you.
None of this is to say that these tools won't be effective, or aren't deeply important. Not every Facebook user or Trump supporter is an Infowars reader, and there will surely be many users who trust Facebook's judgement and subsequently learn how to become better, more critical consumers. (Facebook could also ban some of the more egregious fake news accounts in the first place, or take far more aggressive measures to stop them.)
Propaganda works, and as we've seen this election, it can do lasting, potentially lethal damage.
But if voters, particularly progressive voters, are serious about spotting and stopping fake news, they're going to need to commit a truly painful act -- and actually communicate with the people who believe these stories.
The main reason people believe in fake news, researchers found, is simple: because they want to.
Stefan Pfattheicher, a professor at Ulm University, recently told The Washington Post that people believe in fake news not due to a lack of intelligence, but a lack of will.
"This seems to be more a matter of motivationto process information (or news) in a critical, reflective thinking style than the ability to do so," Pfattheicher said.
If "critical readers" have any hope of stopping the fake news explosion, they'll need to do more than rely on external fact-checking tools or New York Times hyperlinks. They'll need to keep people in their Facebook feeds who they disagree with, and try to have radically empathetic, compassionate conversations with them off of the Internet.
The goal shouldn't end at slowing the spread of a news story about "How all Muslims are terrorists," but at curbing people's desire to share that story, or believe in that hate, in the first place. And that means talking to people who disagree with you, people whose politics and values violently clash with your own.
Who wants to do that? No one, of course. (Raises hand.) It's awful, frequently traumatizing and, for anyone who's ever confronted an egg avatar on Twitter, often a waste of time.
Real change happens, however, when core beliefs change too. Since the election, organizers have shared tools people can use to try to "convince" others that their worldviews are distorted, even dangerous. Key to the design of every one of these tools is moving beyond facts and into the realm of the personal. For many, facts have become too partisan. Emotion, first-person stories and relationship-building often change more hearts and minds than a Politifact link or a Facebook debunk.
Fake news is here to stay, and platforms, software developers and ordinary people will need to imagine ever more aggressive ways to kill it. Tools will help. So will Chrome extensions. But the only real way to help people believe in facts again is to somehow, magically, go beyond them and have a conversation.
Topics: Facebook, Elections