Facebook made nearly $27 billion last year, but the tech giant can't seem to figure out how to fix its fake news problem on its own.
Its solution: enlist a nonprofit that has successfully done so, with the help of 133,540 moderators.
In its latest move to prove it's no longer a threat to democracy, Facebook is tapping Wikipedia. Now, when Facebook users see articles on News Feed, they can click on a little "i" button and see the Wikipedia description of the publisher. They also will see a button to follow the Page and see trending or related articles.
Facebook previously balked at the notion of using human editors to oversee the flow of news and information across its network, instead relying on algorithms to handle everything from publisher posts to advertisements. Now, it's slowly starting to embrace the human touch, hiring editors and now even partnering with Wikipedia and its army of moderators.
Why not just show the description from the publisher's own Facebook Page? Perhaps because Facebook can't trust it. A publisher could fill such a space with terms like "Jew haters" or other malicious language. Of course, publishers could attempt the same on Wikipedia, but Wikipedia has managed to monitor for and prevent that kind of abuse.
Remember when teachers told you not to trust anything you read on Wikipedia? Well, according to Facebook, you now should. But you might not want to trust everything you read on Facebook. Wikipedia is not perfect, but it's been able to curb a fake news problem by enlisting volunteers—real, living people—to moderate pages while incorporating other algorithmic systems.
Wikipedia has long taken the fake news problem seriously. In its early days, anyone could create pages with false information and get away with it, and people took advantage of that. But as Wikipedia grew, it built a community of moderators who helped keep misleading pages and errors from staying up. English Wikipedia now has 133,540 editors who have edited a page in the last 30 days, according to Wikipedia's own statistics.
Facebook seemed to only wake up to the problem of fake news after the 2016 election. In the aftermath of Donald Trump's victory, CEO Mark Zuckerberg said Facebook influencing the election was a "pretty crazy idea" and said fake news was only a small portion of posts on the site.
While that claim about scale may be true, Zuckerberg has since apologized for his words, and his company has been working to release solutions to prevent fake news from spreading on the platform. Facebook launched the Facebook Journalism Project, and it recruited third-party fact-checking organizations to help monitor for fake news.
But early reports suggest it may be more of a dog-and-pony show than a serious initiative, and we may have Facebook to blame. Fact-checkers enlisted by Facebook told Politico last month that their efforts have been hampered because Facebook refuses to share information. Fact-checkers cannot see whether the "disputed" tags they add to articles actually have an effect, and they are unable to prioritize stories.
Facebook said its efforts are working. “We have seen data that, when a story is flagged by a third party fact-checker, it reduces the likelihood that somebody will share that story,” Sara Su, a product manager on Facebook’s News Feed team, told Politico. But she declined to share data proving the point.
Meanwhile, Wikipedia founder Jimmy Wales has been quite outspoken about the fake news problem.
"I would say just in the last couple of years, I feel like things have gotten much worse in terms of clickbait headlines, fake news," Wales told Mashable earlier this year. "People will contribute if they're asked, and if they're protected from trolls."
In fact, Wales is taking the fake news problem so seriously that he launched a new company. Earlier this year, he announced WikiTribune, an online news site with articles reported and written by professional journalists. The site will also have volunteer researchers and fact-checkers.
So, yes, Facebook is a profitable company with a glaring fake news problem led by someone who once brushed off the severity of the issue. Wikipedia is a nonprofit that enlisted volunteers years ago to battle misinformation and is led by someone who wants to do more to help.