Here's a study supported by the objective reality that many of us experience already on YouTube.
The streaming video company's recommendation algorithm can sometimes send you on an hours-long video binge so captivating that you never notice the time passing. But according to a study from the software nonprofit Mozilla Foundation, trusting the algorithm means you're actually more likely to see videos featuring sexualized content and false claims than content matched to your personal interests.
In a study with more than 37,000 volunteers, Mozilla found that 71 percent of the videos participants flagged as objectionable had been recommended by YouTube. The volunteers used a browser extension to track their YouTube usage over 10 months, and when they flagged a video as problematic, the extension recorded whether they had come across it via YouTube's recommendations or on their own.
The study called these problematic videos "YouTube Regrets," a catch-all term for any regrettable viewing experience on the platform. Such Regrets included videos "championing pseudo-science, promoting 9/11 conspiracies, showcasing mistreated animals, [and] encouraging white supremacy." One girl's parents told Mozilla that their 10-year-old daughter fell down a rabbit hole of extreme dieting videos while seeking out dance content, leading her to restrict her own eating habits.

What causes these videos to be recommended is their ability to go viral. If videos with potentially harmful content manage to accrue thousands or millions of views, the recommendation algorithm may circulate them to users rather than focusing on their personal interests.
YouTube removed 200 videos flagged through the study, and a spokesperson told the Wall Street Journal that "the company has reduced recommendations of content it defines as harmful to below 1% of videos viewed." The spokesperson also said that YouTube has launched 30 changes over the past year to address the issue, and that its automated system now detects and removes 94 percent of videos violating YouTube's policies before they reach 10 views.
While it's easy to agree on removing videos featuring violence or racism, YouTube faces the same misinformation policing struggles as many other social media sites. It previously removed QAnon conspiracies that it deemed capable of causing real-world harm, but plenty of similar-minded videos slip through the cracks by arguing free speech or claiming entertainment purposes only.
YouTube also declines to make public any details about how its recommendation algorithm works, claiming the information is proprietary. Because of this, it's impossible for us as consumers to know whether the company is really doing all it can to keep such videos from circulating via the algorithm.
While 30 changes in a year is an admirable step, if YouTube really wants to eliminate harmful videos on its platform, letting users plainly see its efforts would be a good first step toward meaningful action.
Topics: YouTube